DC Field | Value | Language |
dc.contributor | 資訊管理學系 | zh_TW |
dc.creator | 陳鈺欣 | zh_TW |
dc.creator | Yu-Hsin Chen | en_US |
dc.date.accessioned | 2021-07-20T07:39:07Z | |
dc.date.available | 2021-07-20T07:39:07Z | |
dc.date.issued | 2021 | |
dc.identifier.uri | http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=107423066 | |
dc.contributor.department | 資訊管理學系 | zh_TW |
dc.description | 國立中央大學 | zh_TW |
dc.description | National Central University | en_US |
dc.description.abstract | 近年來,深度學習已變得越來越流行,它已被廣泛應用於各個領域並取得了優異的成績,但是深度學習通常在很少的訓練樣本的情況下無法達到預期的結果,並且它應該像人類一樣能夠利用過去的經驗來快速學習新任務,因此持續學習的重要性明顯增加,而主要目標是在不忘記過去所學知識的情況下學習新任務。首先,我們提出了一種名為Gradient Episodic Cache Memory(GECM)的方法,結合了聚類技術來解決Gradient Episodic Memory的存儲和計算問題。其次,我們在CIFAR-10、CIFAR-100和MNIST Permutations資料集上評估模型,而實驗結果表明,GECM的性能優於其他持續學習模型,並且GECM在準確性和效率之間也取得了良好的平衡。 | zh_TW |
dc.description.abstract | In recent years, deep learning has become increasingly popular: it has been widely applied in various fields and has achieved outstanding results. However, deep learning usually fails to achieve the expected results when few training samples are available, and, like human beings, it should be able to use past experience to learn new tasks quickly. The importance of continual learning has therefore increased significantly; its main goal is to learn new tasks without forgetting what has been learned in the past. First, we propose a method called Gradient Episodic Cache Memory (GECM), which builds on the Gradient Episodic Memory (GEM) framework and combines it with clustering techniques to resolve GEM's memory and computation problems. Second, we evaluate our model on the CIFAR-10, CIFAR-100, and MNIST Permutations datasets. The experimental results show that GECM outperforms other state-of-the-art continual learning models and strikes a good balance between accuracy and efficiency. | en_US |
dc.subject | 機器學習 | zh_TW |
dc.subject | 深度學習 | zh_TW |
dc.subject | 連續學習 | zh_TW |
dc.subject | 聚類分析 | zh_TW |
dc.subject | Machine Learning | en_US |
dc.subject | Deep Learning | en_US |
dc.subject | Continual Learning | en_US |
dc.subject | Cluster Analysis | en_US |
dc.title | An Efficient Cluster-Based Continual Learning with Gradient Episodic Cache Memory | en_US |
dc.language.iso | en_US | en_US |
dc.type | 博碩士論文 | zh_TW |
dc.type | thesis | en_US |
dc.publisher | National Central University | en_US |
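The abstract describes combining the Gradient Episodic Memory framework with clustering so that each past task is summarized by a compact cache instead of all stored examples. The following is a minimal illustrative sketch of that idea only, assuming a simple k-means summary per task; the class `ClusteredEpisodicMemory`, the helper `kmeans`, and all parameters are our own hypothetical choices, not the thesis's actual implementation.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: returns k centroids summarizing `points` (n, d)."""
    rng = np.random.default_rng(seed)
    # initialize centroids from k distinct random points
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centroid
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(k):
            if (labels == j).any():
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids

class ClusteredEpisodicMemory:
    """Hypothetical per-task episodic cache: keeps only k cluster
    centroids per task, reducing the memory and replay cost that
    grows with the number of raw stored examples."""

    def __init__(self, k=10):
        self.k = k
        self.cache = {}  # task id -> (k, feature_dim) centroid array

    def store(self, task_id, examples):
        examples = np.asarray(examples, dtype=float)
        # never ask for more clusters than we have examples
        self.cache[task_id] = kmeans(examples, min(self.k, len(examples)))

    def replay(self, task_id):
        # centroids stand in for past examples when constraining updates
        return self.cache[task_id]
```

In a GEM-style training loop, the gradients computed on these replayed centroids would play the role that gradients on the full episodic memory play in the original method, at a fraction of the storage.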