Master's/Doctoral Thesis 107423066 — Complete Metadata Record

DC Field | Value | Language
dc.contributor: 資訊管理學系 (Department of Information Management) [zh_TW]
dc.creator: 陳鈺欣 [zh_TW]
dc.creator: Yu-Hsin Chen [en_US]
dc.date.accessioned: 2021-07-20T07:39:07Z
dc.date.available: 2021-07-20T07:39:07Z
dc.date.issued: 2021
dc.identifier.uri: http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=107423066
dc.contributor.department: 資訊管理學系 (Department of Information Management) [zh_TW]
dc.description: 國立中央大學 (National Central University) [zh_TW]
dc.description: National Central University [en_US]
dc.description.abstract: 近年來,深度學習已變得越來越流行,它已被廣泛應用於各個領域並取得了優異的成績,但是深度學習通常在很少的訓練樣本的情況下無法達到預期的結果,並且它應該像人類一樣能夠利用過去的經驗來快速學習新任務,因此持續學習的重要性明顯增加,而主要目標是在不忘記過去所學知識的情況下學習新任務。首先,我們提出了一種名為Gradient Episodic Cache Memory的方法,結合了聚類技術來解決Gradient Episodic Memory的存儲和計算問題。其次,我們在CIFAR-10、CIFAR-100和MNIST Permutations數據集上評估模型,而實驗結果表明,GECM的性能優於其他的連續學習的模型,並且GECM在準確性和效率之間也取得了良好的平衡。 [zh_TW]
dc.description.abstract: In recent years, deep learning has become increasingly popular; it has been widely applied across many fields and has achieved outstanding results. However, deep learning usually fails to reach the expected performance when few training samples are available, and, like a human, it should be able to use past experience to learn new tasks quickly. The importance of continual learning has therefore grown significantly; its main goal is to learn new tasks without forgetting what has been learned in the past. First, we propose a method called Gradient Episodic Cache Memory (GECM), which builds on the Gradient Episodic Memory framework and combines it with clustering techniques to resolve the memory and computation problems of Gradient Episodic Memory. Second, we evaluate our model on the CIFAR-10, CIFAR-100, and MNIST Permutations datasets. The experimental results show that GECM outperforms other state-of-the-art continual learning models and strikes a good balance between accuracy and efficiency. [en_US]
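The abstract states the general idea of GECM — compressing an episodic memory buffer with clustering so that a GEM-style learner stores representative points instead of every past example — but gives no implementation details. The following is a minimal illustrative sketch of that idea only, not the thesis's actual method: the class name `ClusteredEpisodicMemory`, the choice of plain k-means, and the per-task exemplar count `k` are all assumptions introduced for illustration.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: returns k centroids summarizing the rows of X."""
    rng = np.random.default_rng(seed)
    # Initialize centroids from k distinct random samples.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each sample to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned samples.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids

class ClusteredEpisodicMemory:
    """Hypothetical per-task episodic memory compressed to k centroids.

    Instead of storing every past example (as plain GEM does), each
    task's buffer is summarized by k representative points, shrinking
    both the memory footprint and the per-step gradient computation
    over past-task data.
    """
    def __init__(self, k):
        self.k = k
        self.store = {}  # task_id -> (n_exemplars, d) array

    def add_task(self, task_id, X):
        X = np.asarray(X, dtype=float)
        if len(X) <= self.k:   # small task: keep every example
            self.store[task_id] = X.copy()
        else:                  # large task: keep k centroids only
            self.store[task_id] = kmeans(X, self.k)

    def exemplars(self, task_id):
        return self.store[task_id]
```

Under these assumptions, a task with 500 four-dimensional samples is reduced to a fixed 10 x 4 exemplar array, regardless of the task's size.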
dc.subject: 機器學習 (Machine Learning) [zh_TW]
dc.subject: 深度學習 (Deep Learning) [zh_TW]
dc.subject: 連續學習 (Continual Learning) [zh_TW]
dc.subject: 聚類分析 (Cluster Analysis) [zh_TW]
dc.subject: Machine Learning [en_US]
dc.subject: Deep Learning [en_US]
dc.subject: Continual Learning [en_US]
dc.subject: Cluster Analysis [en_US]
dc.title: An Efficient Cluster-Based Continual Learning with Gradient Episodic Cache Memory [en_US]
dc.language.iso: en_US [en_US]
dc.type: 博碩士論文 (Master's/Doctoral Thesis) [zh_TW]
dc.type: thesis [en_US]
dc.publisher: National Central University [en_US]
