Master's/Doctoral Thesis 107423038: Full Metadata Record

DC Field                   Value                                                           Language
dc.contributor             Department of Information Management (資訊管理學系)             zh_TW
dc.creator                 劉旻融                                                          zh_TW
dc.creator                 Ming-Rong Liu                                                   en_US
dc.date.accessioned        2020-07-16T07:39:07Z
dc.date.available          2020-07-16T07:39:07Z
dc.date.issued             2020
dc.identifier.uri          http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=107423038
dc.contributor.department  Department of Information Management (資訊管理學系)             zh_TW
dc.description             National Central University (國立中央大學)                      zh_TW
dc.description             National Central University                                     en_US
dc.description.abstract    To reach a competitive level of accuracy, today's deep learning models often require thousands or even tens of thousands of training samples, and learning classes the model has never seen before usually requires retraining the model. These practical demands have drawn growing attention to fields such as meta-learning and continual learning. Meta-learning is known for its flexible model adaptation, but the high instability of its training process makes its performance unreliable; continual learning, conversely, is highly stable but limited in the number of tasks it can learn. This thesis therefore combines meta-learning and continual learning, two algorithm families that excel at few-shot learning, using continual learning to improve the stability of meta-learning while using meta-learning to improve the learning flexibility of continual learning. Moreover, prior deep learning research has identified the so-called stability-plasticity dilemma, in which the two aspects of performance trade off against each other and cannot both be attained. In the experimental results of this thesis, however, the proposed model improves both test accuracy and validation accuracy simultaneously on datasets commonly used in few-shot learning.    zh_TW
dc.description.abstract    Recently, the importance of the few-shot learning field has increased markedly, and a variety of well-known learning methods, such as meta-learning and continual learning, have been proposed to address it. Their main purpose is to train a model with only a small amount of data while maintaining high generalization ability. MAML, an elegant and effective meta-learning method, demonstrates strong performance in Omniglot and Mini-ImageNet N-way K-shot classification experiments. However, recent research points out the unstable performance of MAML as well as architectural problems in related models. On the other hand, continual learning models usually face catastrophic forgetting when they must learn new tasks while retaining knowledge of previous ones. We therefore propose En-MAML, a method built on the MAML framework that combines the flexible adaptation of meta-learning with the stability of continual learning. We evaluate our model on the Omniglot and Mini-ImageNet datasets, following the N-way K-shot experimental protocol. Our experimental results show that the model achieves higher accuracy and stability on both Omniglot and Mini-ImageNet.    en_US
dc.subject                 Deep Learning                                                   zh_TW
dc.subject                 Machine Learning                                                zh_TW
dc.subject                 Meta-learning                                                   zh_TW
dc.subject                 Continual Learning                                              zh_TW
dc.title                   Enhanced Model Agnostic Meta Learning with Meta Gradient Memory en_US
dc.language.iso            en_US                                                           en_US
dc.type                    Master's/Doctoral Thesis                                        zh_TW
dc.type                    thesis                                                          en_US
dc.publisher               National Central University                                     en_US
