Master's/Doctoral Thesis 109423028: Full Metadata Record

DC Field | Value | Language
dc.contributor | 資訊管理學系 (Department of Information Management) | zh_TW
dc.creator | 齊秉宏 | zh_TW
dc.creator | Ping-Hung Chi | en_US
dc.date.accessioned | 2022-7-21T07:39:07Z
dc.date.available | 2022-7-21T07:39:07Z
dc.date.issued | 2022
dc.identifier.uri | http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=109423028
dc.contributor.department | 資訊管理學系 (Department of Information Management) | zh_TW
dc.description | 國立中央大學 (National Central University) | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | In the era of big data, many datasets are too large to be used in a single round of training, so the data must be processed in separate parts. However, machine learning has traditionally optimized only for the data of the current round, which causes past data to be forgotten, and current algorithms are very time-consuming for high-level computation. We therefore aim to propose a machine learning algorithm with lower time complexity that better maintains accuracy and forgetting rate. Catastrophic forgetting is a serious problem in incremental learning: when new data is learned, knowledge of past data cannot be well preserved, so earlier data is forgotten. To address this problem, we propose a new method that uses an auto-encoder's ability to reconstruct images to estimate the similarity between tasks, and lets this similarity influence the model's update direction, thereby alleviating catastrophic forgetting while still reaching the accuracy the model should achieve. We validate this method on MNIST Rotation, MNIST Permutations, and CIFAR-100, and tune the model's details. Finally, we also apply the model to real factory data in common use today to verify its feasibility. The final results show that this method achieves better results than before and addresses the forgetting problem more effectively. | zh_TW
dc.description.abstract | Catastrophic forgetting is a serious problem in incremental learning: a model loses the information of the first task after it is trained on the second task. In the era of big data, datasets may be too large to be used in machine learning all at once, so the data need to be processed separately. During training, we also need to deal with limited data availability and resource scarcity, and we need to make sure that as the model learns more data, it learns and remembers more. We therefore propose a machine learning algorithm with lower time complexity and better preservation of accuracy and forgetting rate to improve performance on the catastrophic forgetting problem. In the proposed method, we use an auto-encoder's ability to reconstruct images to derive the similarity of each task to the others; this similarity affects the gradient update direction of the model to address the so-called catastrophic forgetting problem while still reaching the accuracy the model should originally achieve. We implement our approach on MNIST Rotation, MNIST Permutations, CIFAR-100, and a real-world dataset. The experimental results show that this method obtains better results than previous approaches and solves the forgetting problem more effectively. (A minimal code sketch of this idea appears after the record below.) | en_US
dc.subject | 機器學習 (Machine Learning) | zh_TW
dc.subject | 持續性學習 (Continual Learning) | zh_TW
dc.subject | 增進式學習 (Incremental Learning) | zh_TW
dc.subject | 災難性遺忘 (Catastrophic Forgetting) | zh_TW
dc.subject | Machine Learning | en_US
dc.subject | Continual Learning | en_US
dc.subject | Incremental Learning | en_US
dc.subject | Catastrophic Forgetting | en_US
dc.title | A Novel Auto-encoder Task-based Similarity Continual Learning | en_US
dc.language.iso | en_US | en_US
dc.type | 博碩士論文 (master's/doctoral thesis) | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
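
Both abstracts describe the same core mechanism: an auto-encoder's reconstruction quality on new-task data is read as a similarity between tasks, and that similarity steers the model's gradient update so that earlier tasks are not overwritten. The thesis code is not part of this record, so the PyTorch snippet below is only a minimal sketch of that mechanism under stated assumptions; the AutoEncoder architecture, the task_similarity mapping from reconstruction error to a (0, 1] score, and the way similarity_weighted_step blends the new-task loss with an old-task (replay or regularization) loss are illustrative choices, not the author's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AutoEncoder(nn.Module):
    """Small fully connected auto-encoder for flattened images (e.g. 28x28 MNIST)."""

    def __init__(self, in_dim: int = 784, hidden: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, hidden))
        self.decoder = nn.Sequential(nn.Linear(hidden, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))


def task_similarity(old_task_ae: AutoEncoder, new_task_batch: torch.Tensor) -> float:
    """Reconstruct new-task data with an auto-encoder trained on an old task.
    Low reconstruction error is read as high similarity, mapped into (0, 1]."""
    old_task_ae.eval()
    with torch.no_grad():
        recon_error = F.mse_loss(old_task_ae(new_task_batch), new_task_batch).item()
    return 1.0 / (1.0 + recon_error)  # illustrative mapping, not from the thesis


def similarity_weighted_step(model: nn.Module,
                             loss_new: torch.Tensor,
                             loss_old: torch.Tensor,
                             similarity: float,
                             lr: float = 0.01) -> None:
    """One SGD step where task similarity blends the new-task loss with a loss
    that protects old-task knowledge (e.g. a replay or regularization term)."""
    # When the new task looks similar to old ones, follow its gradient more;
    # when it looks different, weight the protective term more heavily.
    total_loss = similarity * loss_new + (1.0 - similarity) * loss_old
    model.zero_grad()
    total_loss.backward()
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is not None:
                p -= lr * p.grad


# Typical flow per new task (sketch):
#   1. train an AutoEncoder on the new task's data and keep it for later tasks,
#   2. compute the similarity of the new task to each stored old-task auto-encoder,
#   3. use that similarity in similarity_weighted_step while training the main model.
```

The blending step above is just one plausible way to let task similarity influence the update direction; the thesis may instead scale or project gradients directly, which this record does not specify.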
