Master's/Doctoral Thesis 111522099: Complete Metadata Record

DC Field | Value | Language
dc.contributor | Department of Computer Science and Information Engineering | zh_TW
dc.creator | 黃梓豪 | zh_TW
dc.creator | Zih-Hao Huang | en_US
dc.date.accessioned | 2024-07-15T07:39:07Z
dc.date.available | 2024-07-15T07:39:07Z
dc.date.issued | 2024
dc.identifier.uri | http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=111522099
dc.contributor.department | Department of Computer Science and Information Engineering | zh_TW
dc.description | National Central University | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | In recent years, increasing the number of layers to improve recognition ability has become a clear trend in the design of deep neural network architectures trained with end-to-end backpropagation (BP), but this approach also faces BP problems such as exploding and vanishing gradients. We therefore propose a new decoupled model, Decoupled Supervised Learning with Information Regularization (DeInfoReg), which truncates the gradients of the individual blocks so that they do not affect one another. This thesis enhances model performance and flexibility by designing a new local loss function and a new model structure. The new structure gives the model adaptive inference output and the ability to dynamically add layers and new features. The new local loss function measures the information between the model's output embeddings through three regularization methods: keeping the output embeddings invariant with respect to the true labels, keeping the output embeddings within a batch distinct from one another, and computing the covariance of the output embeddings to reduce their redundant information. These regularization methods allow the model to better capture subtle features in the data and improve its recognition performance. Subsequent experiments evaluate the performance of the DeInfoReg model in detail and confirm its superior results across multiple tasks and datasets. The results show that DeInfoReg has a significant advantage in handling gradient problems in deep architectures, and that its robustness under different label-noise ratios is superior to that of traditional BP models. We also explore the model's application potential for adaptive inference output and for dynamically adding layers and new features, and we propose directions for future improvement to further increase the model's practicality and generalization ability. These results indicate that DeInfoReg has broad application prospects and strong extensibility in deep learning. | zh_TW
dc.description.abstract | Increasing the number of layers to enhance model capabilities has become a clear trend in designing deep neural networks. However, this approach faces various optimization issues, such as vanishing or exploding gradients. We propose a new model, Decoupled Supervised Learning with Information Regularization (DeInfoReg), that decouples the gradients of different blocks so that they do not interfere with one another. DeInfoReg enhances model performance and flexibility through a newly designed local loss and model structure. The new model structure endows the model with an Adaptive Inference Path and Dynamic Expanded Layers with new features. The local loss function measures the information in the model's output embeddings through three regularization methods: enforcing invariance of the output embeddings with respect to the true labels, maintaining the variance of the output embeddings within a batch, and using the covariance of the output embeddings to reduce their redundancy. These methods enable the model to better capture features in the data, thus improving performance. We evaluate the performance of DeInfoReg on various tasks and datasets. The experimental results demonstrate that DeInfoReg significantly mitigates gradient issues in deep neural networks and shows superior noise resistance under different proportions of label noise compared to traditional backpropagation. Additionally, we explore the potential applications of the model's Adaptive Inference Path and Dynamic Expanded Layers with new features. The findings indicate that DeInfoReg has broad application prospects and robust expansion capabilities in deep neural networks. Finally, we discuss future improvements to enhance the model's practicality and generalization capabilities. | en_US
dc.subject | Supervised Learning | zh_TW
dc.subject | Decoupled Paradigm | zh_TW
dc.subject | Regularized Local Loss | zh_TW
dc.subject | Adaptive Inference Path | zh_TW
dc.subject | Dynamic Expanded Layers | zh_TW
dc.subject | Dynamically Added New Features | zh_TW
dc.subject | Supervised Learning | en_US
dc.subject | Decoupled Paradigm | en_US
dc.subject | Regularized Local Loss | en_US
dc.subject | Adaptive Inference Path | en_US
dc.subject | Dynamic Expanded Layers | en_US
dc.subject | Dynamic Expanded Layers with new features | en_US
dc.title | A Study of Decoupled Supervised Learning with Information Regularization | zh_TW
dc.language.iso | zh-TW | zh-TW
dc.title | Decoupled Supervised Learning with Information Regularization | en_US
dc.type | Master's/Doctoral Thesis | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
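
The three regularization terms described in the abstract (invariance to the true labels, variance within a batch, and covariance to reduce redundancy) follow the same general pattern as VICReg-style objectives. The sketch below is a minimal, illustrative PyTorch rendering of such a local loss and of the gradient truncation between decoupled blocks. The function name, the one-hot label target, the loss weights, and the block definitions are assumptions for illustration, not the thesis's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def local_info_loss(z, labels, num_classes, lam=25.0, mu=25.0, nu=1.0):
    # z: (batch, dim) output embedding of one decoupled block.
    # labels: (batch,) integer class labels.
    # lam/mu/nu weight the three terms; these values are placeholders.
    batch, dim = z.shape

    # Invariance term: keep the embedding consistent with the true label.
    # Comparing a one-hot target against the first num_classes embedding
    # dimensions is a stand-in; the thesis's label representation may differ.
    target = F.one_hot(labels, num_classes).float()
    invariance = F.mse_loss(z[:, :num_classes], target)

    # Variance term: hold each dimension's std above a margin so that
    # embeddings within a batch remain distinct from one another.
    std = torch.sqrt(z.var(dim=0) + 1e-4)
    variance = F.relu(1.0 - std).mean()

    # Covariance term: penalize off-diagonal covariance entries to reduce
    # redundant information inside the embedding itself.
    zc = z - z.mean(dim=0)
    cov = (zc.T @ zc) / (batch - 1)
    off_diag = cov - torch.diag(torch.diag(cov))
    covariance = (off_diag ** 2).sum() / dim

    return lam * invariance + mu * variance + nu * covariance

# Gradient decoupling: each block is trained by its own local loss, and its
# output is detached before entering the next block, so no gradient crosses
# block boundaries.
torch.manual_seed(0)
x = torch.randn(32, 64)                  # toy batch of 32 samples
y = torch.randint(0, 10, (32,))          # toy labels for 10 classes
block1 = nn.Sequential(nn.Linear(64, 128), nn.ReLU())
block2 = nn.Sequential(nn.Linear(128, 128), nn.ReLU())

h1 = block1(x)
loss1 = local_info_loss(h1, y, num_classes=10)
h2 = block2(h1.detach())                 # .detach() truncates the gradient
loss2 = local_info_loss(h2, y, num_classes=10)
(loss1 + loss2).backward()               # each loss reaches only its own block

Under this scheme each block receives only its own local gradient, which is the property the abstract credits for avoiding the vanishing and exploding gradients of end-to-end backpropagation in deep stacks.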
