Master's/Doctoral Thesis 109522026: Complete Metadata Record

DC Field: Value (Language)
dc.contributor: Department of Computer Science and Information Engineering (zh_TW)
dc.creator: 楊緣智 (zh_TW)
dc.creator: Yuan-Chih Yang (en_US)
dc.date.accessioned: 2022-07-19T07:39:07Z
dc.date.available: 2022-07-19T07:39:07Z
dc.date.issued: 2022
dc.identifier.uri: http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=109522026
dc.contributor.department: Department of Computer Science and Information Engineering (zh_TW)
dc.description: National Central University (zh_TW)
dc.description: National Central University (en_US)
dc.description.abstract: In deep learning, Dropout and DropConnect are regularization techniques commonly used to combat overfitting. By randomly discarding neurons, or the connections into and out of a neuron, with a fixed probability during training, they prevent each neuron from depending too heavily on the others, thereby improving the model's generalization ability. This thesis proposes a new model, Gradient DropConnect, which uses the gradient of each weight and bias to determine its drop probability during training. We conducted a series of experiments to verify that this approach effectively mitigates overfitting. (zh_TW)
dc.description.abstract: Dropout and DropConnect are regularization techniques often used to address the overfitting issue in deep learning. They randomly discard neurons or links with a fixed probability during training so that no neuron depends too heavily on other neurons, thereby improving the model's generalization ability. This paper proposes a new model, Gradient DropConnect, which leverages the gradient of each weight and bias to determine their drop probabilities during training. We conducted thorough experiments to validate that such an approach can effectively mitigate overfitting. (en_US)
dc.subject: Overfitting, Regularization, Dropout, DropConnect, Generalization (zh_TW)
dc.subject: Overfitting, Regularization, Dropout, DropConnect, Generalization (en_US)
dc.title: Training a neural network by adjusting the drop probabilities in DropConnect according to the magnitudes of the weight gradients (zh_TW)
dc.language.iso: zh-TW (zh-TW)
dc.title: Training a neural network by adjusting the drop probability in DropConnect based on the magnitude of the gradient (en_US)
dc.type: Master's/doctoral thesis (zh_TW)
dc.type: thesis (en_US)
dc.publisher: National Central University (en_US)
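
The record's abstract and title describe Gradient DropConnect: a DropConnect variant in which each weight's drop probability is set from the magnitude of its gradient rather than being a single fixed constant. Below is a minimal PyTorch sketch of that idea under stated assumptions: the class name GradientDropConnectLinear, the base_p parameter, and the max-normalized mapping from gradient magnitude to drop probability are illustrative choices, not the thesis's exact formulation, and only weight (not bias) dropping is shown.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GradientDropConnectLinear(nn.Linear):
        # Illustrative DropConnect-style layer: per-weight drop probabilities
        # are derived from the magnitude of each weight's most recent gradient.
        def __init__(self, in_features, out_features, base_p=0.5):
            super().__init__(in_features, out_features)
            self.base_p = base_p  # upper bound on any weight's drop probability

        def forward(self, x):
            if self.training and self.weight.grad is not None:
                g = self.weight.grad.abs()
                # Assumed mapping: scale |grad| to [0, 1], then cap at base_p,
                # so larger gradients mean a higher chance of being dropped.
                p = self.base_p * g / (g.max() + 1e-12)
                mask = torch.bernoulli(1.0 - p)     # 1 = keep weight, 0 = drop it
                w = self.weight * mask / (1.0 - p)  # inverted-dropout rescaling
                return F.linear(x, w, self.bias)
            # Before the first backward pass, and at eval time, act as a plain layer.
            return super().forward(x)

In a training loop the mask is resampled on every forward pass from the gradients left by the previous backward pass, so the drop pattern tracks how the loss currently responds to each weight. Whether the thesis raises or lowers the drop probability for large gradients is not stated in this record, so the direction chosen above is a placeholder assumption.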
