Graduate Thesis 109525003: Complete Metadata Record

DC Field | Value | Language
dc.contributor | 軟體工程研究所 | zh_TW
dc.creator | 王承凱 | zh_TW
dc.creator | Cheng-Kai Wang | en_US
dc.date.accessioned | 2022-07-19T07:39:07Z |
dc.date.available | 2022-07-19T07:39:07Z |
dc.date.issued | 2022 |
dc.identifier.uri | http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=109525003 |
dc.contributor.department | 軟體工程研究所 | zh_TW
dc.description | 國立中央大學 | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | 倒傳遞 (Backpropagation, BP) 是當今深度神經網路更新權重演算法的基石,但反向傳播因反向鎖定 (backward locking) 的問題而效率不佳。本研究試圖解決反向鎖定問題,並將提出的新方法命名為 Supervised Contrastive Parallel Learning (SCPL)。SCPL 利用監督對比損失函數作為每個卷積層的區域目標函數;因為每一層的區域目標函數互相隔離,SCPL 可以平行地學習不同卷積層的權重。本論文亦與過去神經網路平行化的研究進行比較,探討現存方法各自的優勢與限制,並討論此議題未來的研究方向。 | zh_TW
dc.description.abstract | Backpropagation (BP) is the cornerstone of today's deep learning algorithms to update the weights in deep neural networks, but it is inefficient partially because of the backward locking problem. This thesis proposes Supervised Contrastive Parallel Learning (SCPL) to address the issue of backward locking. SCPL uses the supervised contrastive loss as the local objective function for each layer. Because the local objective functions in different layers are isolated, SCPL can learn the weights of different layers in parallel. We compare SCPL with recent works on neural network parallelization. We discuss the advantages and limitations of the existing methods. Finally, we suggest future research directions on neural network parallelization. | en_US
dc.subject | 倒傳遞 | zh_TW
dc.subject | 反向鎖定 | zh_TW
dc.subject | 監督對比損失函數 | zh_TW
dc.subject | 平行化訓練 | zh_TW
dc.subject | 監督式對比平行學習 | zh_TW
dc.subject | Backpropagation | en_US
dc.subject | backward locking | en_US
dc.subject | supervised contrastive loss | en_US
dc.subject | parallel learning | en_US
dc.subject | supervised contrastive parallel learning | en_US
dc.title | 利用 SCPL 分解端到端倒傳遞演算法 | zh_TW
dc.title | Decomposing End-to-End Backpropagation Based on SCPL | en_US
dc.language.iso | zh-TW | zh-TW
dc.type | 博碩士論文 | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
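The abstract describes the core mechanism of SCPL: each layer minimizes its own supervised contrastive loss, and because each layer receives its input as a detached constant, no gradient crosses layer boundaries, so the per-layer updates are independent and can run in parallel. Below is a minimal NumPy sketch of that idea, not the thesis's actual implementation: two toy linear-plus-tanh "layers" each train against a supervised contrastive loss on their own output, with finite-difference gradients standing in for autograd and the second layer's input explicitly treated as detached.

```python
import numpy as np

def sup_con_loss(z, labels, tau=0.5):
    """Supervised contrastive loss on a batch of embeddings z (rows)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize embeddings
    sim = z @ z.T / tau                               # pairwise similarities
    n = len(labels)
    loss = 0.0
    for i in range(n):
        mask = np.arange(n) != i                      # all samples except the anchor
        log_denom = np.log(np.exp(sim[i, mask]).sum())
        pos = [p for p in range(n) if p != i and labels[p] == labels[i]]
        loss += sum(log_denom - sim[i, p] for p in pos) / len(pos)
    return loss / n

def local_step(W, inp, labels, lr=0.1, eps=1e-5):
    """One gradient step on this layer's LOCAL loss only.

    `inp` is treated as a constant (the detached output of the previous
    layer), so no gradient flows across layer boundaries -- the update for
    each layer depends only on its own weights and local loss.
    """
    base = sup_con_loss(np.tanh(inp @ W), labels)
    grad = np.zeros_like(W)
    for idx in np.ndindex(*W.shape):                  # finite-difference gradient
        Wp = W.copy()
        Wp[idx] += eps
        grad[idx] = (sup_con_loss(np.tanh(inp @ Wp), labels) - base) / eps
    return W - lr * grad, base

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))                           # toy batch of 8 samples
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])                # two classes
W1 = rng.normal(size=(4, 4)) * 0.5
W2 = rng.normal(size=(4, 4)) * 0.5

hist1, hist2 = [], []
for step in range(20):
    h1 = np.tanh(X @ W1)                              # forward pass of layer 1
    W1, l1 = local_step(W1, X, y)                     # layer 1: local update
    W2, l2 = local_step(W2, h1, y)                    # layer 2: h1 is "detached"
    hist1.append(l1)
    hist2.append(l2)

print(f"layer-1 loss {hist1[0]:.4f} -> {hist1[-1]:.4f}")
print(f"layer-2 loss {hist2[0]:.4f} -> {hist2[-1]:.4f}")
```

Because the two `local_step` calls in each iteration share no gradient dependency, they could be dispatched to separate workers, which is the parallelism the abstract attributes to isolating the local objective functions.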
