NCU Institutional Repository: Item 987654321/89632


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/89632


Title: Decomposing End-to-End Backpropagation Based on SCPL
Authors: Wang, Cheng-Kai (王承凱)
Contributors: Institute of Software Engineering
Keywords: backpropagation; backward locking; supervised contrastive loss; parallel learning; supervised contrastive parallel learning
    Date: 2022-07-19
    Issue Date: 2022-10-04 11:50:07 (UTC+8)
Publisher: National Central University
Abstract: Backpropagation (BP) is the cornerstone of today's deep learning algorithms for updating the weights in deep neural networks, but it is inefficient partly because of the backward locking problem. This thesis proposes Supervised Contrastive Parallel Learning (SCPL) to address backward locking. SCPL uses the supervised contrastive loss as the local objective function for each convolutional layer; because the local objective functions of different layers are isolated from one another, SCPL can learn the weights of different layers in parallel. The thesis also compares SCPL with recent work on neural network parallelization, discusses the advantages and limitations of the existing methods, and suggests future research directions on neural network parallelization.
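The idea described in the abstract — giving each convolutional layer its own supervised contrastive objective and detaching gradients between layers so no layer waits on a global backward pass — can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation; the class and function names (`LocalBlock`, `supervised_contrastive_loss`) and all hyperparameters are hypothetical, and the contrastive loss follows the standard supervised-contrastive formulation.

```python
# Hypothetical sketch of layer-local training with a supervised contrastive
# loss per block; NOT the thesis's code. Gradients never cross block
# boundaries, which is what removes the backward-locking dependency.
import torch
import torch.nn as nn
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Pull together embeddings that share a label, push apart the rest."""
    z = F.normalize(features, dim=1)
    sim = z @ z.t() / temperature
    n = z.size(0)
    not_self = ~torch.eye(n, dtype=torch.bool)           # exclude self-pairs
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & not_self
    sim = sim.masked_fill(~not_self, float('-inf'))      # drop self-similarity
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_count = pos_mask.sum(1).clamp(min=1)
    # Average log-probability over positive pairs, per anchor.
    loss = -log_prob.masked_fill(~pos_mask, 0.0).sum(1) / pos_count
    return loss.mean()

class LocalBlock(nn.Module):
    """One conv layer plus a small projection head for its local objective."""
    def __init__(self, in_ch, out_ch, proj_dim=32):
        super().__init__()
        self.conv = nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1),
                                  nn.ReLU())
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(out_ch, proj_dim))

    def forward(self, x):
        return self.conv(x)

# Toy training step: each block sees a *detached* input, so its local loss
# backpropagates only through that block's own parameters.
torch.manual_seed(0)
blocks = [LocalBlock(3, 8), LocalBlock(8, 16)]
opts = [torch.optim.SGD(b.parameters(), lr=0.1) for b in blocks]

x = torch.randn(8, 3, 16, 16)
labels = torch.randint(0, 2, (8,))
for block, opt in zip(blocks, opts):
    x = x.detach()        # isolate this block from earlier layers
    h = block(x)
    loss = supervised_contrastive_loss(block.head(h), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
    x = h                 # forward activations to the next block
```

Because each block's backward pass touches only its own parameters, the per-layer updates are independent and could in principle run on separate workers; in this single-process sketch they simply run in sequence.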
Appears in Collections: [Software Engineering] Electronic Theses & Dissertations

    Files in This Item:

File: index.html (0 KB, HTML)


    All items in NCUIR are protected by copyright, with all rights reserved.
