NCU Institutional Repository (National Central University) — theses, past exams, journal articles, and research projects: Item 987654321/89784


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/89784


    Title: Multi-Proxy Loss: For Deep Metric Learning on Fine-grained Image Retrieval
    Authors: Lin, Wan Yi (林宛儀)
    Contributors: Department of Computer Science and Information Engineering
    Keywords: Deep Metric Learning; Distance Metric Learning; Image Retrieval; Fine-grained Images; Convolutional Neural Network
    Date: 2022-07-21
    Issue Date: 2022-10-04 11:59:38 (UTC+8)
    Publisher: National Central University (國立中央大學)
    Abstract: In this paper, we propose a new loss function for the image retrieval task. Building on Proxy-NCA and Proxy-Anchor, the new loss assigns multiple proxies to each class to increase positive-sample variety, so that a small batch size can match the performance of a much larger one. Intra-class proxies are weighted with a SoftMax function so that the most important proxies receive a higher gradient during training. Beyond the loss function, we also modify ResNet50: only its first three stages are used for feature extraction, the downsampling in the third stage is removed, and an attention module replaces the fourth stage. The attention module weights the feature map with a SoftPlus function, making important features more prominent while reducing attention to unimportant ones; this achieves better results than conventional attention based on SoftMax. Both the proposed loss function and the modified ResNet50 yield large improvements in Recall@1 over the original methods.
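The exact formulation is not given in the abstract, but the two central ideas — SoftMax-weighted aggregation over multiple per-class proxies, and a SoftPlus feature weighting in place of SoftMax attention — can be sketched roughly as follows. This is a minimal numpy illustration under stated assumptions, not the thesis implementation: the proxy count `K`, the margin/scale values, and the Proxy-Anchor-style log-sum-exp form are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable SoftMax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_proxy_similarity(embeddings, proxies):
    """embeddings: (B, D) L2-normalized sample embeddings.
    proxies:    (C, K, D) K proxies per class, L2-normalized.
    Each class score is a SoftMax-weighted combination of that class's
    K proxy similarities, so proxies closest to a sample dominate —
    the "weight intra-class proxies by SoftMax" idea from the abstract."""
    sim = np.einsum('bd,ckd->bck', embeddings, proxies)  # (B, C, K) cosine sims
    w = softmax(sim, axis=-1)                            # weights over K proxies
    return (w * sim).sum(axis=-1)                        # (B, C)

def multi_proxy_loss(embeddings, labels, proxies, margin=0.1, alpha=32.0):
    """Proxy-Anchor-style loss over the aggregated class scores:
    pull samples toward their own class score, push away from others.
    margin and alpha are hypothetical values, not from the thesis."""
    s = multi_proxy_similarity(embeddings, proxies)      # (B, C)
    B, C = s.shape
    pos = np.zeros((B, C), dtype=bool)
    pos[np.arange(B), labels] = True
    pos_term = np.log1p(np.exp(-alpha * (s[pos] - margin)).sum())
    neg_term = np.log1p(np.exp(alpha * (s[~pos] + margin)).sum())
    return pos_term + neg_term

def softplus_attention(feat):
    """The abstract's attention variant: weight feature responses with
    SoftPlus instead of SoftMax. Weights are positive, unnormalized,
    and monotone in the response, so strong features are amplified
    without forcing the weights to compete as SoftMax does."""
    w = np.log1p(np.exp(feat))   # SoftPlus(x) = log(1 + e^x)
    return feat * w
```

A short usage check: four 8-dimensional embeddings against three classes with two proxies each produces a (4, 3) class-score matrix, and the loss is a finite scalar.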
    Appears in Collections:[Graduate Institute of Computer Science and Information Engineering] Electronic Thesis & Dissertation

    Files in This Item:

    index.html (HTML, 0 KB)


    All items in NCUIR are protected by copyright, with all rights reserved.
