    Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/81221


    Title: 用於人臉驗證的緊湊且低成本的卷積神經網路; Compact and Low-Cost CNN for Face Verification
    Authors: 哈帝恩; Rahadian, Fattah Azzuhry
    Contributors: Department of Computer Science and Information Engineering (資訊工程學系)
    Keywords: face verification; lightweight; convolutional neural network; complexity
    Date: 2019-07-29
    Date of Upload: 2019-09-03 15:39:45 (UTC+8)
    Publisher: National Central University (國立中央大學)
    Abstract: In recent years, face verification has been widely used to secure various transactions on the internet. The current state of the art in face verification is the convolutional neural network (CNN). Despite the performance of CNNs, deploying them on mobile and embedded devices remains challenging because the computational resources available on these devices are constrained. In this thesis, we propose a lightweight CNN for face verification using several methods. First, a modified version of ShuffleNet V2 called ShuffleHalf is used as the backbone network for the FaceNet algorithm. Second, the feature maps in the model are reused with two proposed methods, Reuse Later and Reuse ShuffleBlock. Reuse Later reuses potentially unused features by connecting them directly to the fully connected layer, while Reuse ShuffleBlock reuses the feature maps output by the first 1x1 convolution in the basic building block of ShuffleNet V2 (the ShuffleBlock); because 1x1 convolution is computationally expensive, this reduces the proportion of 1x1 convolutions in the model (see the sketch after this abstract). Third, the kernel size is increased as the number of channels increases, obtaining the same receptive field size at lower computational complexity. Fourth, depthwise convolution operations are used to replace some ShuffleBlocks. Fifth, other existing state-of-the-art techniques are combined with the proposed method to see whether they improve its performance-efficiency tradeoff.
    Experimental results on five face-verification test datasets show that ShuffleHalf achieves better accuracy than all other baselines while requiring only 48% of the FLOPs of the previous state-of-the-art algorithm, MobileFaceNet. Reusing features with Reuse ShuffleBlock further improves the accuracy of ShuffleHalf and reduces the computational complexity to only 42% of MobileFaceNet's FLOPs. Meanwhile, changing the kernel size and using depthwise repetition further decrease the computational complexity to only 38% of MobileFaceNet's FLOPs while still outperforming MobileFaceNet. Combining the model with some existing methods improves neither its accuracy nor its performance-efficiency tradeoff. However, adding shortcut connections and using the Swish activation function improve accuracy without a noticeable increase in computational complexity.
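
    The following is a minimal PyTorch sketch of the "Reuse ShuffleBlock" idea described in the abstract, not the thesis's actual implementation. It assumes a stride-1 ShuffleNet V2 unit in which half of the right branch's output channels are taken directly from the first 1x1 convolution instead of being recomputed by the second 1x1 convolution; the class name ReuseShuffleBlock, the 50% reuse ratio, and the channel widths are illustrative assumptions.

        import torch
        import torch.nn as nn

        def channel_shuffle(x, groups=2):
            # standard ShuffleNet channel shuffle
            n, c, h, w = x.size()
            x = x.view(n, groups, c // groups, h, w).transpose(1, 2).contiguous()
            return x.view(n, c, h, w)

        class ReuseShuffleBlock(nn.Module):
            """Stride-1 ShuffleNet V2 unit with reuse of the first 1x1 conv's
            output (hypothetical reconstruction of "Reuse ShuffleBlock")."""
            def __init__(self, channels):
                super().__init__()
                branch = channels // 2         # right-branch width after the channel split
                self.reused = branch // 2      # assumption: half the branch output is reused
                self.pw1 = nn.Sequential(      # first 1x1 (pointwise) convolution
                    nn.Conv2d(branch, branch, 1, bias=False),
                    nn.BatchNorm2d(branch), nn.ReLU(inplace=True))
                self.dw = nn.Sequential(       # 3x3 depthwise convolution
                    nn.Conv2d(branch, branch, 3, padding=1, groups=branch, bias=False),
                    nn.BatchNorm2d(branch))
                self.pw2 = nn.Sequential(      # second 1x1 conv outputs fewer channels:
                    nn.Conv2d(branch, branch - self.reused, 1, bias=False),  # this is where FLOPs are saved
                    nn.BatchNorm2d(branch - self.reused), nn.ReLU(inplace=True))

            def forward(self, x):
                left, right = x.chunk(2, dim=1)    # ShuffleNet V2 channel split
                t = self.pw1(right)
                out = self.pw2(self.dw(t))
                out = torch.cat([t[:, :self.reused], out], dim=1)  # reuse first-1x1-conv features
                return channel_shuffle(torch.cat([left, out], dim=1))

        x = torch.randn(1, 64, 56, 56)
        print(ReuseShuffleBlock(64)(x).shape)      # -> torch.Size([1, 64, 56, 56])

    Under these assumptions the block keeps the same output width as a standard ShuffleBlock, but the second 1x1 convolution computes only half as many channels, which matches the abstract's stated goal of lowering the share of FLOPs spent on 1x1 convolutions.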
    Appears in Collections: [Graduate Institute of Computer Science and Information Engineering] Master's and Doctoral Theses

    Files in This Item:

    File          Description    Size    Format    Views
    index.html                   0 KB    HTML      214


    All items in NCUIR are protected by copyright, with all rights reserved.
