Master's/Doctoral Thesis 106553007 Detailed Record




Author Pei-Yu Chi (紀佩妤)   Department Executive Master Program, Communication Engineering
Thesis Title Texture Feature Extraction of Dog Nose Prints and Its Application in Pet Identity Identification
(Chinese title: 狗鼻紋理特徵擷取和犬隻身分識別)
Full Text Available in the repository after 2025-01-13
Abstract (Chinese) On the market, dog identification relies mainly on invasive RFID microchips, which raises concerns about the dogs' health. This study identifies dogs non-invasively and proposes a multi-modal dog nose-print identification system. Three texture feature extraction methods, Local Binary Pattern (LBP), Gray Level Co-occurrence Matrix (GLCM), and Gray-Gradient Co-occurrence Matrix (GGCM), are used to extract nose-print texture feature vectors. Each of the three feature vectors is fed to a probabilistic neural network (PNN) classifier to obtain three sets of optimal inference probabilities; the inference probabilities of the modules are then combined by weighted fusion and the resulting recognition rate is evaluated. Experimental results show that the proposed decision-fusion method outperforms each of the three modalities alone. With 30 training samples, the recognition rate of LBP with the PNN classifier is 88%, that of GLCM with the classifier is 79%, and that of GGCM with the classifier is 90%, while the proposed decision-fusion PNN reaches 100%. When the training data are reduced to 10 samples, the rates are 75% for LBP with the PNN classifier, 72% for GLCM, and 80% for GGCM, while the decision-fusion PNN reaches 95%. These results demonstrate that the multi-modal dog nose-print identification system has good recognition performance.
Abstract (English) On the market, invasive RFID chips are the main means of identifying dogs, which raises concerns about the dogs' health. In this study, dogs are identified in a non-invasive way: a multi-modal dog nose-print identification system is proposed. Three texture feature extraction methods, LBP (Local Binary Pattern), GLCM (Gray Level Co-occurrence Matrix), and GGCM (Gray-Gradient Co-occurrence Matrix), are used to extract dog nose-print texture feature vectors; each of the three feature vectors is then combined with a PNN (probabilistic neural network) classifier to obtain three sets of optimal inference probabilities. The inference probabilities of the modules are fused with weights to obtain the final recognition rate. The experimental results show that the decision-fusion method proposed in this thesis is more accurate than any of the three modalities alone. With 30 training samples, the recognition rate of LBP combined with the PNN classifier is 88%, that of GLCM combined with the classifier is 79%, and that of GGCM combined with the classifier is 90%, while the proposed decision-fusion PNN reaches 100%. When the training data are reduced to 10 samples, the rates are 75% for LBP combined with the PNN classifier, 72% for GLCM, and 80% for GGCM, while the proposed decision-fusion PNN reaches 95%. This demonstrates that the multi-modal dog nose-print identification system has good recognition performance.
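The pipeline the abstracts describe, per-modality texture features classified by a PNN whose class probabilities are then fused with weights, can be sketched as follows. This is a minimal illustrative sketch, not the thesis implementation: the LBP shown is the basic 8-neighbor, radius-1 variant, the PNN is reduced to its core Gaussian Parzen-window density estimate, and the fusion weights used in the usage note are hypothetical placeholders.

```python
import numpy as np

def lbp_histogram(img):
    """Basic 8-neighbor, radius-1 LBP, returned as a normalized
    256-bin histogram feature vector."""
    h, w = img.shape
    center = img[1:h-1, 1:w-1]
    codes = np.zeros_like(center, dtype=np.uint8)
    # 8 neighbors, clockwise from the top-left; each contributes one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1+dy:h-1+dy, 1+dx:w-1+dx]
        codes |= (neigh >= center).astype(np.uint8) << bit
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def pnn_probabilities(train_feats, train_labels, x, sigma=0.1):
    """Minimal PNN: a Gaussian Parzen-window estimate of each class's
    density at x, normalized into a probability vector over classes."""
    classes = np.unique(train_labels)
    scores = []
    for c in classes:
        d = train_feats[train_labels == c] - x
        scores.append(np.exp(-(d**2).sum(axis=1) / (2 * sigma**2)).mean())
    scores = np.array(scores)
    return scores / scores.sum(), classes

def fuse(prob_list, weights):
    """Weighted decision fusion of per-modality probability vectors."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * p for wi, p in zip(w, prob_list))
```

In the thesis setting, three such probability vectors (one each from LBP, GLCM, and GGCM features, each classified by its own PNN) would be combined by `fuse`, and the predicted dog identity is the argmax of the fused vector.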
Keywords (Chinese) ★ Pet identity recognition
★ Local Binary Pattern
★ Gray Level Co-occurrence Matrix
★ Gray-Gradient Co-occurrence Matrix
★ Probabilistic Neural Network
★ Decision-Fusion Probabilistic Neural Network
Keywords (English) ★ LBP
★ GLCM
★ GGCM
★ PNN
Table of Contents Abstract (Chinese) I
Abstract (English) II
Acknowledgements III
Table of Contents 1
List of Figures 4
List of Tables 6
Chapter 1. Introduction 1
1.1 Research Background and Motivation 1
1.2 Research Objectives 3
1.3 Thesis Organization 3
Chapter 2. Review of Methods 4
2.1 Local Binary Pattern 4
2.2 Gray Level Co-occurrence Matrix 7
2.3 Gray-Gradient Co-occurrence Matrix 12
2.4 Probabilistic Neural Network 14
2.4.1 Bayesian Classifier 14
2.4.2 Parzen Window Method 15
2.4.3 PNN Classifier Architecture 17
Chapter 3. Design of the Dog Nose-Print Identification System 19
3.1 System Architecture 19
3.2 Texture Feature Extraction Module Design 20
3.2.1 LBP Module Design 20
3.2.2 GLCM Module Design 23
3.2.3 GGCM Module Design 25
3.3 Dog Nose-Print Classifier Design 27
3.3.1 PNN Texture Classifier 28
3.3.2 Decision-Fusion Classifier 30
Chapter 4. Experiments 32
4.1 Experimental Database Construction 32
4.2 Experimental Environment 33
4.3 Recognition Performance Experiments 33
4.4 Decision-Fusion Experiments 38
Chapter 5. Conclusion and Future Work 41
5.1 Conclusion 41
5.2 Future Work 41
References 42
Appendix 1 45
References [1] J. McGrath, "How Pet Microchipping Works," 2008. [Online]. Available: http://science.howstuffworks.com/innovation/everyday-innovations/pet-microchip.htm
[2] K. Albrecht, "Microchip-induced tumors in laboratory rodents and dogs: A review of the literature 1990–2006," IEEE International Symposium on Technology and Society, pp. 337-349, 2010.
[3] N. Coldea, "Nose prints as a method of identification in dogs," Veterinary Quarterly, p. 60, 2011.
[4] K. Karthik, S. Chakraborty, S. Banik, "Muzzle Analysis for Biometric Identification of Pigs," in 2017 Ninth International Conference on Advances in Pattern Recognition (ICAPR), pp. 1-6, 2017.
[5] A. Tharwat, T. Gaber, A. E. Hassanien, A. H. Hasssan, F. T. Mohamed, "Cattle Identification Using Muzzle Print Images Based on Texture Features Approach," Proceedings of the Fifth International Conference on Innovations in Bio-Inspired Computing and Applications IBICA, pp. 217-227, 2014.
[6] W. Kusakunniran, A. Wiratsudakul, U. Chuachan, S. Kanchanapreechakorn, T. Imaromkul, "Automatic cattle identification based on fusion of texture features extracted from muzzle images," in 2018 IEEE International Conference on Industrial Technology (ICIT), pp. 1484-1489, 2018.
[7] R. E. Sánchez-Yáñez, E. V. Kurmyshev, F. J. Cuevas, "A framework for texture classification using the coordinated clusters representation," Pattern Recognition Letters, vol. 24, no. 1-3, pp. 21-31, 2003.
[8] Z. Shang, M. Li, "Combined Feature Extraction and Selection in Texture Analysis," International Symposium on Computational Intelligence and Design (ISCID), vol. 1, pp. 398-401, 2016.
[9] "A Survey of Machine-Vision Surface Defect Inspection (機器視覺表面缺陷檢測綜述)," 2019. [Online]. Available: https://www.itread01.com/content/1547385319.html
[10] T. Ojala, M. Pietikainen, T. Maenpaa, "Multiresolution gray-scale and rotation invariant texture classification with local binary patterns," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 7, pp. 971-987, 2002.
[11] R. Haralick, K. Shanmugam, I. Dinstein, "Textural Features for Image Classification," IEEE Transactions on Systems, Man, and Cybernetics, vol. 3, pp. 610, 1973.
[12] J. Hong, "Gray level-gradient cooccurrence matrix texture analysis method," Acta Automatica Sinica, vol. 10, no. 1, pp. 22-25, 1984.
[13] A. Padma, R. Sukanesh, "Automatic Classification and Segmentation of Brain Tumor in CT Images using Optimal Dominant Gray level Run length Texture Features," International Journal of Advanced Computer Sciences & Applications, vol. 2, no. 10, 2011.
[14] M. Bashar, T. Matsumoto, N. Ohnishi, "Wavelet transform-based locally orderless images for texture segmentation," Pattern Recognition Letters, vol. 24, no. 15, pp. 2633-2650, 2003.
[15] S. Grigorescu, N. Petkov, P. Kruizinga, "Comparison of texture features based on Gabor filters," IEEE Transactions on Image Processing, vol. 11, no. 10, pp. 142-147, 1999.
[16] W. Wen, A. Xia, "Verifying edges for visual inspection purposes," Pattern Recognition Letters, vol. 20, no. 3, pp. 315-328, 1999.
[17] G. Cross, A. Jain, "Markov Random Field Texture Models," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 5, no. 1, pp. 25-39, 1983.
[18] M. Xi, L. Chen, D. Polajnar, W. Tong, "Local binary pattern network: A deep learning approach for face recognition," IEEE International Conference on Image Processing (ICIP), pp. 3224-3228, 2016.
[19] R. Touahri, N. Azizi, N. Hammami, M. Aldwairi, F. Benaida, "Automated Breast Tumor Diagnosis Using Local Binary Patterns (LBP) Based on Deep Learning Classification," International Conference on Computer and Information Sciences (ICCIS), 2019.
[20] J. Tan, Y. Gao, W. Cao, M. Pomeroy, S. Zhang, Y. Huo, L. Li, Z. Liang, "GLCM-CNN: Gray Level Co-occurrence Matrix based CNN Model for Polyp Diagnosis," IEEE EMBS International Conference on Biomedical & Health Informatics (BHI), 2019.
[21] F. Shi, G. Chen, Y. Wang, N. Yang, Y. Chen, N. Dey, R. Sherratt, "Texture features based microscopic image classification of liver cellular granuloma using artificial neural networks," in IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), pp. 432-439, 2019.
[22] S. Aygün, E. Güneş, "A benchmarking: Feature extraction and classification of agricultural textures using LBP, GLCM, RBO, Neural Networks, k-NN, and random forest," in 6th International Conference on Agro-Geoinformatics, 2017.
[23] D. Clausi, "An analysis of co-occurrence texture statistics as a function of grey level quantization," Canadian Journal of Remote Sensing, vol. 28, no. 1, pp. 45-62, 2002.
[24] D. F. Specht, "Probabilistic neural networks," Neural Networks, vol. 3, no. 1, pp. 109-118, 1990.
Advisors 陳慶瀚, 陳永芳   Date of Approval 2020-01-20
