Master's/Doctoral Thesis 985202049: Detailed Record




Author: Hsuan-yen Liu (劉宣延)    Department: Computer Science and Information Engineering
Thesis Title: Gender Classification Using Local Texture Features from Gait Energy Images
Related Theses
★ Real-Time Online Identity Verification Using Viseme and Speech Biometric Features
★ An Image-Based Alignment System for SMD Packaging Carrier Tape
★ A Study on Content Forgery Detection and Deleted-Content Data Recovery for Handheld Mobile Devices
★ License Plate Verification Based on the SIFT Algorithm
★ Local Pattern Features Based on Dynamic Linear Decision Functions for Face Recognition Applications
★ A GPU-Based SAR Database Simulator: A Parallel Architecture for SAR Echo Signal and Image Databases
★ Personal Identity Verification Using Palmprints
★ Video Indexing Using Color Statistics and Camera Motion
★ Form Document Classification Using Field Clustering Features and Four-Direction Adjacency Trees
★ Stroke Features for Offline Chinese Character Recognition
★ Motion Vector Estimation Using Adaptive Block Matching Combined with Multi-Image Information
★ Color Image Analysis and Its Applications to Color-Quantized Image Retrieval and Face Detection
★ Logo Extraction and Recognition from Chinese and English Business Cards
★ Chinese Signature Verification Using Virtual-Stroke Information Features
★ Face Detection, Face Pose Classification, and Face Recognition Based on Triangle Geometry and Color Features
★ A Complementary Skin-Color-Based Face Detection Strategy
File access
  1. The author has agreed to make this electronic thesis openly available immediately.
  2. The open-access electronic full text is licensed to users only for personal, non-profit retrieval, reading, and printing for the purpose of academic research.
  3. Please comply with the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast this work without authorization.

Abstract (Chinese)  Gender classification is a field that has developed widely in recent years. Applying it in daily life, for example in intelligent security surveillance systems, personalized robots, and customer or pedestrian counting systems, would make everyday life more convenient and safer.
Gender classification research has long been based on the face, voice, or gait. Among these, gait-based gender classification is efficient and practical, and it performs well for a 90° side view with normal clothing. However, gait images change with view angle, clothing variation, and carried bags, and the gait cycle itself is difficult to obtain reliably, so the recognition rate drops sharply. This thesis therefore proposes a gender classification framework based on the Gait Energy Image (GEI): local texture features are extracted from the GEI, and a Support Vector Machine (SVM) is used as the classifier to improve the recognition rate.
First, the video captured by the camera is processed with a gait-cycle estimation method to obtain the gait energy image (GEI); a minimal construction sketch follows this abstract. Texture features are then computed on the GEI with the Local Directional Pattern (LDP) or Local Binary Pattern (LBP) descriptor, and the GEI is partitioned into blocks to record local gender cues. The statistical feature distribution of each block is computed, and the blocks are concatenated into a single feature vector representing male or female. Finally, an SVM classifier performs the gender classification.
This thesis investigates how the framework performs in a surveillance setting with respect to single-view recognition rate, clothing variation, carried bags, and the number of training samples. It also experiments with training on a single view angle and testing on other view angles. The experimental results show that the framework is stable and effective for gender classification.
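The GEI construction summarized above averages size-normalized, horizontally centred silhouettes over one gait cycle. The NumPy sketch below illustrates that averaging step only; the 128×88 canvas, the nearest-neighbour resizing, and the centroid-based centring are illustrative assumptions, not the exact preprocessing described in Chapter 2 of the thesis.

```python
import numpy as np

def normalize_silhouette(mask, out_h=128, out_w=88):
    """Crop a binary silhouette to its bounding box, rescale it to a fixed
    height, and centre it horizontally on its x-centroid (assumed sizes)."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:                        # empty frame: return a blank canvas
        return np.zeros((out_h, out_w), dtype=np.float32)
    crop = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1].astype(np.float32)
    h, w = crop.shape
    new_w = max(1, int(round(w * out_h / h)))
    # nearest-neighbour resize to the target height, preserving aspect ratio
    row_idx = np.arange(out_h) * h // out_h
    col_idx = np.arange(new_w) * w // new_w
    resized = crop[row_idx][:, col_idx]
    # paste into a fixed-width canvas, centred on the horizontal centroid
    canvas = np.zeros((out_h, out_w), dtype=np.float32)
    cols = np.nonzero(resized)[1]
    cx = int(round(cols.mean())) if cols.size else new_w // 2
    shift = out_w // 2 - cx                 # where column 0 of `resized` lands
    src_l, dst_l = max(0, -shift), max(0, shift)
    width = min(new_w - src_l, out_w - dst_l)
    if width > 0:
        canvas[:, dst_l:dst_l + width] = resized[:, src_l:src_l + width]
    return canvas

def gait_energy_image(cycle_silhouettes):
    """Average the aligned silhouettes of one gait cycle (binary {0,1} masks)
    into a GEI with values in [0, 1]."""
    aligned = [normalize_silhouette(s) for s in cycle_silhouettes]
    return np.mean(aligned, axis=0)
```

Assuming `cycle_silhouettes` holds the binary foreground masks of one estimated gait cycle, `gait_energy_image(cycle_silhouettes)` yields a grayscale image whose bright regions correspond to body parts that stay still (head, torso) and whose gray regions capture limb motion.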
Abstract (English)  Gender classification has recently been deployed in many commercial systems in daily life. For example, a statistical module that collects consumers' genders and ages can be embedded in an advertising system, and an intelligent surveillance algorithm can analyze human gender and activities. Many gender classification algorithms proposed in the literature use face, voice, or gait features. However, face and voice features require close proximity to people, whereas human gait is a valid feature for gender classification at a long distance. The main challenge for gait-based gender classification is the camera view angle, because the human body is non-rigid. In addition to view angle, clothing, shoes, and carrying conditions also reduce performance.
In this work, a gait energy image (GEI)-based algorithm is proposed for gender classification. The GEIs are constructed from aligned gait silhouettes: the gait cycles are first estimated and the silhouettes are aligned from the input video sequences. After pre-processing, gait cycle estimation, and silhouette alignment, the GEIs are constructed and divided into several regions. Next, local texture features, either the local binary pattern (LBP) or the local directional pattern (LDP), are extracted from each region, and the per-region histograms are concatenated into a single feature vector (a sketch of this step appears after the abstract). An SVM-based classifier is then adopted for gender classification.
To evaluate the proposed method, experiments are conducted under several conditions, including various view angles, clothing variations, and carried bags. In addition to the setting in which both training and testing images are captured at a 90-degree view, test images at other view angles are classified using training images from the 90-degree view. The experimental results demonstrate high recognition rates.
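To make the feature-extraction and classification steps above concrete, the sketch below computes a basic 3×3 LBP code image, splits the GEI into an 8×4 grid of blocks, concatenates the per-block 256-bin histograms, and (in the commented usage) feeds the vector to a linear-kernel SVM from scikit-learn. The grid size, histogram binning, LBP variant, and SVM kernel are assumptions for illustration; the thesis's exact settings, and its LDP descriptor, are not given in this record.

```python
import numpy as np

def lbp_codes(img):
    """Basic 3x3 LBP: compare each pixel's 8 neighbours with the centre and
    pack the 8 comparison bits into a code in [0, 255]."""
    padded = np.pad(img, 1, mode='edge')
    center = padded[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),   # clockwise neighbour order
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(img.shape, dtype=np.int64)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = padded[1 + dy:padded.shape[0] - 1 + dy,
                       1 + dx:padded.shape[1] - 1 + dx]
        codes += (neigh >= center).astype(np.int64) * (1 << bit)
    return codes

def gei_lbp_feature(gei, grid=(8, 4)):
    """Split the GEI into grid[0] x grid[1] blocks, take a 256-bin histogram of
    LBP codes in each block, and concatenate the histograms into one vector."""
    codes = lbp_codes(gei)
    feats = []
    for band in np.array_split(codes, grid[0], axis=0):
        for block in np.array_split(band, grid[1], axis=1):
            hist, _ = np.histogram(block, bins=256, range=(0, 256))
            feats.append(hist / max(1, block.size))   # per-block normalisation
    return np.concatenate(feats)

# Hypothetical usage with a linear SVM (train_geis / train_labels are placeholders):
#   from sklearn.svm import SVC
#   X = np.stack([gei_lbp_feature(g) for g in train_geis])
#   clf = SVC(kernel='linear').fit(X, train_labels)         # 0 = female, 1 = male
#   pred = clf.predict(gei_lbp_feature(test_gei)[None, :])
```

Training the classifier on 90-degree GEIs and passing GEIs from other view angles through the same `gei_lbp_feature` extractor mirrors the cross-view evaluation protocol described above.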
Keywords ★ gender classification
★ gait
★ view angle
★ clothing variation
★ carrying condition (carried bag)
★ gait energy image
★ local binary pattern
★ local directional pattern
Table of Contents  Abstract (Chinese)
Abstract (English)
Acknowledgments
Table of Contents
List of Figures
List of Tables
Chapter 1  Introduction
1.1  Motivation
1.2  Literature Review
1.3  System Architecture
1.4  Thesis Organization
Chapter 2  Gait Images
2.1  Analysis and Discussion of Gait Images
2.2  Gait Image Preprocessing
2.2.1  Foreground Detection and Binarization
2.2.2  Foreground Alignment and Normalization
2.3  Gait Cycle Estimation
Chapter 3  Feature Extraction and Gender Classification
3.1  Feature Extraction: Local Binary Pattern
3.2  Feature Extraction: Local Directional Pattern
3.3  Multi-View Gait-Based Gender Classification
3.4  SVM Classifier
Chapter 4  Experimental Results and Discussion
4.1  Experimental Database
4.2  Experimental Results
Chapter 5  Conclusions and Future Work
References
Advisor: Kuo-chin Fan (范國清)    Date of Approval: 2011-07-26
