Electronic Theses and Dissertations: Detailed Record for Thesis 102521021




Author 章坤瀧 (Kung-Long Zhang)   Department Electrical Engineering
Thesis Title 複雜背景之3維深度手勢辨識與追蹤
(Three-Dimensional Hand Recognition and Tracking with Depth Information under Complicated Environments)
Related Theses
★ Low-memory hardware design for real-time SIFT feature extraction
★ Access-control system with real-time face detection and recognition
★ Autonomous vehicle with real-time automatic following
★ Lossless compression algorithm and implementation for multi-lead ECG signals
★ Offline customizable speaker wake-word system and embedded implementation
★ Wafer-map defect classification and embedded-system implementation
★ Densely connected convolutional network for small-footprint keyword spotting
★ G2LGAN: data augmentation on imbalanced datasets for wafer-map defect classification
★ Algorithm design techniques for compensating finite precision in multiplierless digital filters
★ Design and implementation of a programmable Viterbi decoder
★ Low-cost vector-rotator IP design based on extended elementary-angle CORDIC
★ Analysis and architecture design of a JPEG2000 still-image coding system
★ Low-power turbo decoder for communication systems
★ Platform-based design for multimedia communication
★ Design and implementation of a digital watermarking system for MPEG encoders
★ Algorithm development for video error concealment with data-reuse considerations
  1. Access rights: the author consented to immediate open access of the electronic thesis.
  2. The open-access electronic full text is licensed to users only for personal, non-profit retrieval, reading, and printing for the purpose of academic research.
  3. Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract (Chinese) With the development of mid-air gesture operation in recent years, user interfaces have gradually shifted from the traditional keyboard and mouse toward modes that better match human intuition, such as gesture control. This thesis proposes a hand gesture recognition and tracking algorithm based on three-dimensional depth. The system uses a low-cost dual camera to compute depth images, which not only provides the depth of each gesture but also lets the system operate correctly against severely cluttered backgrounds.
Most current hand detection algorithms use skin color or motion magnitude as a pre-processing step, but skin-color filtering and motion measures alone cannot keep the system functional when the background contains similar colors. This thesis proposes an adaptive skin-color and depth filter that effectively isolates the hand region the system needs and also improves the tracking algorithm. Finally, dynamic gesture recognition is carried out with the depth information. In tests with multiple users, the directional-movement function reaches 93.7% accuracy, the depth push/pull function 95.6%, the rotation function 94.5%, and dynamic gestures 85.92%.
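The adaptive skin-color and depth filtering described in the abstract can be sketched as below. The YCrCb thresholds and the depth margin here are illustrative assumptions, not the thesis's actual parameters.

```python
import numpy as np

def skin_depth_mask(ycrcb, depth, cr_range=(133, 173), cb_range=(77, 127),
                    depth_margin=150):
    """Toy sketch: combine a YCrCb skin-color mask with an adaptive depth
    window anchored on the nearest skin-colored pixel, which is assumed
    to belong to the hand (normally held in front of the background)."""
    cr, cb = ycrcb[..., 1].astype(int), ycrcb[..., 2].astype(int)
    skin = ((cr >= cr_range[0]) & (cr <= cr_range[1]) &
            (cb >= cb_range[0]) & (cb <= cb_range[1]))
    if not skin.any():
        return skin  # nothing skin-colored in view
    # Adaptive depth threshold: keep only skin pixels lying within
    # depth_margin (same unit as `depth`, e.g. mm) of the closest
    # skin-colored point.
    nearest = depth[skin].min()
    return skin & (depth <= nearest + depth_margin)
```

The point of the depth term is that skin-colored background (wood, faces behind the hand) fails the depth window even when it passes the color test, which is exactly the failure mode of color-only filtering that the abstract describes.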
Abstract (English) With the development of mid-air control systems in recent years, users have gradually shifted from the traditional keyboard and mouse to more intuitive interaction such as hand gesture control. This thesis proposes a hand gesture recognition and tracking algorithm based on depth information. We use a stereo camera to capture image pairs and compute a depth map; the system not only provides depth information but also works against demanding, cluttered backgrounds.
Most hand detection methods apply a skin-color filter or a motion filter as a pre-processing step. However, skin or motion filtering alone cannot keep the system working correctly when background pixels are close to skin color. The proposed method adopts an adaptive depth filter that separates the foreground hand region and improves the performance of the tracking algorithm. We also propose dynamic gesture recognition using the depth data. The accuracy of the direction function is 93.7%, the push/pull function 95.6%, the rotation function 94.5%, and dynamic gestures 85.92%.
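As a rough illustration of the stereo-to-depth step mentioned above, the sketch below matches a pixel along a rectified scanline with a one-dimensional sum-of-absolute-differences (SAD) search and converts the resulting disparity to depth via Z = f·B/d. The window size, disparity range, focal length, and baseline are made-up values for illustration, not the camera parameters used in the thesis.

```python
def sad_disparity(left_row, right_row, x, window=1, max_disp=16):
    """Find the disparity at column x of a rectified scanline pair by
    minimising the sum of absolute differences over a small window."""
    best_d, best_cost = 0, float("inf")
    patch = left_row[x - window:x + window + 1]
    for d in range(min(max_disp, x - window) + 1):
        # A point at column x in the left image appears at x - d on the
        # same row of the right image.
        cand = right_row[x - d - window:x - d + window + 1]
        cost = sum(abs(a - b) for a, b in zip(patch, cand))
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def disparity_to_depth(d, focal_px=700.0, baseline_mm=60.0):
    """Triangulate depth (mm) from disparity (pixels): Z = f * B / d."""
    return float("inf") if d <= 0 else focal_px * baseline_mm / d
```

For example, a feature at column 5 of the left row that reappears at column 2 of the right row yields disparity 3; closer objects produce larger disparities and hence smaller depths, which is what lets the system favor the hand over the background.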
Keywords (Chinese) ★ hand gesture recognition
★ dual-camera depth
★ dynamic gestures
Keywords (English)
Table of Contents
摘要 (ABSTRACT IN CHINESE)
ABSTRACT
TABLE OF CONTENTS
LIST OF FIGURES
LIST OF TABLES
CHAPTER 1 Introduction
  1.1 BACKGROUND
  1.2 MOTIVATION
  1.3 THESIS ORGANIZATION
CHAPTER 2 Related Works
  2.1 OVERVIEW
  2.2 HAND GESTURE RECOGNITION
  2.3 DEPTH INFORMATION EXTRACTION
    2.3.1 Kinect
    2.3.2 Stereo Matching
  2.4 HAND TRACKING
CHAPTER 3 Proposed Algorithm
  3.1 OVERVIEW
  3.2 PRE-PROCESSING
  3.3 DEPTH EXTRACTION
  3.4 SKIN PROCESSING
  3.5 ADAPTIVE DEPTH DYNAMIC THRESHOLD
  3.6 HAND DETECTION AND TRACKING
  3.7 GESTURE RECOGNITION
CHAPTER 4 Experimental Results and Analysis
CHAPTER 5 Conclusion
REFERENCES
Advisor 蔡宗漢 (Tsung-Han Tsai)   Date of Approval 2016-7-21
