Name: Tsai, Sung-yang (蔡嵩陽)
Department: Computer Science and Information Engineering
Thesis title: 即時手型辨識系統及其於家電控制之應用 (A Real-Time Hand Gesture Recognition System and its Application in Manipulating Household Appliances)
- The full text of this electronic thesis has been approved for immediate open access.
- The open-access electronic full text is licensed to users for academic research purposes only: personal, non-profit retrieval, reading, and printing.
- Please comply with the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.
Abstract (Chinese): In recent years, hand gesture recognition has attracted researchers from many fields to the study of human-computer interaction. Common applications include game control, robot-arm operation, robot control, and household appliance control; this simple and intuitive way of operating devices can replace the traditional remote control.
This thesis presents a real-time, image-based hand gesture recognition system and applies it to the remote control of household appliances. The implementation first preprocesses each image captured by the camera and extracts skin-color regions to retain the arm, then applies an arm-cutting algorithm to segment the hand. Two kinds of features describe the hand shape: first, the distance curve formed by the distances from every point on the hand contour to the hand centroid; second, the frequency-domain features obtained by applying the fast Fourier transform to that curve. Combining these two features with KNN and a decision-tree classifier, the system accurately recognizes eleven hand gestures. Combinations of these gestures and left/right waving motions are translated into appliance-control commands and sent by wireless transmission to a self-built environment control unit, which emits infrared signals to control the appliances. The system has been successfully tested with a TV, a radio, and an electric fan.
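The record reproduces no source code; as a rough illustration of the preprocessing steps named above (skin-color extraction, erosion and dilation, and retaining the arm region), a minimal Python/OpenCV sketch follows. The YCrCb color space, the threshold values, the kernel size, and the helper name `extract_arm_mask` are illustrative assumptions, not the thesis's actual skin model or algorithm.

```python
# Minimal sketch (assumed details, OpenCV >= 4): skin-color segmentation,
# erosion/dilation cleanup, and largest-blob selection for the arm region.
import cv2
import numpy as np

def extract_arm_mask(frame_bgr):
    """Return a binary mask of the largest skin-colored blob (assumed to be the arm)."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))            # generic Cr/Cb skin range
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=2)  # erosion then dilation
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)        # labeling step
    if n <= 1:                                                           # no skin region found
        return np.zeros_like(mask)
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))           # skip background label 0
    return np.where(labels == largest, 255, 0).astype(np.uint8)
```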
Eight subjects were invited to use the gesture recognition system. The experiments did not restrict which hand was used, required no markers or gloves, and placed no restriction on long or short sleeves. The first experiment analyzed only the recognition rate of the eleven hand gestures and achieved 91% accuracy. In the second experiment, we designed operating scenarios for three household appliances and recorded the time each subject needed to issue a sequence of appliance commands; the results show that these times were all within a range acceptable to users.
Besides the PC implementation, to make the whole system compact and portable enough to be placed in any home, we also implemented it on an embedded digital signal processor; in a well-lit indoor environment its performance is very similar to that of the PC version.
Abstract (English): Recently, in the field of human-computer interaction, hand gesture recognition has attracted researchers from many different domains. Its applications range from game control and robot-arm operation to robot control and household appliance control. Due to its convenience and intuitiveness, a hand-gesture-based controller has the potential to replace a traditional remote control.
This thesis presents a real-time, image-based hand gesture recognition system and applies it to household appliance control. The implementation is as follows. First, we locate the arm region in an image captured by a web camera using several image preprocessing operators and skin-color information. We then use an arm-cutting algorithm to cut the hand out of the arm region. The hand shape is represented by two kinds of features. The first is the so-called "distance curve" or "signature", a 1-D functional representation of the hand shape formed by plotting the distance from every point on the hand boundary to the hand's center of mass. The second consists of the frequency-domain parameters obtained by applying the FFT to the distance curve. Finally, a KNN classifier combined with a decision tree recognizes 11 hand gestures from these two kinds of features. Combinations of the 11 hand gestures and 2 waving motions serve as commands for several household appliances. Recognized commands are transmitted wirelessly to an environment control unit, which translates them into infrared signals directed at the appliance to be controlled. Our system has been successfully tested with a TV, a radio, and a fan.
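To make the two feature types and the nearest-neighbor step concrete, a minimal sketch follows, assuming a binary hand mask as input. The resampling length `n_samples`, the number of retained FFT coefficients `n_coeffs`, the DC normalization, and the plain majority-vote KNN are illustrative assumptions; the decision-tree stage used in the thesis is not reproduced here.

```python
# Minimal sketch (assumed details): signature / distance-curve feature,
# FFT-based frequency features, and a plain k-NN vote.
import numpy as np
import cv2

def distance_curve(hand_mask, n_samples=128):
    """Signature feature: distances from resampled contour points to the hand centroid."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    boundary = max(contours, key=cv2.contourArea).squeeze(axis=1)   # (N, 2) boundary points
    m = cv2.moments(hand_mask, binaryImage=True)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]               # center of mass
    dists = np.hypot(boundary[:, 0] - cx, boundary[:, 1] - cy)
    idx = np.linspace(0, len(dists) - 1, n_samples).astype(int)     # fixed-length resampling
    curve = dists[idx]
    return curve / curve.max()                                      # scale-invariant

def fft_features(curve, n_coeffs=16):
    """Frequency-domain descriptor: low-frequency magnitudes of the signature's FFT."""
    spectrum = np.abs(np.fft.rfft(curve))
    return spectrum[1:n_coeffs + 1] / (spectrum[0] + 1e-9)          # normalize by the DC term

def knn_classify(feature, train_feats, train_labels, k=3):
    """Majority vote among the k nearest training samples (Euclidean distance, NumPy arrays)."""
    order = np.argsort(np.linalg.norm(train_feats - feature, axis=1))[:k]
    labels, counts = np.unique(train_labels[order], return_counts=True)
    return labels[np.argmax(counts)]
```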
Eight subjects were invited to test the proposed hand gesture recognition system. In the experiments, subjects could use either their right or left hand, did not need to wear any accessory such as a glove, and faced no restriction on long or short sleeves. The first experiment measured the recognition rate of the 11 hand gestures and achieved a 91% recognition rate. The second experiment evaluated the time performance of the proposed system in several scenarios for controlling three household appliances; performance was measured as the time a subject took to finish the sequence of commands in each scenario. The results show that the time performance was acceptable.
To make the system compact and portable, we also implemented it on a DSP platform. Under appropriate lighting conditions, the DSP-based system achieves performance very similar to that of the PC-based system.
Keywords (Chinese): ★ hand gesture recognition
★ human-computer interaction
★ household appliance control
Keywords (English): ★ household appliance control
★ hand gesture recognition
★ Human Computer Interaction
Table of Contents:
Chinese Abstract
Abstract
Acknowledgements
Table of Contents
List of Figures
List of Tables
Chapter 1: Introduction
1-1 Motivation
1-2 Objectives
1-3 Thesis organization
Chapter 2: Related Work
2-1 Sensor-based gesture recognition
2-2 Vision-based gesture recognition
2-3 Related research and products on gesture-based appliance control
Chapter 3: System Hardware Overview
3-1 System architecture
3-2 Interface control unit
3-3 Appliance control unit and appliance receiver
Chapter 4: Research Methods and Procedures
4-1 PC-based and DSP-based image input
4-2 Definition of eleven hand gestures and two waving gestures
4-3 Hand detection
4-3-1 Skin-color detection
4-3-2 Erosion and dilation algorithms
4-3-3 Labeling algorithm
4-4 Arm cutting
4-5 Edge extraction
4-6 Contour description
4-7 Feature extraction
4-8 Recognition methods
4-8-1 Classification of the two waving gestures
4-8-2 Recognition of the eleven hand gestures
4-9 DSP implementation
Chapter 5: Experimental Results
5-1 Recognition rate analysis of the two waving gestures
5-2 Recognition rate analysis of the eleven hand gestures
5-2-1 Experiment 1-1: KNN results
5-2-2 Experiment 1-2: KNN-Decision Tree results
5-2-3 Experiment 1-3: Multilayer perceptron results
5-3 Appliance-control scenario experiments
Chapter 6: Conclusions and Future Work
6-1 Conclusions
6-2 Future work
References
Appendix 1
Appendix 2
Appendix 3
Advisor: Su, Mu-chun (蘇木春)  Date of approval: 2011-07-12