Master's/Doctoral Thesis 955201073 — Detailed Record




Author: 黃俊捷 (Jhun-chieh Huang)    Department: Department of Electrical Engineering
Thesis Title: The Design and Realization of Interactive Biped Robots (I): Gesture Recognition
Related Theses
★ Control of a direct methanol fuel cell hybrid power system
★ Water quality monitoring of hydroponic plants using refractive index detection
★ A DSP-based automatic guidance and control system for a model car
★ Redesign of motion control for a rotary inverted pendulum
★ Fuzzy control decision-making for freeway on-ramp and off-ramp signals
★ On the fuzziness of fuzzy sets
★ Further improvement of the motion control performance of a dual-mass spring-coupled system
★ A machine vision system for air hockey
★ Robot offense and defense control for air hockey
★ Attitude control of a model helicopter
★ Stability analysis and design of fuzzy control systems
★ A real-time recognition system for access control monitoring
★ Air hockey: human versus robot arm
★ A mahjong tile recognition system
★ Application of correlation-error neural networks to radiometric measurement of vegetation and soil water content
★ Standing control of a three-link robot
  1. This electronic thesis has been approved for immediate open access.
  2. The open-access electronic full text is licensed only for personal, non-profit academic research: searching, reading, and printing.
  3. Please observe the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast this work without authorization.

Abstract (Chinese) "The Design and Realization of Interactive Biped Robots" was completed jointly by three students and is divided into three parts: (A) gesture recognition, (B) biped robot control, and (C) interactive algorithm execution. This thesis addresses part (A).
This thesis designs a gesture recognition system for user-robot interaction. The user issues gesture commands to two stand-alone biped robots, a Master and a Slave, which then perform interactive tasks (solo transport, handshaking, relay transport, and cooperative transport) or basic motions (walking forward and backward, turning left and right, and push-ups); different static and dynamic gestures define different commands. For static gesture recognition, the arm is first removed from the hand image so that only the palm remains. The palm is then converted, by a radar-like scan, into an angle-distance curve from which the number of fingers is determined; gestures with the same finger count are then distinguished by the angles between the fingers. For dynamic gesture recognition, the difference in fingertip positions, or in palm pixel counts, between successive frames serves as the decision criterion: if the difference exceeds a preset threshold, the gesture is classified as a left-right swing or a near-far movement. In total, nine static gestures and four dynamic gestures can be recognized.
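The radar-scan finger counting described in the abstract can be sketched as follows. This is a minimal illustration only: the thesis does not publish its code, so the function names, the peak-counting rule, and the fixed distance threshold are all assumptions for demonstration.

```python
import math

def angle_distance_curve(contour, center):
    """Convert a palm contour into an angle-vs-distance curve.

    For every contour point, record its polar angle and its distance
    from the palm centroid, then sort by angle -- the "radar scan"
    described in the abstract. (Names are illustrative, not from
    the thesis.)
    """
    cx, cy = center
    curve = []
    for x, y in contour:
        angle = math.degrees(math.atan2(y - cy, x - cx)) % 360.0
        dist = math.hypot(x - cx, y - cy)
        curve.append((angle, dist))
    curve.sort()
    return curve

def count_fingers(curve, threshold):
    """Count fingers as peaks of the angle-distance curve.

    Each contiguous run of contour points farther than `threshold`
    from the centroid is treated as one finger peak; the threshold
    value would be tuned experimentally, as Section 3.2.2 of the
    table of contents suggests.
    """
    fingers = 0
    above = False
    for _, dist in curve:
        if dist > threshold and not above:
            fingers += 1
            above = True
        elif dist <= threshold:
            above = False
    return fingers
```

For example, a curve with two runs of points beyond the threshold, such as `count_fingers([(0, 1), (10, 5), (20, 1), (30, 6), (40, 1)], 3)`, yields a finger count of 2.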
Abstract (English) The study "The design and realization of interactive biped robots" was completed by three members, who accomplished the following three tasks, respectively: (A) gesture recognition, (B) control of the basic motions of biped robots, and (C) execution of an interactive algorithm for two robots. This thesis focuses on part (A), gesture recognition.
The goal of this research is to design a gesture recognition system for human-robot interaction. The operator uses different gestures to command two stand-alone robots, "Master" and "Slave", to perform basic motions and four interactive motions. The basic motions are walking forward and backward, turning right and left, shifting right and left, and doing push-ups. The interactive motions are: 1. the Master robot transports an object from its initial position to the destination; 2. the two robots walk toward each other and shake hands; 3. the Slave robot passes an object to the Master robot, which then transports it to the destination; and 4. the two robots carry an object together and transport it to the destination. Each static or dynamic gesture serves as one command for the robots. In static gesture recognition, the first step removes the arm from the hand image and retains the palm. The second step transforms the palm into a curve of angle against distance, measured from the palm's center point to each of its contour points. By analyzing this curve, the number of fingers is determined. The third step distinguishes gestures that have the same number of fingers by the angles between the recognized fingers. In dynamic gesture recognition, the variation in fingertip position, or in the number of palm pixels, between two successive images is computed. If the difference exceeds a threshold, the gesture is identified as swinging right and left or as moving backward and forward. The system recognizes nine static gestures and four dynamic gestures.
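The thresholded frame-to-frame comparison for dynamic gestures might look like the sketch below. The thesis gives only the decision principle, so the function signature, the feature choices (fingertip x-coordinate and palm pixel count), and both threshold parameters are hypothetical.

```python
def classify_dynamic(prev_tip_x, curr_tip_x,
                     prev_palm_pixels, curr_palm_pixels,
                     swing_threshold, depth_threshold):
    """Classify a dynamic gesture from two successive frames.

    A large horizontal shift of the fingertip indicates a left/right
    swing; a large change in the palm's pixel count indicates the
    hand moving toward or away from the camera. Both thresholds
    would be tuned experimentally.
    """
    dx = curr_tip_x - prev_tip_x
    dpix = curr_palm_pixels - prev_palm_pixels
    if abs(dx) > swing_threshold:
        return "swing right" if dx > 0 else "swing left"
    if abs(dpix) > depth_threshold:
        return "move closer" if dpix > 0 else "move away"
    return "static"
```

With a swing threshold of 40 pixels and a depth threshold of 500 pixels, a fingertip jump from x=100 to x=160 would classify as "swing right", while a palm growing from 5000 to 5800 pixels with little lateral motion would classify as "move closer".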
Keywords (Chinese) ★ image processing
★ gesture recognition
Keywords (English) ★ image processing
★ gesture recognition
Table of Contents
Chapter 1 Introduction
1.1 Motivation and Purpose
1.2 Literature Review
1.3 Thesis Objectives
1.4 Organization of the Thesis
Chapter 2 System Architecture and Experimental Environment
2.1 System Architecture
2.1.1 Robot-Side Architecture
2.1.2 PC-Side Architecture
2.2 Experimental Environment
2.3 Development Environment and Human-Machine Interface
2.4 System Flow
Chapter 3 Gesture Recognition
3.1 Hand Region Extraction
3.1.1 Image Capture
3.1.2 Noise Filtering
3.1.3 Background Subtraction
3.1.4 Skin Color Detection
3.1.5 Opening and Closing Operations
3.1.6 Connected-Component Labeling
3.2 Arm Removal
3.2.1 Recording Contour-to-Centroid Distances
3.2.2 Threshold Setting
3.2.3 Misjudged Finger Counts
3.2.4 Removing the Arm at the Image Edge
3.2.5 Removing the Arm Again
3.3 Static and Dynamic Gesture Recognition
3.3.1 Static Gestures
3.3.2 Dynamic Gestures
Chapter 4 Experimental Results
4.1 Static Gesture Recognition Results
4.2 Dynamic Gesture Recognition Results
4.3 Commanding by Gesture
Chapter 5 Conclusions and Future Work
5.1 Conclusions
5.2 Future Work
References
References [1] N. Kohl and P. Stone, "Policy gradient reinforcement learning for fast quadrupedal locomotion," The Proceedings of IEEE International Conference on Robotics and Automation, New Orleans, LA, April 26-May 1, 2004, pp. 2619-2624.
[2] L. Geppert, “Qrio, the robot that could,” IEEE Spectrum, vol. 41, pp. 34-37, 2004.
[3] M. Veloso, N. Armstrong-Crews, S. Chernova, E. Crawford, C. McMillen, M. Roth, and D. Vail, “A team of humanoid game commentators,” The Proceedings of IEEE-RAS International Conference on Humanoid Robots, Genoa, Italy, December 4-6, 2006, pp. 228-233.
[4] J. Chestnutt, M. Lau, G. Cheung, J. Kuffner, J. Hodgins, and T. Kanade, “Footstep planning for the Honda ASIMO humanoid,” The Proceedings of IEEE International Conference on Robotics and Automation, Barcelona, Spain, April 18-22, 2005, pp. 629-634.
[5] K. T. Tseng, W. F. Huang, and C. H. Wu, “Vision-based finger guessing game in human machine interaction,” The Proceedings of IEEE International Conference on Robotics and Biomimetics, Kunming, China, December 17-20, 2006, pp. 619-624.
[6] S. Wagner, B. Alefs, and C. Picus, “Framework for a portable gesture interface,” The Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition, April 10-12, 2006, pp. 275-280.
[7] L. Gupta and M. Suwei, “Gesture-based interaction and communication: automated classification of hand gesture contours,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 31, pp. 114-120, 2001.
[8] P. Premaratne and Q. Nguyen, “Consumer electronics control system based on hand gesture moment invariants,” IET on Computer Vision, vol. 1, pp. 35-41, 2007.
[9] K. Abe, H. Saito, and S. Ozawa, “Virtual 3-D interface system via hand motion recognition from two cameras,” IEEE Transactions on Systems, Man and Cybernetics, Part A, vol. 32, pp. 536-540, 2002.
[10] E. J. Holden and R. Owens, “Recognizing moving hand shapes,” The Proceedings of International Conference on Image Analysis and Processing, September 17-19, 2003, pp. 14-19.
[11] X. Yin and X. Zhu, “Hand posture recognition in gesture-based human-robot interaction,” The Proceedings of IEEE Conference on Industrial Electronics and Applications, May 24-26, 2006, pp. 1-6.
[12] 曹文潔 (advised by Prof. 王文俊), "Finger-guessing (rock-paper-scissors) machine," Master's thesis, Department of Electrical Engineering, National Central University, 2007.
[13] K. K. Kim, K. C. Kwak, and S. Y. Ch, “Gesture analysis for human-robot interaction,” The Proceedings of IEEE Conference on Advanced Communication Technology, vol. 32, pp. 1824-1827, February 20-22, 2006.
[14] A. Wu, M. Shah, and N. da Vitoria Lobo, “A virtual 3D blackboard: 3D finger tracking using a single camera,” The Proceedings of IEEE Conference on Automatic Face and Gesture Recognition, vol. 4, pp. 536-543, 2000.
[15] T. Niidome, K. Ishii, and R. Ishii, “Recognition of a hand shape for application in a human interface,” The Proceedings of IEEE Conference on Industrial Electronics Society, San Jose, USA, November 29-December 3, 1999, pp. 565-570.
[16] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 2nd Edition, Prentice Hall, 2002.
[17] 黃文吉, C++ Builder and Image Processing (in Chinese), 儒林圖書有限公司, 2006.
[18] 巫瑞永 (advised by Prof. 王文俊), "The design and realization of interactive biped robots (II) – biped robot control," Master's thesis, Department of Electrical Engineering, National Central University, 2008.
[19] 余建良 (advised by Prof. 王文俊), "The design and realization of interactive biped robots (III) – execution of interactive algorithm," Master's thesis, Department of Electrical Engineering, National Central University, 2008.
Advisor: 王文俊 (Wen-June Wang)    Date of Approval: 2008-06-24

For questions about theses, please contact the Extension Services Division, National Central University Library, TEL: (03)422-7151 ext. 57407, or by e-mail. – Privacy Policy Statement