Master's/Doctoral Thesis 104522050: Detailed Record




Author: Cheng, Yu-Ting (鄭羽婷)    Department: Computer Science and Information Engineering
Thesis Title: Using Leap Motion for Manipulating Real-time Digital Puppet Models
(Chinese title: 使用Leap Motion進行即時布偶操作)
Related Theses
★ A Grouping Mechanism Based on Social Relationships in edX Online Discussion Forums
★ A 3D Visualized Facebook Interaction System Built with Kinect
★ An Assessment System for Smart Classrooms Built with Kinect
★ An Intelligent Metropolitan Route Planning Mechanism for Mobile Device Applications
★ Dynamic Texture Transfer Based on Analysis of Key Momentum Correlations
★ A Seam Carving System That Preserves Straight-Line Structures in Images
★ A Community Recommendation Mechanism Built on an Open Online Community Learning Environment
★ System Design of an Interactive Situated Learning Environment for English as a Foreign Language
★ An Emotional Color Transfer Mechanism Based on Skin-Color Preservation
★ A Gesture Recognition Framework for Virtual Keyboards
★ Error Analysis of Fractional-Power Grey Generating Prediction Models and Development of a Computer Toolbox
★ Real-time Human Skeleton Motion Construction Using Inertial Sensors
★ Real-time 3D Modeling Based on Multiple Cameras
★ A Grouping Mechanism for Genetic Algorithms Based on Complementarity and Social Network Analysis
★ A Virtual Musical Instrument Performance System with Real-time Hand Tracking
★ A Real-time Virtual Musical Instrument Performance System Based on Neural Networks
  1. The author has agreed to make this electronic thesis available immediately.
  2. The released electronic full text is licensed only for personal, non-commercial retrieval, reading, and printing for the purpose of academic research.
  3. Please comply with the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast this work without authorization.

Abstract (Chinese) Human-computer interaction has become an especially popular topic in recent years; it is a technology our future lives will not be able to do without, and interaction between humans and computers will only increase, never decrease. This thesis proposes real-time virtual puppet manipulation, aiming to combine the gradually declining art of puppet theater with technology, present it in a completely new form, eliminate the high cost of puppets and stages, and let a single person easily produce a puppet show. We adopt the Leap Motion as the only sensor for hand detection: it tracks the hand and detects specific gestures, the fingers are mapped to the virtual puppet's hands and head, and inverse kinematics is used to compute the rotation angles of the child joints so that the linked joints take natural poses and the puppet follows the performer's hand. A variety of stages provide different interactive objects, such as doors that can be opened and closed, or items the puppet can pick up when directed by specific gestures, giving puppet manipulation more possibilities. The results meet our expectations; a demonstration video was recorded, and the system can be used with simple equipment and easy-to-learn operation.
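The record does not specify how the child-joint angles are actually computed; as a minimal sketch, assuming a standard two-bone analytic inverse-kinematics solution (law of cosines) with a hypothetical function name and bone-length parameters, the bend angle of the child joint could be obtained as follows:

```python
import math

def two_bone_ik_bend_angle(root, target, upper_len, lower_len):
    """Bend angle (radians) of the child joint in a two-bone chain
    (e.g. shoulder -> elbow -> wrist) reaching from `root` toward `target`.

    root, target: (x, y, z) tuples; upper_len, lower_len: bone lengths.
    0 means fully extended; larger values mean a sharper bend.
    This is a generic law-of-cosines solution, not the thesis' exact method.
    """
    # Distance from the chain root to the IK target, clamped to the range
    # the two bones can actually reach.
    d = math.dist(root, target)
    d = max(abs(upper_len - lower_len), min(d, upper_len + lower_len))

    # Law of cosines gives the interior angle at the child joint.
    cos_interior = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    interior = math.acos(max(-1.0, min(1.0, cos_interior)))

    # The bend angle is the supplement of the interior angle.
    return math.pi - interior
```

In a Unity scene, an angle of this kind would typically be applied as the child joint's local rotation while the parent joint is aimed toward the target.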
Abstract (English) In recent years there has been an urgent need for new ways to interact with devices more directly. Controlling a device with only a keyboard and mouse feels indirect and unintuitive, and users want to interact with computers more naturally. Human-Computer Interaction (HCI) has therefore become an important issue in computer science, and virtual reality is a priority for future development. HCI makes it easier for people to communicate with computers; the goal is to provide as good an interactive experience as possible.
Puppet theater is a traditional form of performance seen in many countries, in which actors control puppets in front of an audience, following a story outline and adding their own creative improvisation to complete the performance. Pili glove puppetry is the most important form in Taiwan and has been a cherished childhood memory for many people. Recently this traditional art has become less popular because of its high cost, including a professional stage, expensive puppets, and the long preparation time for each show. To put on a show, a performer must set up the stage and prepare many kinds of puppets, which is very difficult for beginners; young people prefer more fashionable entertainment and are reluctant to take it up.
In this paper, we combine puppet theater with technology and propose a method that lets users perform a puppet show on a computer. We designed a simple environment for those who want to perform puppet theater but do not have the necessary equipment. The Leap Motion is the only sensor we use: it captures the user's hand gestures, and the result of controlling the puppet is shown on the computer screen. Unity renders the visuals and Maya is used to design the models. Fingers are mapped to the hands and head of the puppet, and the joint angles are computed with inverse kinematics. Users can choose different puppets, scenes, and music in our system, so anyone can easily perform a real-time puppet show by themselves.
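As an illustration of the finger-to-puppet mapping described above (the record gives no code; the finger-to-part assignment, the names PUPPET_PART_BY_FINGER and puppet_ik_targets, and the input format are assumptions), fingertip and palm positions already read from a Leap Motion frame could be turned into IK targets for the puppet's head and hands roughly like this:

```python
# A minimal sketch, assuming palm and fingertip positions (x, y, z), in
# millimetres in Leap Motion coordinates, have already been read from a frame.
# Which finger drives which puppet part is an illustrative assumption.
PUPPET_PART_BY_FINGER = {
    "thumb":  "left_hand",
    "middle": "head",
    "pinky":  "right_hand",
}

def puppet_ik_targets(palm_pos, fingertips, scale=0.01):
    """Convert fingertip positions into IK target positions (scene units)
    for the puppet parts, expressed relative to the palm so that the
    puppet follows the pose of the performer's hand."""
    targets = {}
    for finger, part in PUPPET_PART_BY_FINGER.items():
        tip = fingertips[finger]
        targets[part] = tuple(scale * (t - p) for t, p in zip(tip, palm_pos))
    return targets

# Each target would then be fed to an IK solver such as the two-bone sketch
# above, e.g. two_bone_ik_bend_angle(shoulder_pos, targets["left_hand"], 0.2, 0.2).
```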
Keywords (Chinese) ★ Puppet show
★ Hand tracking
★ Gesture
★ Real-time
Keywords (English) ★ Puppet show
★ Hand tracking
★ Gesture
★ Leap Motion
★ Inverse kinematics
★ Real-Time
Table of Contents
摘要 i
Abstract ii
Acknowledgements iv
Contents v
List of Figures vii
List of Tables x
Chapter 1. Introduction 1
1.1 Background 1
1.2 Motivation 4
1.3 Thesis Organization 5
Chapter 2. Related Work 6
2.1 Digital puppet show 6
2.2 The Leap Motion 9
2.2.1 Introduction 9
2.2.2 Observations 12
2.2.3 Finger data for Leap Motion 14
2.2.4 Features data for Leap Motion 16
2.3 The Leap Motion with HCI 17
2.4 The Leap Motion with the Unity 19
2.5 Inverse kinematics and forward kinematics 23
2.6 Our System 27
Chapter 3. Proposed Method 28
3.1 Finger and palm data 28
3.2 Control puppets 31
3.3 Different types of the result 45
3.4 The puppets and scenes 48
3.5 System pseudocode 49
Chapter 4. Experiment Results 50
4.1 Equipment 50
4.2 Inverse kinematics result 52
4.3 Result of drama 53
Chapter 5. Conclusion and Future Work 58
5.1 Conclusion 58
5.2 Future Work 58
References 59
Advisor: 施國琛    Date of Approval: 2017-07-12
