Master's/Doctoral Thesis 995202056 — Detailed Record




Name: Hung-Yi Lin (林紘毅)   Graduating Department: Computer Science and Information Engineering
Thesis Title: Kinect於英語互動學習系統之應用
(An Application of Kinect in an English Interactive Learning System)
Related Theses
★ A Q-learning-based swarm intelligence algorithm and its applications
★ Development of a rehabilitation system for children with developmental delays
★ Comparing teacher assessment and peer assessment from the perspective of cognitive styles: from English writing to game making
★ A prediction model for diabetic nephropathy based on laboratory test values
★ Design of a remote-sensing image classifier based on fuzzy neural networks
★ A hybrid clustering algorithm
★ Development of assistive devices for people with disabilities
★ A study on fingerprint classifiers
★ A study on backlit image compensation and color quantization
★ Application of neural networks to business income tax audit case selection
★ A new online learning system and its application to tax audit case selection
★ An eye-tracking system and its application to human-machine interfaces
★ Data visualization combining swarm intelligence and self-organizing maps
★ Development of a pupil-tracking system for human-machine interfaces for people with disabilities
★ An artificial-immune-system-based online learning neuro-fuzzy system and its applications
★ Application of genetic algorithms to descrambling scrambled speech
  1. The electronic full text of this thesis is authorized for immediate open access.
  2. Once open access takes effect, users are authorized only to search, read, and print the electronic full text for personal, non-profit academic research purposes.
  3. Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the text without authorization.

Abstract (Chinese) This thesis presents two systems that apply the Kinect to interactive English learning. The first is an augmented reality (AR) system that assists teachers in teaching English vocabulary. The Kinect captures five body postures (stop, forward, backward, turn left, and turn right), and the corresponding commands are transmitted over a radio-frequency (RF) link to a mobile robot car, which drives on a virtual English spelling game board. Children learn vocabulary through the game, compete and compare with one another, and then discuss and learn together, so that education becomes entertainment. The intuitive, gesture-based operation also makes controlling the car more natural.
The second system is a robot-based interactive learning system intended to support teachers who use the Total Physical Response (TPR) method in class. The Kinect captures the user's skeleton data, and Dynamic Time Warping (DTW) or a Hidden Markov Model (HMM) determines whether the performed action is correct. The system provides teachers with a convenient lesson-editing interface and, with the teacher's guidance, lets students use the system on their own to explore the meanings of actions and English expressions. By expressing English meaning through body movements and spoken language, supplemented by an interactive robot, students learn the meaning of English through real speech and action rather than rote memorization.
Abstract (English) This thesis presents two Kinect-based interactive English learning systems. The first is an augmented reality (AR) system that assists teachers in teaching English vocabulary. The system uses the Kinect to recognize five body postures (stop, forward, backward, turn left, and turn right) and transmits the corresponding commands over a radio-frequency (RF) link to mobile robot cars. The cars drive on a virtual English word-game board to help students learn vocabulary; children learn from one another through the game and then discuss what they have learned, so the system serves as edutainment. The intuitive gesture control also makes operating the cars more natural.
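The posture-to-command pipeline described above could be sketched as follows. The joint names, threshold values, and command bytes here are illustrative assumptions, not the thesis's actual recognition rules or RF protocol:

```python
# Hedged sketch of the posture-to-command step: classify a Kinect skeleton
# into one of the five commands (stop / forward / backward / turn left /
# turn right) and look up a one-byte RF command. All joint names,
# thresholds, and command byte values below are hypothetical.

# Hypothetical command bytes for the RF link.
COMMANDS = {"stop": 0x00, "forward": 0x01, "backward": 0x02,
            "left": 0x03, "right": 0x04}

def classify_posture(skeleton):
    """Map a few joint positions to a command name.

    `skeleton` is a dict of joint name -> (x, y, z) in meters,
    Kinect camera space; only a handful of joints are needed
    for these simple illustrative rules.
    """
    head_y = skeleton["head"][1]
    rhand_y = skeleton["right_hand"][1]
    lhand_y = skeleton["left_hand"][1]
    torso_z = skeleton["torso"][2]
    if rhand_y > head_y and lhand_y > head_y:
        return "stop"        # both hands raised above the head
    if rhand_y > head_y:
        return "right"       # only the right hand raised
    if lhand_y > head_y:
        return "left"        # only the left hand raised
    if torso_z < 2.0:
        return "forward"     # user stepped toward the sensor
    return "backward"        # otherwise, user stands farther back

# Example skeleton: right hand above the head, so the "right" command fires.
skeleton = {"head": (0.0, 1.6, 2.5), "torso": (0.0, 1.0, 2.5),
            "right_hand": (0.3, 1.8, 2.4), "left_hand": (-0.3, 1.0, 2.5)}
print(COMMANDS[classify_posture(skeleton)])
```

In a real deployment the command byte would then be written to the RF transmitter instead of printed.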
The second system is a robot-based interactive learning system that helps teachers apply the Total Physical Response (TPR) method in class. The system captures the user's skeleton with the Kinect and uses Dynamic Time Warping (DTW) or a Hidden Markov Model (HMM) to determine whether a performed action is correct. It provides teachers with a convenient lesson-editing interface and, with the teacher's guidance, lets students explore the meanings of actions and English expressions on their own. An interactive robot accompanies the learning, so students acquire English meaning through real speech and movement rather than rote memorization.
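Of the two matching methods named above, DTW is the simpler to illustrate: it aligns a student's movement sequence against a teacher's template even when the two differ in speed. A minimal sketch, assuming 1-D sequences of a single joint angle as the feature (the thesis's actual skeleton features and distance function are not reproduced here):

```python
# Minimal Dynamic Time Warping (DTW) sketch: compute the alignment cost
# between a teacher's template sequence and a student's attempt.

def dtw_distance(template, query):
    """Return the DTW alignment cost between two numeric sequences."""
    n, m = len(template), len(query)
    INF = float("inf")
    # cost[i][j] = best cost of aligning template[:i] with query[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(template[i - 1] - query[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch template
                                 cost[i][j - 1],      # stretch query
                                 cost[i - 1][j - 1])  # step both
    return cost[n][m]

# A gesture could then be accepted when the cost falls below a threshold.
teacher = [0, 10, 40, 90, 40, 10, 0]     # e.g. an elbow angle over time
student = [0, 12, 38, 85, 88, 42, 9, 0]  # slower, slightly different attempt
print(dtw_distance(teacher, student))
```

The dynamic-programming table makes the cost insensitive to tempo differences, which is why DTW suits comparing a child's slower imitation against a recorded template.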
Keywords (Chinese) ★ Total Physical Response (TPR) teaching
★ Dynamic Time Warping
★ Hidden Markov Model
★ Augmented Reality
★ Digital Learning
Keywords (English) ★ Dynamic Time Warping
★ Hidden Markov Model
★ Digital Learning
★ Augmented Reality
★ Total Physical Response
Table of Contents
Chinese Abstract i
ABSTRACT ii
Acknowledgments iii
Table of Contents iv
List of Figures vii
List of Tables x
Chapter 1. Introduction 1
1-1 Research Motivation 1
1-2 Research Objectives 2
1-3 Thesis Organization 3
Chapter 2. Related Work 4
2-1 Digital Learning 4
2-2 Total Physical Response Teaching 6
2-3 Kinect-Related Applications 7
2-4 Posture-Recognition Applications 9
Chapter 3. Augmented Reality System 13
3-1 System Architecture 13
3-2 Software and Interface 14
3-2-1 Kinect-Side Software 14
3-2-2 Camera-Side Software 17
3-2-3 Game-Side Software 20
3-3 Hardware 26
3-3-1 RF Modules 27
3-3-2 PC Transmitter 30
3-3-3 Mobile Robot Car 32
3-3-4 Infrared LED Board and Camera 34
3-3-5 Fujitsu MB9B506R 35
3-4 Methods and Procedures 39
3-4-1 Gesture-Control Algorithm 39
3-4-2 Car-Localization Algorithm 42
3-5 Experiments 47
3-5-1 Gesture Experiments 47
3-5-2 Controllability Experiments 47
Chapter 4. Robot Interactive Learning System 50
4-1 System Architecture 50
4-2 Software and Interface 51
4-2-1 Scenario-Editing Mode 54
4-2-2 Action-Editing Mode 55
4-2-3 Practice Mode 59
4-2-4 Scenario Mode 60
4-2-5 Record Mode 62
4-3 Hardware 63
4-4 Methods and Procedures 65
4-4-1 Skeleton Data and Preprocessing 65
4-4-2 Dynamic Time Warping Algorithm 75
4-4-4 Hidden Markov Model Algorithm 79
4-4-5 Speech Recognition 83
4-5 Experiments 84
4-5-1 Capture Experiments 85
4-5-3 Dynamic Time Warping and Hidden Markov Model Experiments 87
Chapter 5. Conclusions and Future Work 89
Chapter 6. References 91
Appendix 1. Images of the Actions 96
Advisor: Mu-Chun Su (蘇木春)   Date of Approval: 2012-7-25
