Master's/Doctoral Thesis 100522601: Detailed Record




Author: Batbaatar Battulga (巴圖佳)    Department: Computer Science and Information Engineering
Thesis Title: A Kinect based Assessment System for Smart Classroom
(利用Kinect建置智慧型教室之評量系統)
Related Theses
★ A Grouping Mechanism Based on Social Relationships in edX Discussion Forums
★ A Kinect-based 3D Visualized Facebook Interaction System
★ An Intelligent Metropolitan Route Planning Mechanism for Mobile Device Applications
★ Dynamic Texture Transfer Based on Analysis of Key Motion Correlation
★ A Seam Carving System that Preserves Straight-Line Structures in Images
★ A Community Recommendation Mechanism Built on an Open Online Social Learning Environment
★ System Design of an Interactive Situated Learning Environment for English as a Foreign Language
★ An Emotional Color Transfer Mechanism Based on Skin-Color Preservation
★ A Gesture Recognition Framework for Virtual Keyboards
★ Error Analysis of Fractional-Power Grey Generating Prediction Models and Development of a Computer Toolbox
★ Real-time Human Skeleton Motion Construction Using Inertial Sensors
★ Real-time 3D Modeling Based on Multiple Cameras
★ A Grouping Mechanism for Genetic Algorithms Based on Complementarity and Social Network Analysis
★ A Virtual Musical Instrument Performance System with Real-time Hand Tracking
★ A Real-time Virtual Musical Instrument Performance System Based on Neural Networks
★ A Real-time Hand Tracking System: A Virtual Cello as an Example
  1. This electronic thesis is authorized for immediate open access.
  2. The open-access electronic full text is authorized only for personal, non-profit academic research: searching, reading, and printing.
  3. Please comply with the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract With the advancement of Human-Computer Interaction (HCI), users can now employ body motions such as swiping, pushing, and moving to interact with computers and smartphones without traditional input devices like the mouse and keyboard. The introduction of Microsoft's gesture-based Kinect interface creates promising opportunities for educators to offer students easier and more intuitive ways to interact with learning systems, and integrating Kinect-based applications into the classroom makes students' learning experience more active and enjoyable.
In this context, this master's thesis proposes an assessment system for the Smart Classroom. We design an interactive framework that uses the Microsoft Kinect sensor in a Virtual Learning Environment (VLE), supports QTI-based assessment with new gesture-based question types, and introduces a rich set of gesture commands for practical classroom use. The proposed system was evaluated with teachers and students, and their feedback was collected with a usability questionnaire. The results show that participants are satisfied with the system, find it simple to use, and feel that it provides good functionality and motivates student learning through assessment.
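The abstract describes gesture commands (such as swipes and pushes) that students use to answer and navigate QTI-based questions. As a purely illustrative aid, the sketch below shows one plausible way a horizontal swipe could be recognized from a stream of Kinect hand-joint positions and mapped to answer navigation. It is a minimal sketch under stated assumptions, not the thesis implementation: the SwipeDetector class, its thresholds, and the answer-navigation mapping are all hypothetical.

```python
from collections import deque


class SwipeDetector:
    """Detects a horizontal swipe from recent (timestamp, x) hand-joint samples."""

    def __init__(self, window_seconds=0.5, min_distance=0.25):
        self.window_seconds = window_seconds  # how far back in time to look
        self.min_distance = min_distance      # metres the hand must travel
        self.samples = deque()                # (timestamp, x) pairs

    def update(self, timestamp, hand_x):
        """Feed one skeleton frame; return 'left', 'right', or None."""
        self.samples.append((timestamp, hand_x))
        # Discard samples that fall outside the sliding time window.
        while self.samples and timestamp - self.samples[0][0] > self.window_seconds:
            self.samples.popleft()
        if len(self.samples) < 2:
            return None
        displacement = self.samples[-1][1] - self.samples[0][1]
        if displacement > self.min_distance:
            self.samples.clear()
            return "right"
        if displacement < -self.min_distance:
            self.samples.clear()
            return "left"
        return None


if __name__ == "__main__":
    detector = SwipeDetector()
    # Simulated right-hand x-positions (metres) sampled at 30 fps, moving rightwards.
    for frame, x in enumerate([0.00, 0.05, 0.12, 0.20, 0.30]):
        gesture = detector.update(frame / 30.0, x)
        if gesture == "right":
            print("Swipe right: move to the next answer choice")
        elif gesture == "left":
            print("Swipe left: move to the previous answer choice")
```

In a live system the (timestamp, hand_x) stream would come from the Kinect SDK's tracked hand joint, and a push toward the sensor (a decrease in the joint's depth value) could be mapped in the same way to confirming the selected answer.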
Keywords ★ Kinect
★ Assessment
★ Question and Test Interoperability
★ Interactive
★ Natural User Interface
★ Gesture
★ Course Content
★ Assessment Authoring Tool
Table of Contents
Abstract (Chinese) iii
ABSTRACT iv
LIST OF TABLES vii
LIST OF FIGURES viii
CHAPTER 1: INTRODUCTION 1
1.1. Background and motivation 1
1.2. Proposed solution 2
1.3. Structure of the thesis 2
CHAPTER 2: RELATED WORK 4
2.1. IMS Question and Test Interoperability 4
2.1.1 Specifications of Educational Technology for Assessment 4
2.1.2 Implementation and Experiments 5
2.2. Kinect and Natural User Interface 20
2.2.1 History of Natural User Interface and Kinect 20
2.2.2 Microsoft Kinect 21
2.2.3 Gesture and Speech Recognition 26
2.3. Smart Classroom Cutting-Edge Components 32
2.3.1 Helix 3D Toolkit 33
2.3.2 ASP.NET SignalR 34
2.3.3 EF and LINQ 36
2.4. Summary 37
CHAPTER 3: DESIGN AND IMPLEMENTATION 39
3.1 System Design and Architecture 39
3.1.1 General architecture 42
3.1.2 Gesture based Assessment and Interface 49
3.1.3 Database and Classes Structure 50
3.2 Implementation and experiments 53
3.2.1 Gesture based Interface 53
3.2.2 User Interface 59
3.2.3 Scenario for using our system 65
CHAPTER 4: EXPERIMENT AND EVALUATION 69
4.1 Gesture recognition 69
4.2 System Comparison and Test result 71
4.3 Evaluation users’ satisfaction 73
CHAPTER 5: CONCLUSION AND FUTURE WORK 75
5.1 Conclusion 75
5.2 Future research directions 75
APPENDIX 76
Appendix A. Students Guidelines 76
A.1. Kinect Guidelines [19] 76
A.2 Gesture for Assessment Guidelines 81
References 84
References
[1] QAed, "QAed (Questions & Assessments Editor)," 2003. [Online]. Available: http://gti.upf.edu/leteos/newnavs/qaed.html.
[2] LdAuthor, "ReCourse," 2009. [Online]. Available: http://www.tencompetence.org/ldauthor/.
[3] GTI, "Interactive Technologies Research Group," 2011. [Online]. Available: http://gti.upf.edu.
[4] T. Navarrete, P. Santos, D. Hernández-Leo and J. Blat, "QTIMaps: A model to enable web-maps in assessment," Educational Technology & Society, vol. 14, no. 3, pp. 203-217, 2011.
[5] D. Morillo, P. Santos, D. Perez, M. Ibáñez, C. Delgado and D. Hernández-Leo, "Assessment in 3D Virtual Worlds: QTI in Wonderland," Congreso Iberoamericano de Informática Educativa, vol. 1, pp. 410-417, 2010.
[6] GAST, "GAST research group," 2011. [Online]. Available: http://www.gast.it.uc3m.es/.
[7] Learn3, "Learn3 project," 2011. [Online]. Available: http://gti.upf.edu/learn3/.
[8] S. Spielberg, Director, Minority Report. [Film]. 2002.
[9] G. Roddenberry, Creator, Star Trek. [TV series]. 1966-2005.
[10] J. Shotton, A. Fitzgibbon, M. Cook, T. Sharp, M. Finocchio, R. Moore, A. Kipman and A. Blake, "Real-Time Human Pose Recognition in Parts from Single Depth Images," in Computer Vision and Pattern Recognition, Providence, RI, 2011.
[11] Dryad, "Stanford Virtual Worlds Group," [Online]. Available: http://dryad.stanford.edu/.
[12] Tell Me, "Microsoft Tellme speech innovation," [Online]. Available: http://www.microsoft.com/en-us/tellme/.
[13] Kinect for Windows SDK, "Voice, Movement & Gesture Recognition Technology," [Online]. Available: http://www.microsoft.com/en-us/kinectforwindows/.
[14] J. Webb and J. Ashley, Beginning Kinect Programming with the Microsoft Kinect SDK, Apress, 2012.
[15] J. O. Wobbrock, A. D. Wilson and Y. Li, "Gestures without Libraries, Toolkits or Training: A $1 Recognizer for User Interface Prototypes," User interface software and technology, vol. 7, pp. 159-168, 2007.
[16] D. Catuhe, Programming with the Kinect for Windows Software Development Kit, Microsoft Press, 2012.
[17] H.-C. Yang, Y.-T. Chang and T. K. Shih, "Using AJAX to build an on-line QTI based assessment system," International Conference on Computer Engineering and Applications, pp. 69-74, 2007.
[18] P.-L. Liu, T. Shih and Y.-L. Chen, "Developing QTI compliant assessment platform on digital TV," in IT in Medicine and Education, 2008.
[19] J. R. Lewis, "IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use," International Journal of Human-Computer Interaction, pp. 57-78, 1995.
[20] Microsoft Corporation, Human Interface Guidelines v1.7, 2013.
[21] IMS Global Learning Consortium, "IMS Question & Test Interoperability Specification," 2001-2013. [Online]. Available: http://www.imsglobal.org/question/.
[22] APIS, "Assessment provision through interoperable segments," 2004. [Online]. Available: http://sourceforge.net/projects/apis/.
[23] N. Sclater and R. Cross, "What is... IMS question and test interoperability," 2003. [Online]. Available: http://zope.cetis.ac.uk/lib/media/qtibrief.pdf.
[24] R2Q2, "Rendering and response processing services for QTIv2 questions," 2006. [Online]. Available: http://www.r2q2.ecs.soton.ac.uk/.
[25] JISC, "JISC foundation," 2009. [Online]. Available: http://www.jisc.ac.uk/.
[26] Helix 3D Toolkit, "A collection of custom controls and helper classes for WPF," [Online]. Available: http://helixtoolkit.codeplex.com/.
[27] SignalR, "A library for ASP.NET developers that simplifies adding real-time web functionality," [Online]. Available: http://www.asp.net/signalr.
[28] A. Jana, Kinect for Windows SDK Programming Guide, Packt Publishing Ltd., 2012.
[29] J. M. Aguilar , ASP.NET SignalR, Krasis Consulting, S. L., 2013.
[30] T. C. T. Kuo, R. Shadiev, W.-Y. Hwang and N.-S. Chen, "Effects of applying STR for group learning activities on learning performance in a synchronous cyber classroom," Computers & Education, pp. 600-608, 2012.
[31] Microsoft Kinect SDK, "Voice, Movement & Gesture Recognition Technology," [Online]. Available: http://www.kinectforwindows.org/.
[32] Kinect for Windows Developer Toolkit, 2013. [Online]. Available: http://www.microsoft.com/en-us/kinectforwindows/develop/developer-downloads.aspx.
[33] D. Catuhe, "Kinect Toolbox," [Online]. Available: http://kinecttoolbox.codeplex.com/.
[34] C. Rutkas, B. Peek and D. Fernandez, "Coding4Fun Kinect Toolkit," [Online]. Available: http://c4fkinect.codeplex.com/.
[35] Y. Suo, N. Miyata, H. Morikawa, T. Ishida and Y. Shi, "Open Smart Classroom: Extensible and Scalable Learning System in Smart Space Using Web Service Technology," IEEE Transactions on Knowledge and Data Engineering, pp. 814-828, 2009.
[36] L. A. Tomei, Learning Tools and Teaching Approaches Through ICT Advancements, IGI Global, 2012.
[37] S. B. Eom and J. B. Arbaugh, Student Satisfaction and Learning Outcomes in E-Learning: An Introduction to Empirical Research, Idea Group Inc (IGI), 2011.
[38] C.-Y. Chang, Y.-T. Chien, C.-Y. Chiang, M.-C. Lin and H.-C. Lai, "Embodying gesture-based multimedia to improve learning," British Journal of Educational Technology, pp. E5-E9, 2013.
Advisor: Timothy K. Shih (施國琛)    Date of Approval: 2013-07-11
