Thesis 110524017: Detailed Record




Name: 王天誠 (Tien-Cheng Wang)   Department: Graduate Institute of Network Learning Technology
Thesis Title: 探討智慧回饋如何影響學習時眼動和觸控操作的表現-以 Covid-19 快篩模擬為例
(An investigation of a touch-based eye-tracking system with smart feedback and its influences on learning: Simulation of a Covid-19 rapid test system)
Related Theses
★ Construction and effectiveness evaluation of a synchronized performance robot
★ Investigating the effects of an e-book multimedia annotation system combined with an annotation-sharing mechanism on elementary school students' learning behavior and time
★ A study of the influence of prior knowledge on annotation-based multimedia e-books: from individual to shared environments
★ Facilitating EFL speaking and writing with peer-tutoring and storytelling strategies in authentic learning context
★ An investigation into CKEL-supported EFL learning with TPR to reveal the importance of pronunciation and interactive sentence making
★ Investigation of Facilitating Physics Learning using Ubiquitous-Physics APP with Learning Map and Discussion Board in Authentic Contexts
★ The influence of the smart interactive SmartVpen on English learning in authentic contexts
★ Using collaborative virtualized network design to support computer network learning
★ Investigating the extended collaborative multimedia cognitive theory and its effects on EFL listening and speaking: a guessing game combining kinesthetic recognition and learner-designed content
★ Evaluating smart mechanisms for improving elementary school students' foreign-language speaking and conversational skills in authentic contexts
★ Investigating the effects of teacher feedback on learning cognition and learning persistence in authentic contexts
★ Annotations, conversational agents, and collaborative concept mapping to support university students' argumentative writing and metacognition development
★ Developing and Validating the Questionnaire and Its Model for Sustainable and Scalable Authentic Contextual Learning Supported by Mobile Apps
★ Investigating personalized, contextualized, and socialized smart mechanisms supporting elementary school geometry learning in authentic contexts and their effects on learning achievement
★ Investigation of smart mechanisms for authentic contextual learning with sensor and recognition technologies
★ Combining image and location recognition in authentic contexts to facilitate English writing
Files: full text viewable only within the system (never open access)
Abstract (Chinese) It is essential to investigate how eye and hand behaviors interact when learners use a touch-based simulation system. To improve mastery of the procedural knowledge of Covid-19 rapid testing, this study investigated how a touch-based Covid-19 rapid-test simulation system with a smart feedback mechanism and eye-tracking technology affects learners' performance.
Based on the Technology Acceptance Model and earlier research on simulation-based training and eye tracking in educational settings, this study designed an experiment involving 60 university students. Participants were randomly assigned to two groups: the experimental group used the Covid-19 rapid-test simulation system with a smart feedback mechanism (text and audio, with detailed explanations), while the control group used the simulation system with only basic feedback (text only, indicating merely correct or incorrect).
We collected learners' eye-movement data, manipulation data, and performance measures, and analyzed them with independent-sample t-tests, Pearson correlation, multiple regression, and sequential analysis.
The results show that learners in the experimental group achieved significantly better learning outcomes than the control group and expressed positive attitudes toward the usefulness and ease of use of the Covid-19 simulation system. Combining eye-tracking technology with touch-based manipulation made interaction more intuitive and reduced distraction during operation, while smart feedback based on learners' behaviors strengthened their self-reflection and understanding.
These findings provide valuable insights into how different feedback mechanisms influence learners' behaviors and performance in the Covid-19 simulation system. By guiding learners' attention and providing immediate, detailed feedback, the smart feedback mechanism helps learners make decisions more effectively and accurately.
Abstract (English) Understanding eye movements, together with touch and listening, while learning and practicing with a simulation system is necessary. This study examines the effects of a touch-based simulation system (TBSS) with smart feedback (SF) mechanisms on learners' eye movements and learning behaviors, and their impact on learners' performance and engagement in acquiring procedural knowledge related to Covid-19 rapid testing.
The study involved an experiment with sixty graduate students divided into experimental and control groups: the experimental group used the TBSS with smart feedback (TBSS-SF) for learning and performing tasks, while the control group used the TBSS with basic feedback (BF). The study evaluated participants' acceptance of and learning behaviors with the TBSS, considering factors such as usefulness, ease of use, mobility, accessibility, satisfaction, and intention to use.
The research framework comprised various components, including independent variables such as the type of feedback mechanism utilized, control variables like consistent learning material, and dependent variables encompassing learning behavior and achievement metrics. Data on learners' eye movements, learning behaviors, and performance measures were collected and analyzed using statistical methods such as ANCOVA, independent-sample t-tests, Pearson correlation, and lag sequential analysis.
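The group comparisons described above can be sketched in a few lines; the following is a minimal illustration on synthetic data, not the thesis's actual analysis, and the variable names (`eg_scores`, `cg_scores`, `fixation_time`) are assumptions for demonstration only.

```python
# Sketch of two analyses named in the abstract: an independent-sample
# t-test comparing posttest scores between groups, and a Pearson
# correlation between an eye-movement measure and score. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical posttest scores for experimental (EG) and control (CG) groups.
eg_scores = rng.normal(loc=82, scale=6, size=30)
cg_scores = rng.normal(loc=75, scale=6, size=30)

# Independent-sample t-test: does EG outperform CG?
t_stat, p_value = stats.ttest_ind(eg_scores, cg_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Pearson correlation between a hypothetical total fixation duration
# (seconds) and posttest score within EG.
fixation_time = rng.normal(loc=120, scale=15, size=30)
r, p_corr = stats.pearsonr(fixation_time, eg_scores)
print(f"r = {r:.2f}, p = {p_corr:.4f}")
```

With synthetic group means of 82 and 75, the t-test reports a significant group difference, mirroring the kind of comparison the study describes.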
The research found that smart feedback improved learning outcomes. Learners in the experimental group (EG), which used the simulation system with smart feedback, demonstrated higher performance than those in the control group (CG). This suggests that smart feedback can effectively guide learning, enhance comprehension, and improve performance. The study also indicates that smart feedback can lead to more efficient task completion.
The participants expressed positive attitudes regarding the usefulness and ease of use of the Covid-19 simulation system. They found the system to be beneficial in their learning process and reported a high level of satisfaction with its functionality. These findings provide valuable insights into how different feedback mechanisms can influence user behavior and performance in the Covid-19 simulation system. The smart feedback mechanism seems to facilitate a more efficient and accurate decision-making process by guiding the user's attention and providing immediate, detailed feedback.
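The lag sequential analysis mentioned among the methods can be sketched as a lag-1 transition count followed by adjusted residuals (z-scores) that flag transitions occurring more often than chance. The behavior codes below (`F` = fixate, `T` = touch, `R` = read feedback) are hypothetical, not the thesis's actual coding scheme.

```python
# Sketch of lag-1 sequential analysis: count transitions between coded
# behaviors, then compute adjusted residuals; z > 1.96 marks a
# transition significantly more frequent than chance.
import numpy as np

def adjusted_residuals(sequence, codes):
    """Return the lag-1 transition count matrix and adjusted residuals."""
    k = len(codes)
    index = {c: i for i, c in enumerate(codes)}
    counts = np.zeros((k, k))
    for a, b in zip(sequence, sequence[1:]):
        counts[index[a], index[b]] += 1
    n = counts.sum()
    row = counts.sum(axis=1, keepdims=True)  # row totals (k, 1)
    col = counts.sum(axis=0, keepdims=True)  # column totals (1, k)
    expected = row @ col / n                 # expected counts under independence
    # Adjusted residuals (Bakeman & Gottman style z-scores).
    z = (counts - expected) / np.sqrt(expected * (1 - row / n) * (1 - col / n))
    return counts, z

# Hypothetical coded behavior stream for one learner session.
stream = list("FTRFTRFTFTRFRFTR")
counts, z = adjusted_residuals(stream, ["F", "T", "R"])
print(counts)
print(z.round(2))
```

In this toy stream the fixate-then-touch transition (F→T) comes out significant, the kind of hand-eye pattern the study's sequential analysis is designed to surface.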
Keywords (Chinese)
★ 眼動追蹤 (Eye-tracking)
★ 觸控式學習 (Touch-based learning)
★ 智慧回饋 (Smart feedback)
★ 手眼協調 (Hand-eye coordination)
Keywords (English)
★ Eye-tracking
★ Touch-based learning
★ Smart feedback mechanism
★ Hand-eye coordination
List of Contents
Abstract (Chinese)
Abstract
Chapter 1 Introduction
    1.1 Research Background and Motivation
    1.2 Theoretical Support
        1.2.1 Extended cognitive theory of multimedia learning
        1.2.2 Human-computer interaction
    1.3 Research Question
Chapter 2 Literature Review
    2.1 Eye-tracking in Learning with Simulation Systems
    2.2 Eye-tracking with Touch-based Manipulation for Procedural Knowledge
    2.3 The Effect of Smart Feedback on Eye-tracking with Touch-based Learning
Chapter 3 System Design and Implementation
    3.1 Touch-based Covid-19 Simulation System
    3.2 Eye-tracking System
Chapter 4 Methodology
    4.1 Participants
    4.2 Research Framework
        4.2.1 Independent Variables
        4.2.2 Control Variables
        4.2.3 Dependent Variables
    4.3 Experimental Procedure
    4.4 The Difference between Smart Feedback and Basic Feedback
    4.5 Data Analysis Approach
Chapter 5 Results Analysis and Discussion
    5.1 Learning Achievements between the Two Groups
    5.2 Comparison of Learning Behaviors between Groups
    5.3 Relationship between Learning Behaviors and Learning Achievements for EG
    5.4 Prediction of the Dependent Variables on Total Manipulation Time
    5.5 Comparison of Eye-tracking Analysis between the Two Groups
        5.5.1 Heat map
        5.5.2 Lag sequential analysis of gaze plot
    5.6 Comparison of Touch Plots between the Two Groups
    5.7 The Interaction of Eye Movement and Manipulation
    5.8 Perception of Learners toward the Proposed System
    5.9 Suggestions and Implications
Chapter 6 Conclusion
References
Appendix A: Pretest
Appendix B: Posttest
Appendix C: Questionnaire
References
Land, M., Mennie, N., & Rusted, J. (1999). The roles of vision and eye movements in the control of activities of daily living. Perception, 28(11), 1311-1328.
Desmurget, M., Pélisson, D., Rossetti, Y., & Prablanc, C. (1998). From eye to hand: planning goal-directed movements. Neuroscience & Biobehavioral Reviews, 22(6), 761-788.
Ballard, D. H., Hayhoe, M. M., Li, F., & Whitehead, S. D. (1992). Hand-eye coordination during sequential tasks. Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences, 337(1281), 331-339.
Lee, H. W. (2015). Does touch-based interaction in learning with interactive images improve students’ learning? The Asia-Pacific Education Researcher, 24, 731-735.
Paek, S., Hoffman, D. L., & Black, J. B. (2013, June). Using touch-based input to promote student math learning in a multimedia learning environment. In EdMedia+ Innovate Learning (pp. 314-323). Association for the Advancement of Computing in Education (AACE).
Yağmur, S., & Çakır, M. P. (2016). Usability evaluation of a dynamic geometry software mobile interface through eye tracking. In Learning and Collaboration Technologies: Third International Conference, LCT 2016, Held as Part of HCI International 2016, Toronto, ON, Canada, July 17-22, 2016, Proceedings 3 (pp. 391-402). Springer International Publishing.
Hwang, W. Y., Manabe, K., & Huang, T. H. (2023). Collaborative guessing game for EFL learning with kinesthetic recognition. Thinking Skills and Creativity, 48, 101297.
Lankes, M., & Stiglbauer, B. (2016). GazeAR: Mobile gaze-based interaction in the context of augmented reality games. In Augmented Reality, Virtual Reality, and Computer Graphics: Third International Conference, AVR 2016, Lecce, Italy, June 15-18, 2016. Proceedings, Part I 3 (pp. 397-406). Springer International Publishing.
Shute, V. J. (2008). Focus on formative feedback. Review of educational research, 78(1), 153-189.
Cutumisu, M., Turgeon, K. L., Saiyera, T., Chuong, S., González Esparza, L. M., MacDonald, R., & Kokhan, V. (2019). Eye tracking the feedback assigned to undergraduate students in a digital assessment game. Frontiers in psychology, 10, 1931.
Conceição, W., Holanda, M., Macedo, F., Ishikawa, E., Nunes, V. T., & Da Silva, D. (2022, October). Automatic feedback in the teaching of programming in undergraduate courses: A literature mapping. In 2022 IEEE Frontiers in Education Conference (FIE) (pp. 1-9). IEEE.
Duchowski, A. T. (2017). Eye tracking methodology: Theory and practice. Springer.
Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye tracking: A comprehensive guide to methods and measures. OUP Oxford.
Godfroid, A., Boers, F., & Housen, A. (2013). An eye for words: Gauging the role of attention in incidental L2 vocabulary acquisition by means of eye-tracking. Studies in Second Language Acquisition, 35(3), 483-517.
Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372.
Jarodzka, H., Scheiter, K., Gerjets, P., & Van Gog, T. (2010). In the eyes of the beholder: How experts and novices interpret dynamic stimuli. Learning and Instruction, 20(2), 146-154.
Grant, E. R., & Spivey, M. J. (2003). Eye movements and problem solving: Guiding attention guides thought. Psychological Science, 14(5), 462-466.
Sun, J. C. Y., & Hsu, K. Y. C. (2019). A smart eye-tracking feedback scaffolding approach to improving students′ learning self-efficacy and performance in a C programming course. Computers in Human Behavior, 95, 66-72.
Yağmur, S. (2014). Usability evaluation of dynamic geometry software through eye tracking and communication breakdown analysis (Master′s thesis, Middle East Technical University).
Johansson, R. S., Westling, G., Bäckström, A., & Flanagan, J. R. (2001). Eye–hand coordination in object manipulation. Journal of neuroscience, 21(17), 6917-6932.
Bogossian, F. E., Cooper, S. J., Cant, R., Porter, J., Forbes, H., & FIRST2ACTTM Research Team. (2015). A trial of e-simulation of sudden patient deterioration (FIRST2ACT WEBTM) on student learning. Nurse education today, 35(10), e36-e42.
Van der Kleij, F. M., Feskens, R. C., & Eggen, T. J. (2015). Effects of feedback in a computer-based learning environment on students’ learning outcomes: A meta-analysis. Review of educational research, 85(4), 475-511.
Ma, J., Li, J., & Gong, Z. (2022). Evaluation of driver distraction from in-vehicle information systems: A simulator study of interaction modes and secondary tasks classes on eight production cars. International Journal of Industrial Ergonomics, 92, 103380.
Graesser, A. C., Lu, S., Jackson, G. T., Mitchell, H. H., Ventura, M., Olney, A., & Louwerse, M. M. (2004). AutoTutor: A tutor with dialogue in natural language. Behavior Research Methods, Instruments, & Computers, 36, 180-192.
Oviatt, S. (2006, October). Human-centered design meets cognitive load theory: Designing interfaces that help people think. In Proceedings of the 14th ACM International Conference on Multimedia (pp. 871-880).
Khor, W. S., Baker, B., Amin, K., Chan, A., Patel, K., & Wong, J. (2016). Augmented and virtual reality in surgery—the digital surgical environment: applications, limitations and legal pitfalls. Annals of Translational Medicine, 4(23).
Wu, P. H., Hwang, G. J., Milrad, M., Ke, H. R., & Huang, Y. M. (2012). An innovative concept map approach for improving students' learning performance with an instant feedback mechanism. British Journal of Educational Technology, 43(2), 217-232.
Issa, T., & Isaias, P. (2022). Usability and human–computer interaction (HCI). In Sustainable Design: HCI, Usability and Environmental Concerns (pp. 23-40). London: Springer London.
Jingjing, L., & Qinglong, Z. (2010, October). Design of model for activity-centered web learning and user experience. In 2010 International Conference on Artificial Intelligence and Education (ICAIE) (pp. 301-304). IEEE.
Hollender, N., Hofmann, C., Deneke, M., & Schmitz, B. (2010). Integrating cognitive load theory and concepts of human–computer interaction. Computers in Human Behavior, 26(6), 1278-1288.
Lohse, K. R., Boyd, L. A., & Hodges, N. J. (2016). Engaging environments enhance motor skill learning in a computer gaming task. Journal of Motor Behavior, 48(2), 172-182.
Wiebe, E. N., Minogue, J., Jones, M. G., Cowley, J., & Krebs, D. (2009). Haptic feedback and students' learning about levers: Unraveling the effect of simulated touch. Computers & Education, 53(3), 667-676.
Jin, X., Zhao, Y., Bian, H., Li, J., & Xu, C. (2023). Sunflower seeds classification based on self-attention Focusing algorithm. Journal of Food Measurement and Characterization, 17(1), 143-154.
Conati, C., Jaques, N., & Muir, M. (2013). Understanding attention to adaptive hints in educational games: an eye-tracking study. International Journal of Artificial Intelligence in Education, 23, 136-161.
Holzinger, A. (2002, October). Finger instead of mouse: touch screens as a means of enhancing universal access. In ERCIM Workshop on User Interfaces for all (pp.387-397). Berlin, Heidelberg: Springer Berlin Heidelberg.
Advisor: 黃武元 (Wu-Yuin Hwang)   Approval Date: 2023-7-8
