博碩士論文 107522049 詳細資訊




姓名 王政鑫(Cheng-Hsin Wang)   畢業系所 資訊工程學系
論文名稱 探討聊天機器人輔助程式教學系統與問題解決教學融入STEM程式課程的學習感受、學習行為與學習成效之研究
(Study on Learning Perceptions, Behaviors, and Effectiveness of Chatbot Assisted Programming Teaching System with Problem-Solving Teaching in STEM Programming Courses)
相關論文
★ 觀看LINE平台上教學影片行為模式對學生的使用偏好、學習動機及學習成效之影響:以網路程式課程為例
★ 探討延展實境與運算思維融入數學與化學課程的學習感受、學習行為與學習成效之研究
檔案 至系統瀏覽論文(永不開放)
摘要(中) STEM程式設計已成為國內大專院校的必修課程,而問題解決教學法適合用於培養學生的計算思維能力。本研究將聊天機器人輔助程式教學系統與問題解決教學法融入STEM程式課程以提升學習效益,並探討學生的學習成效、學習感受以及學習行為之間的差異。
實驗對象為北部大專院校學生,實驗主題為程式設計課程。實驗活動結束後,蒐集測驗、問卷、訪談以及系統平台操作紀錄等數據。依據前測成績將受試者分為三組,比較各組學習成效差異,並分析學習感受與學習行為間的相關性。接著,使用機器學習技術,以學習感受與學習行為分析答題驗證的結果。
實驗結果發現,大部分學生對程式設計已具備一定程度的背景知識;部分學生注重實作能力而不在意理論知識,因此成績提升有限。低先備知識組與高先備知識組的認知過程相似。學習感受問卷的成就、組織、參考等向度與學習行為有顯著相關:成就向度分數高者會重複觀看題目並提交程式碼驗證,組織向度分數高者會嘗試不同解題方法,參考向度分數高者在課程隨堂測驗表現較好。在本研究中,監督式學習以SVM表現最好,其F1 score為0.758;僅使用與學習感受問卷向度相關性高的特徵進行分類後,F1 score為0.74,這表示可以藉由去除相關性較低的特徵實現資料降維,進而找出適合STEM程式課程的教育資料探勘方法。
摘要(英) STEM programming education has become a required course in universities in Taiwan, and problem-solving instructional methods are suitable for cultivating students' computational thinking. This study integrates a chatbot-assisted programming teaching system and problem-solving instruction into a STEM programming course to enhance learning performance, and investigates the differences in students' learning effectiveness, perceptions, and behaviors.
The participants were college students in northern Taiwan, and the experimental topic was a STEM programming course. After the experimental activities, we collected data from tests, questionnaires, interviews, and system operation logs. Based on the pre-test, the students were divided into three groups to compare differences in learning effectiveness, and the correlations between learning perceptions and learning behaviors were analyzed. We then applied machine learning techniques to the learning perception and learning behavior data to classify the results of answer verification.
The results show that most students had some background knowledge of programming. Some students emphasized practical skills and paid little attention to theoretical knowledge, so their improvement in performance was limited. The cognitive processes of the low and high prior-knowledge groups were similar. In the learning perception questionnaire, the effort, organization, and reference dimensions were significantly correlated with learning behaviors: students with high effort scores repeatedly revisited the problems and submitted their code for validation, students with high organization scores tried different solutions, and students with high reference scores performed better on the in-class quizzes. Among all machine learning algorithms in this study, SVM performed best, with an F1 score of 0.758. When only the features highly correlated with the learning perception questionnaire dimensions were used for classification, the F1 score was 0.74, indicating that dimensionality can be reduced by removing weakly correlated features. Finally, we identify the best-performing educational data mining approach for STEM programming courses.
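The classification step summarized above (an SVM trained on weekly behavior-log features, evaluated with the F1 score, with dimensionality reduced by dropping weakly correlated features) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the author's implementation: the file name, feature columns, label name, correlation cutoff, and train/test split are hypothetical, and the study itself filters features by their correlation with the perception-questionnaire dimensions rather than with the outcome label.

```python
# Illustrative sketch only: the thesis does not publish its code or preprocessing,
# so the file layout, column names, 0.1 cutoff, and 80/20 split are assumptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import f1_score

# Assumed layout: one row per student per week, numeric behavior features
# (e.g. problem views, submission counts) plus a binary "verified" label.
logs = pd.read_csv("behavior_logs.csv")
X, y = logs.drop(columns=["verified"]), logs["verified"]

# Simplified stand-in for the feature-selection idea: keep only features whose
# absolute correlation with the outcome exceeds a cutoff (the thesis instead keeps
# features strongly correlated with the perception-questionnaire dimensions).
corr = X.corrwith(y).abs()
X = X[corr[corr >= 0.1].index]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Standardize features, train an SVM classifier, and report the F1 score,
# the evaluation metric used in the study (reported F1 = 0.758 for SVM).
scaler = StandardScaler().fit(X_train)
clf = SVC(kernel="rbf").fit(scaler.transform(X_train), y_train)
pred = clf.predict(scaler.transform(X_test))
print("F1 score:", f1_score(y_test, pred))
```

Standardizing features before an RBF-kernel SVM and stratifying the split are generic good practice for small, possibly imbalanced educational datasets; the thesis does not state whether either step was used.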
關鍵字(中) ★ 聊天機器人
★ STEM
★ 問題解決教學方法
★ 學習行為
★ 學習成效
★ 學習感受
關鍵字(英) ★ Chatbot
★ STEM
★ problem-solving teaching
★ learning behavior
★ learning effectiveness
★ learning perceptions
論文目次
摘  要 i
Abstract ii
誌謝 iv
目  錄 v
圖目錄 viii
表目錄 ix
第一章 緒論 1
1.1 研究背景 1
1.2 目的 2
1.3 論文架構 2
第二章 文獻探討 3
2.1 程式教育 3
2.2 STEM程式教育 5
2.3 問題解決教學融入程式教育 6
2.4 聊天機器人融入程式教育 8
2.5 教學評量 9
2.6 教育資料探勘 11
2.7 研究問題 13
第三章 系統設計 14
3.1 問題解決程式教學平台系統設計 14
3.2 問題解決程式教學平台系統結合現有程式教學系統 17
3.3 實作課程教學平台系統開發 19
3.4 LINE程式教學機器人系統開發 21
3.5 問題解決程式教學平台系統資料庫設計 22
第四章 研究方法 23
4.1 實驗對象 23
4.2 實驗教材 23
4.3 實驗程序 25
4.4 實驗工具 27
4.4.1 測驗試卷 27
4.4.2 STEM學習感受問卷 28
4.4.3 實作課程教學平台系統使用感受問卷 29
4.4.4 LINE程式教學機器人系統使用感受問卷 30
4.4.5 課程分組訪談 30
4.5 數據搜整編碼 31
4.6 學習行為數據分析 32
4.6.1 監督式學習 33
4.6.2 非監督式學習 35
4.6.3 機器學習分類評估指標 36
第五章 研究結果 38
5.1 學生先備知識學習成效分析 38
5.2 LINE平台與實作課程平台的使用感受差異 41
5.3 學生學習行為與學習感受 42
5.3.1 學習感受問卷描述性統計 43
5.3.2 ITSA提交次數統計 44
5.3.3 學習行為紀錄與學習感受問卷間的關聯性 45
5.4 學習行為紀錄訓練集與測試集的分割比例 47
5.4.1 Week1學習行為紀錄數據訓練集與測試集的分割比例 47
5.4.2 Week2學習行為紀錄數據訓練集與測試集的分割比例 49
5.4.3 Week3學習行為紀錄數據訓練集與測試集的分割比例 51
5.4.4 Week4學習行為紀錄數據訓練集與測試集的分割比例 53
5.4.5 Week5學習行為紀錄數據訓練集與測試集的分割比例 55
5.5 機器學習分類結果 57
5.6 訪談結果 61
第六章 結論與建議 62
6.1 結論與討論 62
6.2 建議 64
參考文獻 65
英文部分 65
中文部分 72
附錄一、問題解決程式教學平台系統資料庫表格 74
附錄二、研究實驗同意書 81
附錄三、資料型態與輸入輸出實作課程教材 82
附錄四、遞迴與函式實作課程教材 84
附錄五、指標與動態記憶體實作課程教材 87
附錄六、動態規劃實作課程教材 89
附錄七、類別繼承實作課程教材 91
附錄八、學習成效前後測 93
附錄九、STEM學習感受問卷 95
附錄十、實作課程教學平台系統使用感受問卷 97
附錄十一、LINE 程式教學機器人使用感受問卷 98
參考文獻 Australian Curriculum, Assessment and Reporting Authority [ACARA]. (2013). Draft Australian Curriculum: Technologies. Retrieved May 27, 2020, from https://docs.acara.edu.au/resources/Draft_Australian_Curriculum_Technologies_-_Consultation_Report_-_August_2013.pdf.
Achilleos, A. P., Mettouris, C., Yeratziotis, A., Papadopoulos, G. A., Pllana, S., Huber, F., Jäger, B., Leitner, P., Ocsovszky, Z., & Dinnyés, A. (2019). SciChallenge: A social media aware platform for contest-based STEM education and motivation of young students. IEEE Transactions on Learning Technologies, 12(1), 98-111. doi: 10.1109/TLT.2018.2810879
All, A., Plovie, B., Castellar, E. P. N., & Van Looy, J. (2017). Pre-test influences on the effectiveness of digital-game based learning: A case study of a fire safety game. Computers & Education, 114, 24-37.
Araque, F., Roldan, C., & Salguero, A. (2009). Factors influencing university drop out rates. Computers & Education, 53, 563-574. doi: 10.1016/j.compedu.2009.03.013
Baker, R. (2012). Data mining for education. In B. Mcgaw, P. Peterson & E. Baker (Eds.), International encyclopedia of education (3rd ed.), UK: Oxford.
Bransford, J. D., & Stein, B. S. (1993). The ideal problem solver: A guide for improving thinking, learning, and creativity (2nd ed.). New York: W.H. Freeman.
Chatisfy. (2020). Chatisfy. Retrieved May 27, 2020, from https://www.chatisfy.com/
Computer Science Teachers Association [CSTA]. (2017). K-12 Computer Science Standards. Retrieved May 27, 2020, from https://www.doe.k12.de.us/cms/lib/DE01922744/Centricity/Domain/176/CSTA%20Computer%20Science%20Standards%20Revised%202017.pdf
Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika 16, 297–334. doi: 10.1007/BF02310555
Costelloe, E. (2004). Teaching programming the state of the art. Retrieved May 27, 2020, from https://www.scss.tcd.ie/disciplines/information_systems/crite/crite_web/publications/sources/programmingv1.pdf
Cybenko, G. (1989). Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals, and Systems, 2(4), 303-314.
Calvo, I., Cabanes, I., Quesada, J., & Barambones, O. (2018). A multidisciplinary PBL approach for teaching industrial informatics and robotics in engineering. IEEE Transactions on Education, 61(1), 21-28. doi: 10.1109/TE.2017.2721907
Chen, S.-Y., Lai, C.-F., Lai, Y.-H., & Su, Y.-S. (Accepted, 2019). Effect of project-based learning on development of students' creative thinking. International Journal of Electrical Engineering Education. doi: 10.1177/0020720919846808
Comaniciu, D., & Meer, P. (2002). Mean shift: A robust approach toward feature space analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(5), 603-619. doi: 10.1109/34.1000236
Chonkaew, P., Sukhummek, B., & Faikhamta, C. (2016). Development of analytical thinking ability and attitudes towards science learning of grade-11 students through science technology engineering and mathematics (STEM education) in the study of stoichiometry. Chemistry Education Research and Practice, 17(4), 842-861. doi: 10.1039/C6RP00074F
Cortes, C. & Vapnik, V. (1995). Support-vector networks. Machine Learning, 20(3), 273-297. doi: 10.1007/BF00994018.
Chen, H.-L., Widarso, G., & Sutrisno, H. (2020). A ChatBot for learning Chinese: Learning achievement and technology acceptance. Journal of Educational Computing Research, 58(6), 1161-1189. doi: 10.1177/0735633120929622
dos Santos, S.C. (2017). PBL-SEE: An authentic assessment model for PBL-based software engineering education. IEEE Transactions on Education, 60(2), 120-126. doi: 10.1109/TE.2016.2604227
Díez, L. F., Valencia, A., & Bermudez, J. (2017). Agent-based model for the analysis of technological acceptance of mobile learning. IEEE Latin America Transactions, 15(6), 1121-1127. doi: 10.1109/TLA.2017.7932700
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: a comparison of two theoretical models. Management science, 35(8), 982-1003.
Eckel, B., Allison, C. D., & Allison, C. (2003). Thinking in C++, Vol. 2: Practical programming (2nd ed.). Upper Saddle River, NJ: Prentice Hall
Gomoll, A., Hmelo-Silver, C. E., Sabanovic, S., & Francisco, M. (2016). Dragons, ladybugs, and softballs: Girls′ STEM engagement with human-centered robotics. Journal of Science Education and Technology, 25(6), 899-914. doi: 10.1007/s10956-016-9647-z
Gao, S., Moe, S. P., & Krogstie, J. (2010). An empirical test of the mobile services acceptance model. Proceedings of the 2010 Ninth International Conference on Mobile Business and 2010 Ninth Global Mobility Roundtable (ICMB-GMR), 168-175, doi: 10.1109/ICMB-GMR.2010.51.
Grover, S., & Pea, R. (2013). Computational thinking in K–12: A review of the state of the field. Educational researcher, 42(1), 38-43.
Hung, Y.-C. (2008). The effect of problem-solving instruction on computer engineering majors' performance in Verilog programming. IEEE Transactions on Education, 51(1), 131-137. doi: 10.1109/TE.2007.906912
Hou, H.-T. (2015). Integrating cluster and sequential analysis to explore learners' flow and behavioral patterns in a simulation game with situated-learning context for science courses: A video-based process exploration. Computers in Human Behavior, 48, 424-435.
Hsieh, J. S. C., Huang, Y. -M., & Wu, W. -C. V. (2017). Technological acceptance of LINE in flipped EFL oral training. Computers in Human Behavior, 70, 178-190.
Huang, A.Y. Q., Lu, O. H. T., Huang, J. C. H., Yin, C. J., & Yang, S. J. H. (2020). Predicting students’ academic performance by using educational big data and learning analytics: evaluation of classification methods and learning logs. Interactive Learning Environments, 28(2), 206-230. doi: 10.1080/10494820.2019.1636086
Hanley, J. A., & McNeil, B. J. (1983). A method of comparing the areas under receiver operating characteristic curves derived from the same cases. Radiology, 148(3), 839-843. doi: 10.1148/radiology.148.3.6878708
Hadad, R., Thomas, K., & Kachovska, M. (2020). Practicing formative assessment for Computational thinking in making Environments. J Sci Educ Technol, 29, 162-173. doi: 10.1007/s10956-019-09796-6
Hämäläinen, W., & Vinni, M. (2011). Classifiers for educational data mining. London: Chapman & Hall/CRC. doi: 10.1201/b10274-7
International Society for Technology in Education [ISTE]. (2010). Exploring computational thinking. Retrieved May 27, 2020, from https://www.google.com/edu/computational-thinking/
Klingsieck, K. B., Fries, S., Horz, C., & Hofer, M. (2012). Procrastination in a distance university setting. Distance Education, 33(3), 295-310. doi: 10.1080/01587919.2012.723165
Landis, J., & Koch, G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159-174. doi: 10.2307/2529310
Lye, S. Y., & Koh, J. H. L. (2014). Review on teaching and learning of computational thinking through programming: What is next for K-12? Computers in Human Behavior, 41, 51-61. doi: 10.1016/j.chb.2014.09.012
LaForce, M., Noble, E., & Blackwell, C. K. (2017). Problem-based learning (PBL) and student interest in STEM careers: The roles of motivation and ability beliefs. Education Sciences, 7(4), 92. doi: 10.3390/educsci7040092
Marin, S. L. T., Garcia, F. J. B., Torres, R. M., Vazquez, S. G., & Moreno, A. J. L. (2005). Implementation of a web-based educational tool for digital signal processing teaching using the technological acceptance model. IEEE Transactions on Education, 48(4), 632-641, doi: 10.1109/TE.2005.853074.
Martínez-Torres, M. R., Marín, S. L. T., García, F. B., Vázquez, S. G., Oliva, M. A., & Torres, T. (2008). A technological acceptance of e-learning tools used in practical and laboratory teaching, according to the European higher education area, Behaviour & Information Technology, 27(6), 495-505, doi: 10.1080/01449290600958965
Mannila, L., Peltomäki, M., & Salakoski, T. (2006). What about a simple language? Analyzing the difficulties in learning to program. Computer science education, 16(3), 211-227.
Manasijevic, D., Zivkovic, D., Arsic, S., & Milošević, I. (2016). Exploring students’ purposes of usage and educational usage of Facebook. Computers in Human Behavior, 60, 441-450.
Newhouse, C. (2017). STEM the boredom: Engage students in the Australian curriculum using ICT with problem-based learning and assessment. Journal of Science Education and Technology, 26(1), 44-57. doi: 10.1007/s10956-016-9650-4
Osmanbegović, E., Suljic, M., & Agić, H. (2015). Determining dominant factors for students performance prediction by using data mining classification algorithms. Tranzicija, 16, 147-158.
Rousseeuw, P. J. (1987). Silhouettes: A graphical aid to the interpretation and validation of cluster analysis. Journal of Computational and Applied Mathematics, 20, 53-65. doi: 10.1016/0377-0427(87)90125-7
Parmar, D., Babu, S. V., Lin, L., Jörg, S., D'Souza, N., Leonard, A. E., & Daily, S. B. (2016). Can embodied interaction and virtual peer customization in a virtual programming environment enhance computational thinking? Proceedings of the Research on Equity and Sustained Participation in Engineering, Computing, and Technology, 1-2. doi: 10.1109/RESPECT.2016.7836179
Pekrun, R., Goetz, T., Perry, R. P., Kramer, K., Hochstadt, M., & Molfenter, S. (2004). Beyond test anxiety: Development and validation of the test emotions questionnaire (TEQ). Anxiety, Stress & Coping, 17(3), 287-316. doi: 10.1080/10615800412331303847
Powers, K., Ecott, S., & Hirshfield, L. M. (2007). Through the looking glass: Teaching CS0 with Alice. Proceedings of the 38th SIGCSE technical symposium on Computer science education, 213-217.
Rietz, T., Benke, I., & Maedche, A. (2019). The impact of anthropomorphic and functional chatbot design features in enterprise collaboration systems on user acceptance. Proceedings of the 14th International Conference on Wirtschaftsinformatik, 1642-1656.
Romero, C., Espejo, P., Romero, R., & Ventura, S. (2013). Web usage mining for predicting final marks of students that use Moodle courses. Computer Applications in Engineering Education, 21(1), 135-146. doi: 10.1002/cae.20456
Romero, C., & Ventura, S. (2010). Educational data mining: A review of the state of the art. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 40(6), 601-618. doi: 10.1109/TSMCC.2010.2053532
Romero, C., & Ventura, S. (2013). Data mining in education. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 3(1), 12-27. doi: 10.1002/widm.1075
Strano, M., & Colosimo, B.M. (2006). Logistic regression analysis for experimental determination of forming limit diagrams. International Journal of Machine Tools and Manufacture, 46(6), 673-682. doi: 10.1016/j.ijmachtools.2005.07.005
Su, Y. -S., Ding, T. -J., & Lai, C. -F. (2017). Analysis of students engagement and learning performance in a social community supported computer programming course. Eurasia Journal of Mathematics, Science & Technology Education, 13(9), 6189-6201. doi: 10.12973/eurasia.2017.01058a
Stigler, J., Geller, E., & Givvin, K. (2015). Zaption: A platform to support teaching, and learning about teaching, with video. Journal of E-Learning and Knowledge Society, 11(2), 13-25. doi: 10.20368/1971-8829/1042
Sánchez, R. A., Hueros, A. D., & Ordaz, M. O. (2013). E-learning and the University of Huelva: A study of WebCT and the technological acceptance model. Campus-Wide Information Systems, 30(2), 135-160. doi: 10.1108/10650741311306318
Shute, V. J., Sun, C., & Asbell-Clarke, J. (2017). Demystifying computational thinking. Educational Research Review, 22, 142-158.
Su, Y.S., Yang, J.H., Hwang, W.Y., Huang, S.J., & Tern, M.Y. (2014). Investigating the role of computer-supported annotation in problem solving based teaching: An empirical study of a scratch programming pedagogy. British Journal of Educational Technology, 45(4), 647-665. doi: 10.1111/bjet.12058
Tang, K.Y., Hsiao, C.H., & Su, Y.S. (2019). Networking for educational innovations: A bibliometric survey of international publication patterns. Sustainability, 11(17), 4608. doi: 10.3390/su11174608
Tang, K., Chou, T., & Tsai, C. (2020). A content analysis of computational thinking research: An international publication trends and research typology. The Asia-Pacific Education Researcher, 29, 9-19. doi: 10.1007/s40299-019-00442-8
Tseng, K.-H., Chang, C.-H., Lou, S.-J., & Chen, W.-P. (2013). Attitudes towards science, technology, engineering and mathematics (STEM) in a project-based learning (PjBL) environment. International Journal of Technology and Design Education, 23(1), 87-102. doi: 10.1007/s10798-011-9160-x
Uysal, M.P. (2014). Improving first computer programming experiences: The case of adapting a web-supported and well-structured problem-solving method to a traditional course. Contemporary Educational Technology, 5(3), 198-217.
Vattani, A. (2011). K-means requires exponentially many iterations even in the plane. Discrete and Computational Geometry, 45(4), 596-616. doi: 10.1007/s00454-011-9340-1
Van Rijsbergen, C. J. (1979). Information retrieval (2nd ed.). London: Butterworths
Voogt, J., Fisser, P., Good, J., Mishra, P., & Yadav, A. (2015). Computational thinking in compulsory education: Towards an agenda for research and practice. Education and Information Technologies, 20(4), 715-728.
Wing, J. M. (2006, March). Computational thinking. Communications of the ACM, 49(3), 33-35.
Wit.ai (2020). Wit.ai. Retrieved May 27, 2020, from https://wit.ai/
Wild, K. -P., & Schiefele, U. (1994). Lernstrategien im studium: Ergebnisse zur faktorenstruktur und reliabilität eines neuen fragebogens. Zeitschrift für Differentielle und Diagnostische Psychologie, 15(4), 185-200.
Wahyu, Y., Suastra, I. W., Sadia, I. W., & Suarni, N. K. (2020). The effectiveness of mobile augmented reality assisted STEM-based learning on scientific literacy and students’ achievement. International Journal of Instruction, 13(3), 343-356. doi: 10.29333/iji.2020.13324a
中文部分
大學程式能力檢定委員會 (民 99)。大學程式能力檢定。(民109年5月27日),取自https://cpe.cse.nsysu.edu.tw/index.php
邦妮科技 (民 109)。BotBonnie。(民109年5月27日),取自https://botbonnie.com/
林坤誼 (民 103)。STEM 科際整合教育培養整合理論與實務的科技人才。科技與人力教育季刊,1(1),1。
教育部 (民 92)。創造力教育政策白皮書。(民109年5月27日),取自https://ws.moe.edu.tw/001/Upload/3/RelFile/6315/6934/92.03%E5%89%B5%E9%80%A0%E5%8A%9B%E6%95%99%E8%82%B2%E7%99%BD%E7%9A%AE%E6%9B%B8.pdf
教育部智慧創新跨域人才培育計畫 (民 108)。大學程式設計先修檢測。(民109年5月27日),取自https://apcs.csie.ntnu.edu.tw/index.php/apcs-introduction/
教育部 (民 108)。運算思維。(民109年5月27日),取自https://ossacc.moe.edu.tw/computational.php
許宜婷 (民 104)。科技教育教學內容之探討。科技與人力教育季刊,2(2),16-29。
簡紅珠 (民 99)。講述教學法。(民109年5月27日),取自http://terms.naer.edu.tw/detail/1315014/
楊孟山、林宜玄 (民 107)。Maker 教育理論與實踐。臺灣教育評論月刊,7(2),29-38。
葉俊巖、羅希哲(民 104)。以Maker的角度來看臺灣小學的資訊教育。臺灣教育評論月刊,4(12),110-114。
指導教授 蘇育生(Yu-Sheng Su) 審核日期 2020-8-20