Thesis 112524601: Detailed Record




Name: Mahendra Astu Sanggha Pawitra (馬恆達)    Graduate Institute: Graduate Institute of Network Learning Technology
Thesis Title: 整合預測分析與學習儀表板以提升準時畢業率:以印尼高等教育為例
(Integrating Predictive Analytics and Learning Dashboards to Improve Graduation Timeliness: A Study of Higher Education in Indonesia)
Related Theses
★ Design and Application of an Adaptive Learning Algorithm System Based on the Spacing Effect and Knowledge Tracing: A Case of TOEIC English Learning
★ Combining a Socially Shared Regulation Learning Platform with Learning-by-Teaching Course Design to Enhance University Students' Visual Data Analysis Skills and Regulated Learning
★ Applying a Deep Knowledge Tracing Model to a Programming Learning System
★ A Reading Companion Combining a Chatbot and a Recommender System for Elementary School Reading
★ Applying a Visualized Reading-Process System to Modeled Sustained Silent Reading in Elementary Schools
★ A Self-Regulation Dashboard Based on Text-Based Programming Logs: Exploring Effects on Programming Learning Outcomes
★ Effects of Combining a Redesigned Flipped Classroom Model with a Visual Analytics System on Programming Learning
★ Combining Visual Dashboards and Collaboration Scripts to Support VR Co-Creation Activities: Exploring Elementary School Students' Behavioral, Emotional, and Cognitive Engagement
★ Effects of a Project Management Platform with Visual Dashboards on In-Service Students' Project Skills and Data Analysis Learning
★ Applying Project-Based Learning and a Regulated-Learning Dashboard in an In-Service Data Visualization Course
★ An Inquiry-Based Learning Companion System with Generative AI to Enhance Graduate Students' Data Visualization Literacy
★ Application and Evaluation of a Learning Companion System Combining Generative AI and the 4F Dynamic Debriefing Cycle for Elementary School Reading
★ Applying Exponential Smoothing for Short-Term Learning Performance Prediction and Building a Learning-Portfolio Dashboard System
★ A Study of a Generative-Model-Assisted Question Generation Learning System in Elementary School Social Studies Courses
Files: Full text available in the system after 2029-07-26.
Abstract (Chinese)
For higher education institutions and students alike, completing a degree on schedule is an important indicator: it reflects the efficiency of the educational system and the achievement of students. However, identifying and addressing the causes of delayed graduation remains a difficult task. This study, situated in the field of educational data mining, applies machine learning techniques to investigate which demographic and learning-behavior factors influence students' on-time graduation. It also builds a learning analytics dashboard that provides key stakeholders, including educational administrators, teachers, and students, with graduation-time predictions and data visualization functions.
The dataset comes from an engineering department at a higher education institution in Indonesia and covers 133 students across four academic years, from 2019 to 2023; after data cleaning, these records were used for the study. The project followed the CRISP-DM methodology for educational data mining and the waterfall model for building the learning analytics dashboard system. Both supervised and unsupervised machine learning techniques were used. In the supervised stage, multiple models, namely Decision Tree, kNN, SVM, Naïve Bayes, Random Forest, Logistic Regression, Gradient Boosting, Stochastic Gradient Descent, and Neural Network, were developed to predict students' on-time graduation. In the unsupervised stage, the K-Means algorithm divided the students into three clusters. Finally, the system deployed on the web was evaluated against the ISO/IEC 25010 standard under the WebQEM framework in terms of usability, functionality, efficiency, and reliability.
The results indicate that cumulative GPA (CGPA), fourth-semester GPA, and performance in programming, social science, and English proficiency tests are the main academic variables affecting on-time graduation. Among demographic variables, gender, parents' occupation, high school major, and extracurricular involvement also significantly affect the likelihood of graduating on time. In the modeling stage, the Random Forest model outperformed the others on the evaluation metrics, achieving 85% classification accuracy and an 88% AUC (area under the ROC curve). Efficiency testing showed an average performance score of 82.2% and a structure score of 87.6%, corresponding to a Grade B on GTmetrix. Reliability testing, conducted as stress tests on the deployed website, achieved a 100% success rate under all conditions. Functionality evaluation through black-box testing by senior software engineers likewise showed a high success rate of 99.4%. In addition, results of a usability questionnaire indicate that the developed system is useful, easy to learn, easy to use, and satisfying for both educators and students. Overall, this study offers valuable insights into the key factors of on-time graduation and into the developed system, and presents conclusions and suggestions for future research.
Abstract (English)
Graduating on schedule is a critical milestone for students in higher education institutions, reflecting both institutional effectiveness and student success. However, identifying and addressing the factors that may delay timely completion poses significant challenges. This educational data mining study investigates the demographic and learning-performance factors associated with on-time graduation by integrating machine learning approaches. It also develops a learning analytics dashboard that forecasts on-time graduation and provides data visualization resources for educational stakeholders, including school officials, teachers, and students.
The dataset was collected from the academic system of an engineering department at a higher education institution in Indonesia. After data cleaning, the records of 133 students spanning four academic years (2019 to 2023) were used. The educational data mining process followed CRISP-DM (Cross-Industry Standard Process for Data Mining), while the learning analytics dashboard system was developed using the waterfall model. Both supervised and unsupervised machine learning were applied. For supervised learning, several models were built to predict on-time graduation: Decision Tree, kNN, SVM, Naïve Bayes, Random Forest, Logistic Regression, Gradient Boosting, Stochastic Gradient Descent, and Neural Network. For unsupervised learning, the K-Means algorithm divided the students into three clusters. Finally, the developed system, deployed as a website, was assessed against the ISO/IEC 25010 standard using the WebQEM factors of usability, functionality, efficiency, and reliability.
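To make the modeling workflow above concrete, the following is a minimal sketch assuming a scikit-learn toolchain and a cleaned tabular dataset; the thesis's actual research tools are listed in Section 3-5 and may differ, and the file and column names used here (students.csv, on_time) are hypothetical placeholders.

```python
# Minimal sketch of the supervised + unsupervised workflow described in the
# abstract, assuming scikit-learn and a cleaned, numeric table of student records.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

df = pd.read_csv("students.csv")    # hypothetical cleaned dataset (133 students)
X = df.drop(columns=["on_time"])    # demographic + learning-performance features
y = df["on_time"]                   # 1 = graduated on time, 0 = delayed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Supervised stage: three of the nine classifier families named in the abstract.
models = {
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=42),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "gradient_boosting": GradientBoostingClassifier(random_state=42),
}
for model in models.values():
    model.fit(X_train, y_train)

# Unsupervised stage: K-Means with k=3, matching the three student clusters.
df["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(X)
```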
The results showed that CGPA, fourth-semester GPA, and scores in Programming, Social Science, and English proficiency were the learning-performance variables most important to on-time graduation. Among demographic variables, gender, parents' occupation, high school major, and extracurricular involvement had the strongest influence on on-time graduation. In the modeling process, Random Forest outperformed the other models on the evaluation metrics, with 85% classification accuracy and 88% AUC (area under the ROC curve). For system performance, the efficiency tests showed an average Performance Score of 82.2% and an average Structure Score of 87.6%, an overall GTmetrix Grade B. The reliability test, stress testing of the deployed website, delivered a 100% success rate across all scenarios. Functionality testing, carried out as black-box testing by experienced software engineers, produced a 99.4% success rate. Results from a usability questionnaire provided evidence that the developed system is considered useful, easy to use, easy to learn, and satisfying for both educators and students. Overall, this study contributes valuable implications regarding on-time graduation factors and the developed system, along with conclusions and suggestions for future research.
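The two headline metrics reported above, classification accuracy and AUC, can be computed as in the short sketch below; this again assumes the scikit-learn setup and the models dict from the previous sketch rather than the thesis's actual evaluation scripts.

```python
# Evaluating each trained classifier with the metrics reported in the abstract:
# classification accuracy and AUC (area under the ROC curve); cf. Figure 31 for
# the Random Forest confusion matrix.
from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score

for name, model in models.items():
    y_pred = model.predict(X_test)
    y_prob = model.predict_proba(X_test)[:, 1]  # probability of the positive class
    print(f"{name}: accuracy={accuracy_score(y_test, y_pred):.2%}, "
          f"AUC={roc_auc_score(y_test, y_prob):.2%}")

print(confusion_matrix(y_test, models["random_forest"].predict(X_test)))
```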
Keywords (Chinese) ★ On-time graduation (準時畢業)
★ Educational data mining (教育數據探勘)
★ Institutional research (校務研究)
★ Learning analytics dashboard (學習分析儀表板)
★ Machine learning (機器學習)
Keywords (English) ★ On-time graduation
★ Educational data mining
★ Institutional research
★ Learning analytics dashboard
★ Machine learning
Table of Contents
Chinese Abstract i
Abstract iii
Acknowledgement v
Table of Contents vi
List of Figures ix
List of Tables x
List of Appendices xi
Chapter 1 Introduction 1
1-1 Background and Motivation 1
1-2 Purpose 3
1-3 Research Questions 3
Chapter 2 Literature Review 4
2-1 Educational Data Mining and Learning Analytics 4
2-1-1 Educational Data Mining 4
2-1-2 Approaches in Educational Data Mining 6
2-1-3 Educational Data Mining-Related Applications 10
2-2 Institutional Research 12
2-2-1 Academic Performance and Graduation 13
2-2-2 Socio-Economic 13
2-2-3 Educational Background 14
2-2-4 Admission Channels 15
2-2-5 Students’ Activities 16
2-3 Data-Driven Decision Dashboard 16
2-3-1 Learning Analytics Dashboard 17
2-3-2 Visual Analysis Definition 18
2-3-3 Visual Analysis Advantages and Disadvantages 19
2-3-4 Visual Analysis for Education 21
2-4 Assessment of Software Quality 22
Chapter 3 Research Method 25
3-1 Type of Research 25
3-2 Research Process 25
3-2-1 First Phase Process 25
3-2-2 Second Phase Process 29
3-3 Research Subject 31
3-4 Research Variables 31
3-4-1 Independent Variables 32
3-4-2 Dependent Variables 34
3-5 Research Tools 35
3-6 Data Collecting and Processing 38
3-6-1 Preliminary Data 38
3-6-2 Quantitative Data 42
3-6-3 Qualitative Data 45
Chapter 4 System Design and Implementation 46
4-1 System Design 46
4-2 Implementation 47
Chapter 5 Results and Analysis 56
5-1 Analysis of Educational Data 56
5-1-1 Feature Importance Analysis 56
5-1-2 Relevant Variables Analysis 59
5-2 Analysis of Model 68
5-3 Analysis of System Performance 71
5-4 Analysis of System Perception 74
5-4-1 Quantitative Analysis 74
5-4-2 Qualitative Analysis 80
Chapter 6 Discussion 83
6-1 Social Science Might be More Important Than Science Courses 83
6-2 Extracurricular Involvement Has Positive Correlation with On-Time Graduation 84
6-3 Educational Implications 85
Chapter 7 Conclusions and Future Work 87
7-1 Conclusions 87
7-1-1 Extracurricular and CGPA are the Most Significant Factors 87
7-1-2 Random Forest Outperformed Other Models 88
7-1-3 System Passed WebQEM Standard 88
7-1-4 System Perceptions of Students and Educators are Positive 89
7-2 Research Limitations and Future Work 91
References 92
Appendix 100



List of Figures
Figure 1. ISO/IEC 25010 23
Figure 2. WebQEM Model 23
Figure 3. CRISP-DM process (Wirth & Hipp, 2000) 26
Figure 4. CRISP-DM framework with horizontal sequence 26
Figure 5. CRISP-DM stages and processes 27
Figure 6. Second Phase of Research Process 30
Figure 7. Visual of Modeling Process 40
Figure 8. Preliminary Data Processing and Deployment Flow 42
Figure 9. System Design Architecture 46
Figure 10. Main Dashboard Feature 47
Figure 11. Distribution Feature 48
Figure 12. Relationship Feature 50
Figure 13. Clustering Feature 51
Figure 14. On-Time Graduation Prediction Feature 52
Figure 15. On-Time Prediction 53
Figure 16. Data Visualization Explorer Feature 53
Figure 17. Custom Data Visualization Explorer Feature 55
Figure 18. Random Forest Feature Importance MDI Graph 59
Figure 19. On-Time Graduation Distribution 60
Figure 20. Average Semester 4 GPA on On-Time Graduation 61
Figure 21. Average CGPA on On-Time Graduation 62
Figure 22. Stacked Bar of Extracurricular Distribution 62
Figure 23. Programming Score Distribution 63
Figure 24. Gender Distribution on On-Time Graduation 64
Figure 25. TOEFL Score Distribution 65
Figure 26. Social Science Score Distribution 66
Figure 27. Father’s Job Distribution on On-Time Graduation 67
Figure 28. Mother’s Job Distribution on On-Time Graduation 67
Figure 29. High School Major Distribution on On-Time Graduation 68
Figure 30. AUC (Area Under ROC Curve) Graph 70
Figure 31. Confusion Matrix of Random Forest Model on Testing Data 71
Figure 32. Scenario 20 VUs Load Testing Graph 74
Figure 33. Scenario 50 VUs Load Testing Graph 74

List of Tables
Table 1. Students' Demographic Attributes 32
Table 2. Students' Academic Attributes 34
Table 3. Target Feature 35
Table 4. Course Categorization 39
Table 5. USE Questionnaire 44
Table 6. Criteria of Cronbach’s α 45
Table 7. Feature Importance Score using Logistic Regression 57
Table 8. Feature Importance Score using Random Forest 58
Table 9. Model Evaluation Metrics 69
Table 10. GTmetrix Results 72
Table 11. Reliability Load Testing Scenarios 73
Table 12. Case Processing Summary of Reliability Statistics 75
Table 13. Questionnaire Reliability 75
Table 14. Reliability per Dimension 76
Table 15. Usefulness Dimension Analysis 76
Table 16. Ease of Use Dimension Analysis 77
Table 17. Ease of Learning Dimension Analysis 78
Table 18. Satisfaction Dimension Analysis 79
Table 19. Questionnaire Dimension Recap 80


List of Appendices
Appendix 1. Functionality Pass-Fail Decision 100
Appendix 2. Speed Testing Result of Homepage 101
Appendix 3. Speed Testing Result of Dashboard Page 102
Appendix 4. Load Chart of Dashboard Page 102
Appendix 5. Speed Testing Result of Prediction Page 103
Appendix 6. Speed Testing Result of Data Viz Page 103
Appendix 7. Speed Testing Result of Custom Data Page 103
Appendix 8. GTMetrix Analysis Graph of Homepage 104
Appendix 9. Scenario 20 VUs Load Testing 105
Appendix 10. Scenario 50 VUs Load Testing 105
Appendix 11. Screenshot of Usability Questionnaire Cover 106
Appendix 12. Screenshot of Usability Questionnaire Instruction 107
Appendix 13. Screenshot of Usability Questionnaire Identity 107
Appendix 14. Screenshot of Usability Questionnaire 108
Appendix 15. Open-ended Questions 108
Appendix 16. Independent t-test Educators and Students Perceptions 109
References
Agrawal, R., Imielinski, T., & Swami, A. (1993). Database mining: A performance perspective. IEEE Transactions on Knowledge and Data Engineering, 5(6), 914–925. https://doi.org/10.1109/69.250074
Agrawal, R., & Srikant, R. (1995). Mining sequential patterns. Proceedings of the Eleventh International Conference on Data Engineering, 3–14. https://doi.org/10.1109/ICDE.1995.380415
Aini, Q., Fetrina, E., & Epriani, N. C. (2023). WebQual 4.0 Plus: An Approach to Measure Customer Satisfaction toward Website Quality. 2023 11th International Conference on Cyber and IT Service Management (CITSM), 1–6. https://doi.org/10.1109/CITSM60085.2023.10455371
Akçapınar, G., Altun, A., & Aşkar, P. (2019). Using learning analytics to develop early-warning system for at-risk students. International Journal of Educational Technology in Higher Education, 16(1), 40. https://doi.org/10.1186/s41239-019-0172-z
Ariyani, S., Sudarma, M., & Wicaksana, P. A. (2021). Analysis of Functional Suitability and Usability in Sales Order Procedure to Determine Management Information System Quality. INTENSIF: Jurnal Ilmiah Penelitian Dan Penerapan Teknologi Sistem Informasi, 5(2), 234–248. https://doi.org/10.29407/intensif.v5i2.15537
Arnold, K. E., Lonn, S., & Pistilli, M. D. (2014). An exercise in institutional reflection: The learning analytics readiness instrument (LARI). Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 163–167. https://doi.org/10.1145/2567574.2567621
Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: Using learning analytics to increase student success. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 267–270. https://doi.org/10.1145/2330601.2330666
Asthana, A., & Olivieri, J. (2009). Quantifying software reliability and readiness. 2009 IEEE International Workshop Technical Committee on Communications Quality and Reliability, 1–6. https://doi.org/10.1109/CQR.2009.5137352
Baker, R. S. (2019). Challenges for the future of educational data mining: The Baker Learning Analytics Prizes. Journal of Educational Data Mining, 11(1), 1–17.
Baker, R., & Siemens, G. (2014). Educational Data Mining and Learning Analytics. In R. K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences (2nd ed., pp. 253–272). Cambridge University Press. https://doi.org/10.1017/CBO9781139519526.016
Bakker, J., Denessen, E., & Brus‐Laeven, M. (2007). Socio‐economic background, parental involvement and teacher perceptions of these in relation to pupil achievement. Educational Studies, 33(2), 177–192. https://doi.org/10.1080/03055690601068345
Bañeres, D., Rodríguez, M. E., Guerrero-Roldán, A. E., & Karadeniz, A. (2020). An Early Warning System to Detect At-Risk Students in Online Higher Education. Applied Sciences, 10(13), 4427. https://doi.org/10.3390/app10134427
Bergdahl, N., Nouri, J., Karunaratne, T., Afzaal, M., & Saqr, M. (2020). Learning Analytics for Blended Learning: A Systematic Review of Theory, Methodology, and Ethical Considerations. International Journal of Learning Analytics and Artificial Intelligence for Education (iJAI), 2(2), 46. https://doi.org/10.3991/ijai.v2i2.17887
Blum, A., & Mitchell, T. (1998). Combining labeled and unlabeled data with co-training. Proceedings of the Eleventh Annual Conference on Computational Learning Theory, 92–100.
Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5–32. https://doi.org/10.1023/A:1010933404324
Budiman, E., Puspitasari, N., Taruk, M., & Maria, E. (2019). Webqual 4.0 and ISO/IEC 9126 Method for website quality evaluation of higher education.
Cheung, S. K. S., Kwok, L. F., Phusavat, K., & Yang, H. H. (2021). Shaping the future learning environments with smart elements: Challenges and opportunities. International Journal of Educational Technology in Higher Education, 18(1), 16, s41239-021-00254–1. https://doi.org/10.1186/s41239-021-00254-1
Corbett, A. T., & Anderson, J. R. (1995). Knowledge tracing: Modeling the acquisition of procedural knowledge. User Modelling and User-Adapted Interaction, 4(4), 253–278. https://doi.org/10.1007/BF01099821
Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine Learning, 20(3), 273–297. https://doi.org/10.1007/BF00994018
Costa, R. S., Tan, Q., Pivot, F., Zhang, X., & Wang, H. (2021). Personalized and adaptive learning: Educational practice and technological impact.
Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297–334.
Cunningham, P., Cord, M., & Delany, S. J. (2008). Supervised Learning. In M. Cord & P. Cunningham (Eds.), Machine Learning Techniques for Multimedia (pp. 21–49). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-75171-7_2
Defrizal, D., Redaputri, A. P., Narundana, V. T., Nurdiawansyah, N., & Dharmawan, Y. Y. (2022). The Merdeka Belajar Kampus Merdeka Program: An Analysis of the Success Factors. Nusantara: Jurnal Pendidikan Indonesia, 2(1), 123–140. https://doi.org/10.14421/njpi.2022.v2i1-8
Diasti, K. S., & Mbato, C. L. (2020). Exploring Undergraduate Students’ Motivation-regulation Strategies in Thesis Writing. Language Circle: Journal of Language and Literature, 14(2), 176–183. https://doi.org/10.15294/lc.v14i2.23450
Direktorat Jenderal Pendidikan Tinggi Kementerian Pendidikan dan Kebudayaan. (2020). Buku Panduan Merdeka Belajar—Kampus Merdeka. Direktorat Jenderal Pendidikan Tinggi Kemendikbud RI.
Ferguson, R., & Buckingham Shum, S. (2012). Social learning analytics: Five approaches. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 23–33. https://doi.org/10.1145/2330601.2330616
Ferguson, R., & Clow, D. (2015). Examining engagement: Analysing learner subpopulations in massive open online courses (MOOCs). Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, 51–58. https://doi.org/10.1145/2723576.2723606
Friedman, J. H. (2001). Greedy function approximation: A gradient boosting machine. The Annals of Statistics, 29(5). https://doi.org/10.1214/aos/1013203451
Guerra, J., Ortiz-Rojas, M., Zúñiga‐Prieto, M. A., Scheihing, E., Jiménez, A., Broos, T., De Laet, T., & Verbert, K. (2020). Adaptation and evaluation of a learning analytics dashboard to improve academic support at three Latin American universities. British Journal of Educational Technology, 51(4), 973–1001. https://doi.org/10.1111/bjet.12950
Hastie, T., Tibshirani, R., & Friedman, J. (2009). The elements of statistical learning: Data mining, inference, and prediction (2nd ed.). Springer.
Herodotou, C., Hlosta, M., Boroowa, A., Rienties, B., Zdrahal, Z., & Mangafa, C. (2019). Empowering online teachers through predictive learning analytics. British Journal of Educational Technology, 50(6), 3064–3079. https://doi.org/10.1111/bjet.12853
Hochreiter, S., & Schmidhuber, J. (1997). Long Short-Term Memory. Neural Computation, 9(8), 1735–1780. https://doi.org/10.1162/neco.1997.9.8.1735
Hosmer, D. W., Lemeshow, S., & Sturdivant, R. X. (2013). Applied logistic regression (Third edition). Wiley.
Hung, H.-C., Liu, I.-F., Liang, C.-T., & Su, Y.-S. (2020). Applying Educational Data Mining to Explore Students’ Learning Patterns in the Flipped Learning Approach for Coding Education. Symmetry, 12(2), 213. https://doi.org/10.3390/sym12020213
Jain, A. K. (2010). Data clustering: 50 years beyond K-means. Pattern Recognition Letters, 31(8), 651–666. https://doi.org/10.1016/j.patrec.2009.09.011
Huang, J., & Ling, C. X. (2005). Using AUC and accuracy in evaluating learning algorithms. IEEE Transactions on Knowledge and Data Engineering, 17(3), 299–310. https://doi.org/10.1109/TKDE.2005.50
Kabathova, J., & Drlik, M. (2021). Towards Predicting Student’s Dropout in University Courses Using Different Machine Learning Techniques. Applied Sciences, 11(7), 3130. https://doi.org/10.3390/app11073130
Keim, D., Andrienko, G., Fekete, J.-D., Görg, C., Kohlhammer, J., & Melançon, G. (2008). Visual Analytics: Definition, Process, and Challenges. In A. Kerren, J. T. Stasko, J.-D. Fekete, & C. North (Eds.), Information Visualization (Vol. 4950, pp. 154–175). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-70956-5_7
Kementerian Pendidikan dan Kebudayaan. (2024). Analitik PTNBH - Direktorat Kelembagaan Kemdikbud. https://sinta.kemdikbud.go.id/ptnbhanalytics/v2/affiliations/detail/430
Krishnapatria, K. (2021). Merdeka Belajar-Kampus Merdeka (MBKM) Curriculum in English Studies Program: Challenges and Opportunities. ELF in Focus, 4(1).
Ling, C. X., Huang, J., & Zhang, H. (2003). AUC: A statistically consistent and more discriminating measure than accuracy. Proceedings of the Eighteenth International Joint Conference on Artificial Intelligence (IJCAI-03).
Lund, A. M. (2001). Measuring usability with the USE questionnaire. Usability Interface, 8(2), 3–6.
França, J. M. S., & Soares, M. S. (2015). SOAQM: Quality model for SOA applications based on ISO 25010. Proceedings of the 17th International Conference on Enterprise Information Systems, 60–70. https://doi.org/10.5220/0005369100600070
Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan. Educational Technology & Society, 15(3), 149–163.
Maqsood, R., Ceravolo, P., Ahmad, M., & Sarfraz, M. S. (2023). Examining students’ course trajectories using data mining and visualization approaches. International Journal of Educational Technology in Higher Education, 20(1), 55. https://doi.org/10.1186/s41239-023-00423-4
Marbouti, F., Ulas, J., & Wang, C.-H. (2021). Academic and Demographic Cluster Analysis of Engineering Student Success. IEEE Transactions on Education, 64(3), 261–266. https://doi.org/10.1109/TE.2020.3036824
Martínez, I. M., Youssef-Morgan, C. M., Chambel, M. J., & Marques-Pinto, A. (2019). Antecedents of academic performance of university students: Academic engagement and psychological capital resources. Educational Psychology, 39(8), 1047–1067. https://doi.org/10.1080/01443410.2019.1623382
Mat Nawi, F. A., Ahmad, N. L., Abdullah, M. Z., Omar, N. F., Dzulkarnain, N., Abu Bakar, S. M. S., & Mohd Fauzi, M. W. (2023). The Regression Analysis of Factors Contribute to University Students’ Academic Performance. Information Management and Business Review, 15(4(SI)I), 456–464. https://doi.org/10.22610/imbr.v15i4(SI)I.3620
Mitchell, T. M. (1997). Machine Learning. McGraw-Hill.
Munir, J., Faiza, M., Jamal, B., Daud, S., & Iqbal, K. (2023). The Impact of Socio-economic Status on Academic Achievement. Journal of Social Sciences Review, 3(2), 695–705. https://doi.org/10.54183/jssr.v3i2.308
Nielsen, J. (1994). Usability Engineering. Morgan Kaufmann Publishers Inc.
Olsina, L., & Rossi, G. (2002). Measuring Web application quality with WebQEM. IEEE Multimedia, 9(4), 20–29. https://doi.org/10.1109/MMUL.2002.1041945
Papamitsiou, Z., & Economides, A. A. (2014). Learning Analytics and Educational Data Mining in Practice: A Systematic Literature Review of Empirical Evidence. Educational Technology & Society, 17(4), 49–64.
Parra, D., & Brusilovsky, P. (2015). User-controllable personalization: A case study with SetFusion. International Journal of Human-Computer Studies, 78, 43–67. https://doi.org/10.1016/j.ijhcs.2015.01.007
Pereira, F. D., Oliveira, E. H. T., Oliveira, D. B. F., Cristea, A. I., Carvalho, L. S. G., Fonseca, S. C., Toda, A., & Isotani, S. (2020). Using learning analytics in the Amazonas: Understanding students’ behaviour in introductory programming. British Journal of Educational Technology, 51(4), 955–972. https://doi.org/10.1111/bjet.12953
Picciano, A. G. (2012). The Evolution of Big Data and Learning Analytics in American Higher Education. Online Learning, 16(3). https://doi.org/10.24059/olj.v16i3.267
Prinsloo, P., & Slade, S. (2017). An elephant in the learning analytics room: The obligation to act. Proceedings of the Seventh International Learning Analytics & Knowledge Conference, 46–55. https://doi.org/10.1145/3027385.3027406
Purwoningsih, T., Santoso, H. B., & Hasibuan, Z. A. (2020). Data Analytics of Students’ Profiles and Activities in a Full Online Learning Context. 2020 Fifth International Conference on Informatics and Computing (ICIC), 1–8. https://doi.org/10.1109/ICIC50835.2020.9288540
Quinlan, J. R. (1986). Induction of decision trees. Machine Learning, 1(1), 81–106. https://doi.org/10.1007/BF00116251
Rahman, S. R., Islam, Md. A., Akash, P. P., Parvin, M., Moon, N. N., & Nur, F. N. (2021). Effects of co-curricular activities on student’s academic performance by machine learning. Current Research in Behavioral Sciences, 2, 100057. https://doi.org/10.1016/j.crbeha.2021.100057
Rienties, B., Lewis, T., McFarlane, R., Nguyen, Q., & Toetenel, L. (2018). Analytics in online and offline language learning environments: The role of learning design to understand student online engagement. Computer Assisted Language Learning, 31(3), 273–293. https://doi.org/10.1080/09588221.2017.1401548
Romero, C., & Ventura, S. (2010). Educational Data Mining: A Review of the State of the Art. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 40(6), 601–618. https://doi.org/10.1109/TSMCC.2010.2053532
Romero, C., & Ventura, S. (2020). Educational data mining and learning analytics: An updated survey. WIREs Data Mining and Knowledge Discovery, 10(3), e1355. https://doi.org/10.1002/widm.1355
Saito, T., & Rehmsmeier, M. (2015). The Precision-Recall Plot Is More Informative than the ROC Plot When Evaluating Binary Classifiers on Imbalanced Datasets. PLOS ONE, 10(3), e0118432. https://doi.org/10.1371/journal.pone.0118432
Scudder, H. (1965). Probability of error of some adaptive pattern-recognition machines. IEEE Transactions on Information Theory, 11(3), 363–371. https://doi.org/10.1109/TIT.1965.1053799
Shiao, Y.-T., Chen, C.-H., Wu, K.-F., Chen, B.-L., Chou, Y.-H., & Wu, T.-N. (2023). Reducing dropout rate through a deep learning model for sustainable education: Long-term tracking of learning outcomes of an undergraduate cohort from 2018 to 2021. Smart Learning Environments, 10(1), 55. https://doi.org/10.1186/s40561-023-00274-6
Shneiderman, B. (2019). The Emergence of Human-Computer Interaction. In B. Shneiderman, Encounters with HCI Pioneers (pp. 1–23). Springer International Publishing. https://doi.org/10.1007/978-3-031-02224-1_1
Shulruf, B., Hattie, J., Turneraq, R., Tumen, S., & Li, M. (2009). Enhancing equal opportunities in higher education: A new merit-based admission model. Cypriot Journal of Educational Sciences.
Shute, V. J., & Zapata-Rivera, D. (2012). Adaptive Educational Systems. In P. J. Durlach & A. M. Lesgold (Eds.), Adaptive Technologies for Training and Education (1st ed., pp. 7–27). Cambridge University Press. https://doi.org/10.1017/CBO9781139049580.004
Siemens, G., & Baker, R. S. J. D. (2012). Learning analytics and educational data mining: Towards communication and collaboration. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 252–254. https://doi.org/10.1145/2330601.2330661
Syahidi, A. A., Asyikin, A. N., & Subandi, S. (2019). Measuring User Assessments and Expectations: The Use of WebQual 4.0 Method and Importance-Performance Analysis (IPA) to Evaluate the Quality of School Websites. Journal of Information Technology and Computer Science, 4(1), 76–89. https://doi.org/10.25126/jitecs.20194198
Tsai, S.-C., Chen, C.-H., Shiao, Y.-T., Ciou, J.-S., & Wu, T.-N. (2020). Precision education with statistical learning and deep learning: A case study in Taiwan. International Journal of Educational Technology in Higher Education, 17(1), 12. https://doi.org/10.1186/s41239-020-00186-2
VanLehn, K. (2011). The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educational Psychologist, 46(4), 197–221. https://doi.org/10.1080/00461520.2011.611369
Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning Analytics Dashboard Applications. American Behavioral Scientist, 57(10), 1500–1509. https://doi.org/10.1177/0002764213479363
Von Davier, M., & Yamamoto, K. (2007). Mixture-Distribution and HYBRID Rasch Models. In C. H. Carstensen (Ed.), Multivariate and Mixture Distribution Rasch Models (pp. 99–115). Springer New York. https://doi.org/10.1007/978-0-387-49839-3_6
Wang, M., & Fredricks, J. A. (2014). The Reciprocal Links Between School Engagement, Youth Problem Behaviors, and School Dropout During Adolescence. Child Development, 85(2), 722–737. https://doi.org/10.1111/cdev.12138
Wibowo, A. T., & Fitrianah, D. (2018). A k-nearest algorithm based application to predict SNMPTN acceptance for high school students in Indonesia. International Research Journal of Computer Science, 5(1).
Wirth, R., & Hipp, J. (2000). CRISP-DM: Towards a standard process model for data mining. Proceedings of the Fourth International Conference on the Practical Applications of Knowledge Discovery and Data Mining, 29–39.
Yağcı, M. (2022). Educational data mining: Prediction of students’ academic performance using machine learning algorithms. Smart Learning Environments, 9(1), 11. https://doi.org/10.1186/s40561-022-00192-z
Yang, T.-C., Liu, Y.-L., & Wang, L.-C. (2021). Using an Institutional Research Perspective to Predict Undergraduate Students’ Career Decisions in the Practice of Precision Education. Educational Technology & Society, 24(1), 280–296.
Yau, C., Karimzadeh, M., Surakitbanharn, C., Elmqvist, N., & Ebert, D. S. (2019). Bridging the Data Analysis Communication Gap Utilizing a Three‐Component Summarized Line Graph. Computer Graphics Forum, 38(3), 375–386. https://doi.org/10.1111/cgf.13696
Young, T., Hazarika, D., Poria, S., & Cambria, E. (2018). Recent Trends in Deep Learning Based Natural Language Processing (arXiv:1708.02709). arXiv. http://arxiv.org/abs/1708.02709
Advisor: Hui-Chun Hung (洪暉鈞)    Review Date: 2024-07-29
