References
[1] 衛生福利部中央健康保險署, 2019-2020全民健康保險年報, 衛生福利部中央健康保險署, December 2018.
[2] 衛生福利部中央健康保險署, 「110年1月份全民健康保險業務執行報告」, 衛生福利部全民健康保險會第5屆110年第1次委員會議, pp. 5-22, January 2021.
[3] 司法院大法官釋字第472號 (Judicial Yuan Interpretation No. 472).
[4] Harrington, P., “Machine learning in action.”, Simon and Schuster, 2012.
[5] 曾婉菁, 機器學習探究, 印刷科技, 2018.
[6] Samuel, A. L., “Some studies in machine learning using the game of checkers.”, IBM Journal of research and development, 3(3), pp. 210-229, 1959.
[7] 陳淑雲 et al., 「健保費欠費經行政執行各階段收回成效之探討-以中區業務組投保單位為例」, 衛生福利部研究發展計畫, 2012.
[8] 鄭舒琪, 「運用健保資料庫分析全民健保第一類投保單位欠費特性」, Master's thesis, 國立交通大學, 2017.
[9] 江碧君 et al., 「健保解卡對民營機構欠費及其負責人醫療利用影響」, 衛生福利部研究發展計畫, 2019.
[10] 陳雅珊 et al., 「影響欠費單位移送執行之模型及其評估-以高屏業務組為例」, 衛生福利部研究發展計畫, 2019.
[11] Sun, Z., Wiering, M. A., & Petkov, N., “Classification system for mortgage arrear management.”, 2014 IEEE Conference on Computational Intelligence for Financial Engineering & Economics (CIFEr), pp. 489-496, IEEE, March 2014.
[12] Wang, J. M., & Wen, Y. Q., “Application of data mining in arrear risks prediction of power customer.”, 2008 International Symposium on Knowledge Acquisition and Modeling, pp. 206-210, IEEE, December 2008.
[13] Feldman, D., & Gross, S., “Mortgage default: classification trees analysis.”, The Journal of Real Estate Finance and Economics, 30(4), pp. 369-396, 2005.
[14] 鄭茂松, 「利用資料探勘技術建立破產預測模型」, Master's thesis, 國立中央大學, 2016.
[15] Lee, T. S., Chiu, C. C., Chou, Y. C., & Lu, C. J., “Mining the customer credit using classification and regression tree and multivariate adaptive regression splines.”, Computational Statistics & Data Analysis, 50(4), pp. 1113-1130, 2006.
[16] Baesens, B., Van Gestel, T., Viaene, S., Stepanova, M., Suykens, J., & Vanthienen, J., “Benchmarking state-of-the-art classification algorithms for credit scoring.”, Journal of the operational research society, 54(6), pp. 627-635, 2003.
[17] Atiya, A. F., “Bankruptcy prediction for credit risk using neural networks: A survey and new results.”, IEEE Transactions on neural networks, 12(4), pp. 929-935, 2001.
[18] Kurt, I., Ture, M., & Kurum, A. T., “Comparing performances of logistic regression, classification and regression tree, and neural networks for predicting coronary artery disease.”, Expert systems with applications, 34(1), pp. 366-374, 2008.
[19] Naraei, P., Abhari, A., & Sadeghian, A., “Application of multilayer perceptron neural networks and support vector machines in classification of healthcare data.”, 2016 Future Technologies Conference (FTC), pp. 848-852, IEEE, December 2016.
[20] Pal, M., “Random forest classifier for remote sensing classification.”, International journal of remote sensing, 26(1), pp. 217-222, 2005.
[21] Ribeiro, M. H. D. M., & dos Santos Coelho, L., “Ensemble approach based on bagging, boosting and stacking for short-term prediction in agribusiness time series.”, Applied Soft Computing, 86, 105837, 2020.
[22] Chan, J. C. W., & Paelinckx, D., “Evaluation of Random Forest and Adaboost tree-based ensemble classification and spectral band selection for ecotope mapping using airborne hyperspectral imagery.”, Remote Sensing of Environment, 112(6), pp. 2999-3011, 2008.
[23] Kira, K., & Rendell, L. A., “A practical approach to feature selection.”, Machine learning proceedings 1992, pp. 249-256, Morgan Kaufmann, 1992.
[24] Shannon, C. E., “A mathematical theory of communication.”, The Bell system technical journal, 27(3), pp. 379-423, 1948.
[25] Holland, J. H., “Adaptation in natural and artificial systems: an introductory analysis with applications to biology, control, and artificial intelligence.”, University of Michigan Press, 1975.
[26] Cunningham, P., Cord, M., & Delany, S. J., “Supervised learning.”, Machine learning techniques for multimedia, pp. 21-49, Springer, Berlin, Heidelberg, 2008.
[27] Safavian, S. R., & Landgrebe, D., “A survey of decision tree classifier methodology.”, IEEE transactions on systems, man, and cybernetics, 21(3), pp. 660-674, 1991.
[28] Song, Y. Y., & Ying, L. U., “Decision tree methods: applications for classification and prediction.”, Shanghai archives of psychiatry, 27(2), pp. 130-135, 2015.
[29] Breiman, L., Friedman, J., Stone, C. J., & Olshen, R. A., “Classification and regression trees.”, CRC press, 1984.
[30] McCulloch, W. S., & Pitts, W., “A logical calculus of the ideas immanent in nervous activity.”, The bulletin of mathematical biophysics, 5(4), pp. 115-133, 1943.
[31] Werbos, P., “Beyond regression: new tools for prediction and analysis in the behavioral sciences.”, Ph.D. dissertation, Harvard University, 1974.
[32] McClelland, J. L., Rumelhart, D. E., & PDP Research Group, Parallel distributed processing, Vol. 2, pp. 20-21, Cambridge, MA: MIT press, 1986.
[33] Gardner, M. W., & Dorling, S. R., “Artificial neural networks (the multilayer perceptron)—a review of applications in the atmospheric sciences.”, Atmospheric environment, 32(14-15), pp. 2627-2636, 1998.
[34] Cortes, C., & Vapnik, V., “Support-vector networks.”, Machine learning, 20(3), pp. 273-297, 1995.
[35] Meyer, D., Leisch, F., & Hornik, K., “The support vector machine under test.”, Neurocomputing, 55(1-2), pp. 169-186, 2003.
[36] Suthaharan, S., “Support vector machine.”, Machine learning models and algorithms for big data classification, pp. 207-235, Springer, Boston, MA, 2016.
[37] Opitz, D., & Maclin, R., “Popular ensemble methods: An empirical study.”, Journal of artificial intelligence research, 11, pp. 169-198, 1999.
[38] Breiman, L., “Bagging predictors.”, Machine learning, 24(2), pp. 123-140, 1996.
[39] Freund, Y., & Mason, L., “The alternating decision tree learning algorithm.”, Proceedings of the 16th International Conference on Machine Learning (ICML), pp. 124-133, June 1999.
[40] Freund, Y., & Schapire, R. E., “A decision-theoretic generalization of on-line learning and an application to boosting.”, Journal of computer and system sciences, 55(1), pp. 119-139, 1997.
[41] Ho, T. K., “Random decision forests.”, Proceedings of 3rd international conference on document analysis and recognition, Vol. 1, pp. 278-282, IEEE, August 1995.
[42] Breiman, L., “Random forests.”, Machine learning, 45(1), pp. 5-32, 2001.
[43] Priyam, A., Abhijeeta, G. R., Rathee, A., & Srivastava, S., “Comparative analysis of decision tree classification algorithms.”, International Journal of current engineering and technology, 3(2), pp. 334-337, 2013.
[44] Timofeev, R., “Classification and regression trees (CART) theory and applications.”, Humboldt University, Berlin, pp. 1-40, 2004.
[45] Hassoun, M. H., Fundamentals of artificial neural networks, MIT press, 1995.
[46] Anderson, J. A., “An introduction to neural networks.”, MIT press, 1995.
[47] Byvatov, E., & Schneider, G., “Support vector machine applications in bioinformatics.”, Applied bioinformatics, 2(2), pp. 67-77, 2003.
[48] Dietterich, T. G., “Ensemble methods in machine learning.”, International workshop on multiple classifier systems, pp. 1-15, Springer, Berlin, Heidelberg, June 2000.
[49] 曾憲雄 et al., 資料探勘, 旗標科技股份有限公司, 2005.