References
1. Fayyad, U., G. Piatetsky-Shapiro, and P. Smyth, From data mining to knowledge discovery in databases. AI Magazine, 1996. 17(3): p. 37-54.
2. Pyle, D., Data preparation for data mining. 1999: Morgan Kaufmann.
3. Kotsiantis, S.B., D. Kanellopoulos, and P.E. Pintelas, Data preprocessing for supervised leaning. International Journal of Computer Science, 2006. 1(2): p. 111-117.
4. Li, J., et al., Feature selection: A data perspective. ACM Computing Surveys (CSUR), 2017. 50(6): p. 1-45.
5. Brighton, H. and C. Mellish, Advances in instance selection for instance-based learning algorithms. Data Mining and Knowledge Discovery, 2002. 6: p. 153-172.
6. García-Pedrajas, N. and A. de Haro-García, Boosting instance selection algorithms. Knowledge-Based Systems, 2014. 67: p. 342-360.
7. Zhai, Y., Y.-S. Ong, and I.W. Tsang, The emerging "big dimensionality". IEEE Computational Intelligence Magazine, 2014. 9(3): p. 14-26.
8. Jović, A., K. Brkić, and N. Bogunović. A review of feature selection methods with applications. in 2015 38th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO). 2015. IEEE.
9. Liu, H. and R. Setiono. A probabilistic approach to feature selection-a filter solution. in ICML. 1996.
10. Fonti, V. and E. Belitser, Feature selection using lasso. VU Amsterdam Research Paper in Business Analytics, 2017. 30: p. 1-25.
11. Zhao, H., S. Wang, and Z. Wang, Multiclass classification and feature selection based on least squares regression with large margin. Neural Computation, 2018. 30(10): p. 2781-2804.
12. Izetta, J., P.F. Verdes, and P.M. Granitto, Improved multiclass feature selection via list combination. Expert Systems with Applications, 2017. 88: p. 205-216.
13. Cascaro, R.J., B.D. Gerardo, and R.P. Medina. Filter selection methods for multiclass classification. in Proceedings of the 2nd International Conference on Computing and Big Data. 2019.
14. Yijing, L., et al., Adapted ensemble classification algorithm based on multiple classifier system and feature selection for classifying multi-class imbalanced data. Knowledge-Based Systems, 2016. 94: p. 88-104.
15. Xu, J., An extended one-versus-rest support vector machine for multi-label classification. Neurocomputing, 2011. 74(17): p. 3114-3124.
16. Cortes, C. and V. Vapnik, Support-vector networks. Machine Learning, 1995. 20: p. 273-297.
17. Fang, C.L., et al., Instance selection using one‐versus‐all and one‐versus‐one decomposition approaches in multiclass classification datasets. Expert Systems, 2023. 40(6): p. e13217.
18. Sikora, R. and S. Piramuthu, Framework for efficient feature selection in genetic algorithm based data mining. European Journal of Operational Research, 2007. 180(2): p. 723-737.
19. Dash, M. and H. Liu, Feature selection for classification. Intelligent Data Analysis, 1997. 1(1-4): p. 131-156.
20. Chandrashekar, G. and F. Sahin, A survey on feature selection methods. Computers & Electrical Engineering, 2014. 40(1): p. 16-28.
21. Wang, S., et al., Pathological brain detection by artificial intelligence in magnetic resonance imaging scanning (invited review). Progress in Electromagnetics Research, 2016. 156: p. 105-133.
22. Saeys, Y., I. Inza, and P. Larranaga, A review of feature selection techniques in bioinformatics. Bioinformatics, 2007. 23(19): p. 2507-2517.
23. Kraskov, A., H. Stögbauer, and P. Grassberger, Estimating mutual information. Physical Review E, 2004. 69(6): p. 066138.
24. Ding, H., et al., Identification of bacteriophage virion proteins by the ANOVA feature selection and analysis. Molecular BioSystems, 2014. 10(8): p. 2229-2235.
25. Thaseen, I.S. and C.A. Kumar, Intrusion detection model using fusion of chi-square feature selection and multi class SVM. Journal of King Saud University-Computer and Information Sciences, 2017. 29(4): p. 462-472.
26. Urbanowicz, R.J., et al., Relief-based feature selection: Introduction and review. Journal of biomedical informatics, 2018. 85: p. 189-203.
27. Guan, S.-U., Y. Qi, and C. Bao, An incremental approach to MSE-based feature selection. International Journal of Computational Intelligence and Applications, 2006. 6(04): p. 451-471.
28. Krishnan, G.S. and S. Kamath, A novel GA-ELM model for patient-specific mortality prediction over large-scale lab event data. Applied Soft Computing, 2019. 80: p. 525-533.
29. Huang, C.-L. and J.-F. Dun, A distributed PSO–SVM hybrid system with feature selection and parameter optimization. Applied Soft Computing, 2008. 8(4): p. 1381-1391.
30. Mafarja, M. and S. Mirjalili, Whale optimization approaches for wrapper feature selection. Applied Soft Computing, 2018. 62: p. 441-453.
31. Zhu, M. and J. Song, An embedded backward feature selection method for MCLP classification algorithm. Procedia Computer Science, 2013. 17: p. 1047-1054.
32. Tibshirani, R., Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology, 1996. 58(1): p. 267-288.
33. Hoerl, A.E. and R.W. Kennard, Ridge regression: Biased estimation for nonorthogonal problems. Technometrics, 1970. 12(1): p. 55-67.
34. Chen, T. and C. Guestrin. XGBoost: A scalable tree boosting system. in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 2016.
35. Breiman, L., Random forests. Machine Learning, 2001. 45: p. 5-32.
36. Ali, J., et al., Random forests and decision trees. International Journal of Computer Science Issues (IJCSI), 2012. 9(5): p. 272.
37. Olvera-López, J.A., et al., A review of instance selection methods. Artificial Intelligence Review, 2010. 34: p. 133-143.
38. Wilson, D.R. and T.R. Martinez, Reduction techniques for instance-based learning algorithms. Machine learning, 2000. 38: p. 257-286.
39. García-Pedrajas, N., Constructing ensembles of classifiers by means of weighted instance selection. IEEE Transactions on Neural Networks, 2009. 20(2): p. 258-277.
40. Gates, G., The reduced nearest neighbor rule (corresp.). IEEE Transactions on Information Theory, 1972. 18(3): p. 431-433.
41. Wilson, D.L., Asymptotic properties of nearest neighbor rules using edited data. IEEE Transactions on Systems, Man, and Cybernetics, 1972. 2(3): p. 408-421.
42. Tsai, C.-F., et al., Combining feature selection, instance selection, and ensemble classification techniques for improved financial distress prediction. Journal of Business Research, 2021. 130: p. 200-209.
43. Morales, P., et al., The NoiseFiltersR package: label noise preprocessing in R. The R Journal, 2017. 9(1): p. 219-228.
44. Hossin, M. and M.N. Sulaiman, A review on evaluation metrics for data classification evaluations. International Journal of Data Mining & Knowledge Management Process, 2015. 5(2): p. 1.
45. Huang, J. and C.X. Ling, Using AUC and accuracy in evaluating learning algorithms. IEEE Transactions on Knowledge and Data Engineering, 2005. 17(3): p. 299-310.
46. Fawcett, T., An introduction to ROC analysis. Pattern Recognition Letters, 2006. 27(8): p. 861-874.