References
1.Yang, Y., & Pedersen, J. O. (1997). A comparative study on feature selection in text categorization. In Proceedings of the Fourteenth International Conference on Machine Learning (ICML '97) (pp. 412–420). Morgan Kaufmann.
2.Battiti, R. (1994). Using mutual information for selecting features in supervised neural net learning. IEEE Transactions on Neural Networks, 5(4), 537–550.
3.Bong Chih How, & Narayanan, K. (2004). An Empirical Study of Feature Selection for Text Categorization based on Term Weightage. In IEEE/WIC/ACM International Conference on Web Intelligence (WI’04) (pp. 599–602). Beijing, China: IEEE.
4.Estevez, P. A., Tesmer, M., Perez, C. A., & Zurada, J. M. (2009). Normalized Mutual Information Feature Selection. IEEE Transactions on Neural Networks, 20(2), 189–201.
5.Fleuret, F. (2004). Fast binary feature selection with conditional mutual information. Journal of Machine Learning Research, 5, 1531–1555.
6.Forman, G. (2003). An extensive empirical study of feature selection metrics for text classification. Journal of Machine Learning Research, 3, 1289–1305.
7.Ke, S.-W., Lin, W.-C., Tsai, C.-F., & Hu, Y.-H. (2017). Soft estimation by hierarchical classification and regression. Neurocomputing, 234, 27–37.
8.Moon, T. K. (1996). The expectation-maximization algorithm. IEEE Signal Processing Magazine, 13(6), 47–60.
9.Nasser, S., Alkhaldi, R., & Vert, G. (2006). A Modified Fuzzy K-means Clustering using Expectation Maximization. In 2006 IEEE International Conference on Fuzzy Systems (pp. 231–235). Vancouver, BC, Canada: IEEE.
10.Uysal, A. K., & Gunal, S. (2012). A novel probabilistic feature selection method for text classification. Knowledge-Based Systems, 36, 226–235.
11.Vergara, J. R., & Estévez, P. A. (2014). A Review of Feature Selection Methods Based on Mutual Information. Neural Computing and Applications, 24(1), 175–186.
12.Wang, G., & Lochovsky, F. H. (2004). Feature selection with conditional mutual information maximin in text categorization. In Proceedings of the Thirteenth ACM conference on Information and knowledge management - CIKM ’04 (p. 342). Washington, D.C., USA: ACM Press.
13.Wu, G., & Xu, J. (2015). Optimized Approach of Feature Selection Based on Information Gain. In 2015 International Conference on Computer Science and Mechanical Automation (CSMA) (pp. 157–161). Hangzhou, China: IEEE.
14.Xu, R., & Wunsch, D., II. (2005). Survey of Clustering Algorithms. IEEE Transactions on Neural Networks, 16(3), 645–678.
15.Xue, B., Zhang, M., Browne, W. N., & Yao, X. (2016). A Survey on Evolutionary Computation Approaches to Feature Selection. IEEE Transactions on Evolutionary Computation, 20(4), 606–626.
16.Zheng, Z., Wu, X., & Srihari, R. (2004). Feature selection for text categorization on imbalanced data. ACM SIGKDD Explorations Newsletter, 6(1), 80.
17.You, H., & Ryu, T. (2005). Development of a hierarchical estimation method for anthropometric variables. International Journal of Industrial Ergonomics, 35(4), 331–343.
18.Hellier, P., Barillot, C., Memin, E., & Perez, P. (2001). Hierarchical estimation of a dense deformation field for 3-D robust registration. IEEE Transactions on Medical Imaging, 20(5), 388–402.
19.Hamidieh, K. (2018). A data-driven statistical model for predicting the critical temperature of a superconductor. Computational Materials Science, 154, 346–354.
20.Singh, K., Sandhu, R. K., & Kumar, D. (2015). Comment volume prediction using neural networks and decision trees. In 2015 17th UKSim-AMSS International Conference on Modelling and Simulation (UKSim). IEEE.
21.Candanedo, L. M., Feldheim, V., & Deramaix, D. (2017). Data driven prediction models of energy use of appliances in a low-energy house. Energy and Buildings, 140, 81–97.
22.Graf, F., Kriegel, H. P., Pölsterl, S., Schubert, M., & Cavallaro, A. (2011). Position prediction in CT volume scans. In Proceedings of the 28th International Conference on Machine Learning (ICML) Workshop on Learning for Global Challenges. Bellevue, WA, USA.
23.Buza, K. (2014). Feedback prediction for blogs. In Data Analysis, Machine Learning and Knowledge Discovery (pp. 145–152). Springer, Cham.
24.Tsai, C. F. (2009). Feature selection in bankruptcy prediction. Knowledge-Based Systems, 22(2), 120–127.
25.Jin, X., Xu, A., Bie, R., & Guo, P. (2006). Machine learning techniques and chi-square feature selection for cancer classification using SAGE gene expression profiles. In International Workshop on Data Mining for Biomedical Applications (pp. 106–115). Springer, Berlin, Heidelberg.
26.Doquire, G., & Verleysen, M. (2013). Mutual information-based feature selection for multilabel classification. Neurocomputing, 122, 148–155.
27.Lee, M. C. (2009). Using support vector machine with a hybrid feature selection method to the stock trend prediction. Expert Systems with Applications, 36(8), 10896–10904.
28.Yun, S., Na, J., Kang, W. S., & Choi, J. (2008). Hierarchical estimation for adaptive visual tracking. In 2008 19th International Conference on Pattern Recognition (pp. 1–4). IEEE.
29.Strijbosch, L. W. G., & Moors, J. J. A. (2010). Calculating the accuracy of hierarchical estimation. IMA Journal of Management Mathematics, 21(3), 303–315.