References
Aha, D. W. (1992). Tolerating noisy, irrelevant and novel attributes in instance-based learning algorithms. International Journal of Man-Machine Studies, 36(2), 267-287.
Aha, D. W., Kibler, D., & Albert, M. K. (1991). Instance-Based Learning Algorithms. Machine Learning, 6(1), 37-66.
Baker, J. E. (1987). Reducing bias and inefficiency in the selection algorithm. Paper presented at the Proceedings of the second international conference on genetic algorithms.
Batista, G. E. A. P. A., & Monard, M. C. (2002). A Study of K-Nearest Neighbour as an Imputation Method. HIS, 87, 251-260.
Bishop, C. M. (2006). Pattern recognition and machine learning (Vol. 1): Springer New York.
Burges, C. J. (1998). A tutorial on support vector machines for pattern recognition. Data mining and knowledge discovery, 2(2), 121-167.
Chang, C.-C., & Lin, C.-J. (2011). LIBSVM: a library for support vector machines. ACM Transactions on Intelligent Systems and Technology (TIST), 2(3), 27.
Chen, J., & Shao, J. (2000). Nearest neighbor imputation for survey data. Journal of Official Statistics, 16(2), 113-132.
Cortes, C., & Vapnik, V. N. (1995). Support-vector networks. Machine Learning, 20(3), 273-297.
Cover, T., & Hart, P. (1967). Nearest neighbor pattern classification. Information Theory, IEEE Transactions on, 13(1), 21-27.
De Jong, K. A. (1975). Analysis of the behavior of a class of genetic adaptive systems. Doctoral dissertation, University of Michigan.
Dempster, A. P., Laird, N. M., & Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B (Methodological), 39(1), 1-38.
Duda, R. O., Hart, P. E., & Stork, D. G. (2012). Pattern classification: John Wiley & Sons.
Fayyad, U., Piatetsky-Shapiro, G., & Smyth, P. (1996). From data mining to knowledge discovery in databases. AI Magazine, 17(3), 37-54.
Fletcher, R. (2013). Practical methods of optimization: John Wiley & Sons.
Frawley, W. J., Piatetsky-Shapiro, G., & Matheus, C. J. (1992). Knowledge discovery in databases: An overview. AI Magazine, 13(3), 57-70.
Garcia, S., Derrac, J., Cano, J. R., & Herrera, F. (2012). Prototype selection for
nearest neighbor classification: Taxonomy and empirical study. Pattern Analysis
and Machine Intelligence, IEEE Transactions on, 34(3), 417-435.
Gates, G. W. (1972). The reduced nearest neighbor rule. IEEE Transactions on Information Theory, 18(3), 431-433.
Gen, M., & Cheng, R. (2000). Genetic algorithms and engineering optimization (Vol.
7): John Wiley & Sons.
Goldberg, D. E. (1989). Genetic algorithms in search, optimization, and machine learning: Addison-Wesley.
Goldberg, D. E., & Holland, J. H. (1988). Genetic algorithms and machine learning.
Machine Learning, 3(2), 95-99.
Han, J., & Moraga, C. (1995). The influence of the sigmoid function parameters on the speed of backpropagation learning. From Natural to Artificial Neural Computation (pp. 195-201): Springer.
Hart, P. (1968). The condensed nearest neighbor rule. IEEE Transactions on Information Theory, 14(3), 515-516.
Herrera, F., Lozano, M., & Verdegay, J. L. (1998). Tackling real-coded genetic algorithms: Operators and tools for behavioural analysis. Artificial Intelligence Review, 12(4), 265-319.
Holland, J. H. (1992). Adaptation in Natural and Artificial Systems: MIT Press.
Jain, A. K., Duin, R. P. W., & Mao, J. (2000). Statistical pattern recognition: A review. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 22(1),
4-37.
Han, J., & Kamber, M. (2001). Data mining: Concepts and techniques. San Francisco, CA: Morgan Kaufmann.
Kohavi, R. (1995). A study of cross-validation and bootstrap for accuracy estimation
and model selection. Paper presented at the IJCAI.
Kuncheva, L. I., & Sánchez, J. S. (2008). Nearest Neighbour Classifiers for Streaming
Data with Delayed Labelling. Paper presented at the ICDM.
Lakshminarayan, K., Harp, S. A., & Samad, T. (1999). Imputation of Missing Data in
Industrial Databases. Applied Intelligence, 11(3), 259-275.
Lee, H., Rancourt, E., & Särndal, C. E. (1994). Experiments with variance estimation from survey data with imputed values. Journal of Official Statistics, 10, 231-231.
Liepins, G. E., & Vose, M. D. (1992). Characterizing crossover in genetic algorithms.
Annals of Mathematics and Artificial Intelligence, 5(1), 27-34.
Little, R. J., & Rubin, D. B. (2002). Statistical analysis with missing data: John Wiley & Sons.
Little, R. J., & Rubin, D. B. (2014). Statistical analysis with missing data: John Wiley
& Sons.
Mistiaen, J. A., & Ravallion, M. (2003). Survey compliance and the distribution of income. World Bank Policy Research Working Paper (2956).
Mitchell, M. (1998). An introduction to genetic algorithms: MIT press.
Mitchell, T. M. (1997). Machine learning. Burr Ridge, IL: McGraw Hill, 45.
Nocedal, J., & Wright, S. J. (1999). Numerical optimization: Springer.
Olvera-López, J. A., Carrasco-Ochoa, J. A., Martínez-Trinidad, J. F., & Kittler, J. (2010). A review of instance selection methods. Artificial Intelligence Review, 34(2), 133-143.
Pal, N. R., & Jain, L. E. (2005). Advanced techniques in data mining and knowledge
discovery. London: Springer.
Patcha, A., & Park, J.-M. (2007). An overview of anomaly detection techniques:
Existing solutions and latest technological trends. Computer Networks, 51(12),
3448-3470.
Pyle, D. (1999). Data preparation for data mining (Vol. 1): Morgan Kaufmann.
Quinlan, J. R. (1987). Generating Production Rules from Decision Trees. Paper presented at the IJCAI.
Ritter, G., Woodruff, H., Lowry, S., & Isenhour, T. (1975). An algorithm for a selective nearest neighbor decision rule. IEEE Transactions on Information Theory, 21(6), 665-669.
Rubin, D. B. (1976). Inference and missing data. Biometrika, 63(3), 581-592.
Rubin, D. B. (1996). Multiple imputation after 18+ years. Journal of the American
Statistical Association, 91(434), 473-489.
Rubin, D. B. (2004). Multiple imputation for nonresponse in surveys (Vol. 81): John
Wiley & Sons.
Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1985). Learning internal
representations by error propagation: DTIC Document.
Schafer, J. L. (1997). Analysis of incomplete multivariate data: CRC press.
Schafer, J. L., & Graham, J. W. (2002). Missing data: our view of the state of the art.
Psychological methods, 7(2), 147.
Suykens, J. A. K., & Vandewalle, J. (1999). Least squares support vector machine classifiers. Neural Processing Letters, 9(3), 293-300.
Syswerda, G. (1989). Uniform crossover in genetic algorithms. Paper presented at the
Proceedings of the 3rd International Conference on Genetic Algorithms.
Tanner, M. A., & Wong, W. H. (1987). The calculation of posterior distributions by data augmentation. Journal of the American Statistical Association, 82(398),
528-540.
Tomek, I. (1976). An experiment with the edited nearest-neighbor rule. Systems, Man
and Cybernetics, IEEE Transactions on, 6(6), 448-452.
Tsai, C.-F., & Chen, Z.-Y. (2014). Towards high dimensional instance selection: An
evolutionary approach. Decision Support Systems, 61, 79-92.
Tsai, C.-F., Eberle, W., & Chu, C.-Y. (2013). Genetic algorithms in feature and
instance selection. Knowledge-Based Systems, 39, 240-247.
University of California, Irvine. (2015). UCI Machine Learning Repository. Retrieved May 19, 2015, from http://archive.ics.uci.edu/ml/
Vázquez, F., Sánchez, J. S., & Pla, F. (2005). A stochastic approach to Wilson's editing algorithm. Pattern Recognition and Image Analysis (pp. 35-42): Springer.
Vapnik, V. N. (1998a). Statistical learning theory (Vol. 1): Wiley New York.
Vapnik, V. N. (1998b). The support vector method of function estimation. Nonlinear Modeling (pp. 55-85): Springer.
Vapnik, V. N. (1999). An overview of statistical learning theory. Neural Networks, IEEE Transactions on, 10(5), 988-999.
Vapnik, V. N. (2000). The nature of statistical learning theory: Springer Science & Business Media.
Werbos, P. J. (1989). Backpropagation and neurocontrol: A review and prospectus. Paper presented at the Neural Networks, 1989. IJCNN., International Joint Conference on.
Wilson, D. L. (1972). Asymptotic properties of nearest neighbor rules using edited data. Systems, Man and Cybernetics, IEEE Transactions on, 2(3), 408-421.
Wilson, D. R., & Martinez, T. R. (2000). Reduction Techniques for Instance-Based Learning Algorithms. Machine Learning, 38(3), 257-286.
Witten, I. H., & Frank, E. (2005). Data Mining: Practical machine learning tools and techniques: Morgan Kaufmann.
Wu, X., Kumar, V., Quinlan, J. R., Ghosh, J., Yang, Q., Motoda, H., . . . Philip, S. Y. (2008). Top 10 algorithms in data mining. Knowledge and Information Systems, 14(1), 1-37.
Wu, X., Zhu, X., Wu, G.-Q., & Ding, W. (2014). Data mining with big data. Knowledge and Data Engineering, IEEE Transactions on, 26(1), 97-107.