References
[1]S. B. Kotsiantis, “Supervised machine learning: a review of classification techniques,” Informatica, vol. 31, no. 3, pp. 249-268, 2007.
[2]G. P. Zhang, “Neural networks for classification: a survey,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 30, no. 4, pp. 451-462, Nov. 2000.
[3]R. Agrawal, T. Imielinski, and A. Swami, “Database mining: a performance perspective,” IEEE Transactions on Knowledge and Data Engineering, vol. 5, no. 6, pp. 914-925, 1993.
[4]J. Han and M. Kamber, Data Mining: Concepts and Techniques, Morgan Kaufmann, San Francisco, 2001.
[5]C. Zhou, W. Xiao, T. M. Tirpak, and P. C. Nelson, “Evolving accurate and compact classification rules with gene expression programming,” IEEE Transactions on Evolutionary Computation, vol. 7, no. 6, pp. 519-531, 2003.
[6]W. H. Au, K. C. C. Chan, and X. Yao, “A novel evolutionary data mining algorithm with applications to churn prediction,” IEEE Transactions on Evolutionary Computation, vol. 7, no.6, pp. 532-545, 2003.
[7]J. A. Abutridy, C. Mellish, and S. Aitken, “A semantically guided and domain-independent evolutionary model for knowledge discovery from texts,” IEEE Transactions on Evolutionary Computation, vol. 7, no.6, pp. 546-560, 2003.
[8]J. R. Cano, F. Herrera, and M. Lozano, “Using evolutionary algorithms as instance selection for data reduction in KDD: an experimental study,” IEEE Transactions on Evolutionary Computation, vol. 7, no. 6, pp. 561-575, 2003.
[9]A. Lorenz, M. Blum, H. Ermert, and T. Senge, “Comparison of different neuro-fuzzy classification systems for the detection of prostate cancer in ultrasonic images,” in Proceedings of the IEEE Ultrasonics Symposium, vol. 2, pp. 1201-1204, 1997.
[10]S. M. Odeh, “Using an adaptive neuro-fuzzy inference system (ANFIS) algorithm for automatic diagnosis of skin cancer,” in European, Mediterranean & Middle Eastern Conference on Information Systems, 2010.
[11]A. Das and M. Bhattacharya, “A study on prognosis of brain tumors using fuzzy logic and genetic algorithm based techniques,” in 2009 International Joint Conference on Bioinformatics, Systems Biology and Intelligent Computing, pp. 348-351, 2009.
[12]A. Das and M. Bhattacharya, “GA based neuro fuzzy techniques for breast cancer identification,” in International Machine Vision and Image Processing Conference, pp. 136-141, 2008.
[13]C. S. Leslie, E. Eskin, A. Cohen, J. Weston, and W. S. Noble, “Mismatch string kernels for discriminative protein classification,” Bioinformatics, vol. 20, no. 4, pp. 467-476, 2004.
[14]C. Leslie, E. Eskin, and W. Noble, “The spectrum kernel: a string kernel for SVM protein classification,” in Proceedings of the Pacific Symposium on Biocomputing, vol. 7, pp. 566-575, 2002.
[15]K. Tsuda, H. Shin, and B. Schölkopf, “Fast protein classification with multiple networks,” Bioinformatics, vol. 21, suppl. 2, pp. ii59-ii65, 2005.
[16]R. Karchin, K. Karplus, and D. Haussler, “Classifying G-protein coupled receptors with support vector machines,” Bioinformatics, vol. 18, no. 1, pp. 147-159, 2002.
[17]K. Schierholt and C. H. Dagli, “Stock market prediction using different neural network classification architectures,” in Proceedings of the IEEE/IAFE 1996 Conference on Computational Intelligence for Financial Engineering, pp. 72-78, 1996.
[18]A. U. Khan, T. K. Bandopadhyaya, and S. Sharma, “Classification of stocks using self-organizing map,” International Journal of Soft Computing Applications, vol. 4, pp. 19-24, 2009.
[19]C. J. Huang, D. X. Yang, and Y. T. Chuang, “Application of wrapper approach and composite classifier to the stock trend prediction,” Expert Systems with Applications, vol. 34, no. 4, pp. 2870-2878, 2008.
[20]D. Chen, H. Bourlard, and J.-Ph. Thiran, “Text identification in complex background using SVM,” in International Conference on Computer Vision and Pattern Recognition, pp. 621–626, 2001.
[21]M. Salehpour and A. Behrad, “Cluster Based Weighted SVM for the Recognition of Farsi Handwritten Digits,” in 10th Symposium on Neural Network Applications in Electrical Engineering, pp. 219-223, 2010.
[22]B. Zhu, X. D. Zhou, C. L. Liu, and M. Nakagawa, “A robust model for on-line handwritten Japanese text recognition,” International Journal on Document Analysis and Recognition, vol. 13, pp. 121-131, 2010.
[23]Y. Lee, H. Song, U. Yang, H. Shin, and K. Sohn, “Local feature based 3D face recognition,” in 2005 International Conference on Audio- and Video-based Biometric Person Authentication, LNCS, vol. 3546, pp. 909–918, 2005.
[24]J. Huang, X. Shao, and H. Wechsler, “Face Pose Discrimination Using Support Vector Machines (SVM),” in Proceedings of the 14th International Conference on Pattern Recognition, 1998.
[25]M. S. Bartlett, G. Littlewort, C. Lainscsek, I. Fasel, and J. R. Movellan, “Machine learning methods for fully automatic recognition of facial expressions and facial actions,” in Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 592–597, 2004.
[26]V. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 1995.
[27]P. N. Tan, M. Steinbach, and V. Kumar, Introduction to data mining, Addison-Wesley, 2006.
[28]L. Jiang, Z. Cai, D. Wang, and S. Jiang, “Survey of improving k-nearest-neighbor for classification,” in Proceedings of the Fourth International Conference on Fuzzy Systems and Knowledge Discovery, vol. 1, pp. 679-683, 2007.
[29]J. R. Quinlan, “Induction of decision trees,” Machine Learning, vol. 1, pp. 81-106, 1986.
[30]P. Domingos and M. Pazzani, “On the optimality of the simple Bayesian classifier under zero-one loss,” Machine Learning, vol. 29, pp. 103–137, 1997.
[31]R. M. Balabin, R. Z. Safieva, and E. I. Lomakina, “Gasoline classification using near infrared (NIR) spectroscopy data: Comparison of multivariate techniques,” Analytica Chimica Acta, vol. 671, pp. 27–35, 2010.
[32]M. A. Acevedo, C. J. Corrada-Bravo, H. Corrada-Bravo, L. J. Villanueva-Rivera, and T. M. Aide, “Automated classification of bird and amphibian calls using machine learning: A comparison of methods,” Ecological Informatics, vol. 4, pp. 206-214, 2009.
[33]L. Ma, M. M. Crawford, and J. Tian, “Local manifold learning-based k-nearest-neighbor for hyperspectral image classification,” IEEE Transactions on Geoscience and Remote Sensing, vol. 48, no. 11, pp. 4099–4109, 2010.
[34]J. R. Quinlan, C4.5: Programs for Machine Learning, Morgan Kaufmann, 1993.
[35]D. Steinberg and P. Colla, CART: Classification and Regression Trees, Salford Systems, San Diego, CA, 1997.
[36]H. Liu, F. Hussain, C. L. Tan, and M. Dash, “Discretization: An enabling technique,” Data Mining and Knowledge Discovery, vol. 6, no. 4, pp. 393-423, 2002.
[37]G. H. John and P. Langley, “Estimating continuous distributions in Bayesian classifiers,” in Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, Morgan Kaufmann Publishers, San Mateo, pp. 338–345, 1995.
[38]O. C. Hamsici and A. M. Martinez, “Spherical-homoscedastic distributions: the equivalency of spherical and Normal distributions in classification,” Journal of Machine Learning Research, vol. 8, pp. 1583-1623, 2007.
[39]L. Jiang, H. Zhang, and Z. Cai, “A Novel Bayes Model: Hidden Naive Bayes,” IEEE Transactions on Knowledge and Data Engineering, vol. 21, no. 10, pp. 1361-1371, 2009.
[40]O. Pujol and D. Masip, “Geometry-based ensembles: Toward a structural characterization of the classification boundary,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 6, pp. 1140-1146, 2009.
[41]C. M. Bishop and M. E. Tipping, “Variational relevance vector machines,” in Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence, pp. 46-53, 2000.
[42]R. S. Parpinelli, H. S. Lopes, and A. A. Freitas, “Data mining with an ant colony optimization algorithm,” IEEE Transactions on Evolutionary Computation, vol. 6, no. 4, pp. 321–332, 2002.
[43]J. Bacardit and J. M. Garrell, “Bloat control and generalization pressure using the minimum description length principle for a Pittsburgh approach learning classifier system,” in Proceedings of the 6th International Workshop on Learning Classifier Systems, Lecture Notes in Artificial Intelligence, Springer, Berlin, 2003.
[44]H. Su, Y. Yang, and L. Zhao, “Classification rule discovery with DE/QDE algorithm,” Expert Systems with Applications, vol. 37, no. 2, pp. 1216-1222, 2010.
[45]T. C. Lin and C. S. Lee, “Neural network based fuzzy logic control and decision system,” IEEE Transactions on Computers, vol. 40, no. 12, pp. 1320-1336, 1991.
[46]J.-S. R. Jang, “ANFIS: adaptive-network-based fuzzy inference system,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, no. 3, pp. 665-685, 1993.
[47]M. L. Huang, H. Y. Chen, and J. J. Huang, “Glaucoma detection using adaptive neuro-fuzzy inference system,” Expert Systems with Applications, vol. 32, no. 2, pp. 458-468, 2007.
[48]J.-S. Wang and C. S. G. Lee, “Self-Adaptive Neuro-Fuzzy Inference Systems for Classification Applications,” IEEE Transactions on Fuzzy Systems, vol. 10, no. 6, pp. 790-802, 2002.
[49]R. Nowicki, “On Combining Neuro-Fuzzy Architectures with the Rough Set Theory to Solve Classification Problems with Incomplete Data,” IEEE Transactions on Knowledge and Data Engineering, vol. 20, no. 9, pp. 1239-1253, 2008.
[50]L. Kaki, M. Teshnelab, and M. A. Shooredeli, “Classification of Multi-Class Datasets Using 2D Membership Functions in TSK Fuzzy System,” International Journal of Advancements in Computing Technology, vol. 2, no. 1, pp. 33-40, 2010.
[51]H. Mohamadi, J. Habibi, M. S. Abadeh, and H. Saadi, “Data mining with a simulated annealing based fuzzy classification system,” Pattern Recognition, vol. 41, no. 5, pp. 1824-1833, 2008.
[52]T. G. Dietterich, “Ensemble methods in machine learning,” Multiple Classifier Systems, vol. 1857, pp. 1-15, 2000.
[53]L. Breiman, “Bagging predictors,” Machine Learning, vol. 24, pp. 123-140, 1996.
[54]R. E. Schapire, “The boosting approach to machine learning: An overview,” in Proceedings of the MSRI Workshop on Nonlinear Estimation and Classification, pp. 149-172, 2003.
[55]Y. Freund and R. E. Schapire, “Experiments with a new boosting algorithm,” in Machine Learning: Proceedings of the Thirteenth International Conference, pp. 148-156, 1996.
[56]R. E. Schapire, Y. Freund, P. Bartlett, and W. S. Lee, “Boosting the margin: A new explanation for the effectiveness of voting methods,” The Annals of Statistics, vol. 26, no. 5, pp. 1651-1686, 1998.
[57]Y. Freund and R. E. Schapire, “A decision-theoretic generalization of on-line learning and an application to boosting,” Journal of Computer and System Sciences, vol. 55, no. 1, pp. 119-139, 1997.
[58]S.-J. Wang, A. Mathew, Y. Chen, L.-F. Xi, L. Ma, and J. Lee, “Empirical analysis of support vector machine ensemble classifiers,” Expert Systems with Applications, vol. 36, no. 3, pp. 6466-6476, 2009.
[59]D. Opitz and R. Maclin, “Popular ensemble methods: an empirical study,” Journal of Artificial Intelligence Research, vol. 11, pp. 169-198, 1999.
[60]J. Friedman, T. Hastie, and R. Tibshirani, “Additive logistic regression: a statistical view of boosting,” The Annals of Statistics, vol. 28, pp. 337-407, 2000.
[61]J. J. Rodríguez, L. I. Kuncheva, and C. J. Alonso, “Rotation Forest: A New Classifier Ensemble Method,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1619-1630, 2006.
[62]H. Hotelling, “Analysis of a complex of statistical variables into principal components,” Journal of Educational Psychology, vol. 24, pp. 417-441, 1933.
[63]L. I. Smith, A tutorial on principal component analysis, 2002. Available: http://www.sccg.sk/~haladova/principal_components.pdf. Accessed May 17, 2012.
[64]A. Hyvärinen, “Survey on independent component analysis,” Neural Computing Surveys, vol. 2, no. 4, pp. 94-128, 1999.
[65]K. V. Mardia, J. T. Kent, and J. M. Bibby, Multivariate analysis, Academic Press, Padstow, Cornwall, 1995.
[66]S. T. Roweis and L. K. Saul, “Nonlinear dimensionality reduction by locally linear embedding,” Science, vol. 290, no. 5500, pp. 2323-2326, 2000.
[67]M. Belkin and P. Niyogi, “Laplacian eigenmaps and spectral techniques for embedding and clustering,” Advances in Neural Information Processing Systems, vol. 14, pp. 585-591, 2001.
[68]J. B. Tenenbaum, V. De Silva, and J. C. Langford, “A global geometric framework for nonlinear dimensionality reduction,” Science, vol. 290, no. 5500, pp. 2319-2323, 2000.
[69]H. J. Sun, S. R. Wang, and Q. S. Jiang, “FCM-based model selection algorithms for determining the number of clusters,” Pattern Recognition, vol. 37, no. 10, pp. 2027-2037, 2004.
[70]C.-S. Li and T.-H. Wu, “Adaptive fuzzy approach to function approximation with PSO and RLSE,” Expert Systems with Applications, vol. 38, no. 10, pp. 13266-13273, 2011.
[71]J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, 1995.
[72]J.-S. R. Jang, C.-T. Sun, and E. Mizutani, Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence, Prentice-Hall, Upper Saddle River, NJ, 1997.
[73]P. M. Murphy and D. W. Aha, UCI Repository of Machine Learning Databases, University of California, Department of Information and Computer Science, 1994. Available: http://archive.ics.uci.edu/ml/datasets.html. Accessed June 14, 2012.
[74]K. Pearson, “On lines and planes of closest fit to systems of points in space,” Philosophical Magazine, vol. 2, no. 6, pp. 559–572, 1901.
[75]I. T. Jolliffe, Principal component analysis, 2nd Edition, Springer, New York, 2002.
[76]D. Nauck and R. Kruse, “Neuro-fuzzy systems for function approximation,” Fuzzy Sets and Systems, vol. 101, no. 2, pp. 261–271, 1999.
[77]Z.-H. Xiu and G. Ren, “Stability analysis and systematic design of Takagi-Sugeno fuzzy control systems,” Fuzzy Sets and Systems, vol. 151, no. 1, pp. 119-138, 2005.
[78]R. E. Schapire and Y. Singer, “Improved boosting algorithms using confidence-rated predictions,” Machine Learning, vol. 37, no. 3, pp. 297-336, 1999.
[79]J. Kittler, M. Hatef, R. P. W. Duin, and J. Matas, “On combining classifiers,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 3, pp. 226-239, 1998.
[80]D. Bratton and J. Kennedy, “Defining a standard for particle swarm optimization,” Proceedings of the 2007 IEEE Swarm Intelligence Symposium, pp. 120-127, 2007.
[81]A. Banks, J. Vincent, and C. Anyakoha, “A review of particle swarm optimization. Part I: background and development,” Natural Computing, vol. 6, no. 4, pp. 467-484, 2007.
[82]G. H. John and P. Langley, “Estimating continuous distributions in Bayesian classifiers,” in Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, Morgan Kaufmann Publishers, San Mateo, pp. 338–345, 1995.