References
[1] P.N. Tan, M. Steinbach and V. Kumar, “Introduction to data mining,” Addison-Wesley, 2006.
[2] L. Jiang, Z. Cai, D. Wang and S. Jiang, “Survey of improving k-nearest-neighbor for classification,” Fuzzy Systems and Knowledge Discovery, vol. 1, pp. 679-683, August 2007.
[3] V. Vapnik, “The nature of statistical learning theory,” Springer, 2000.
[4] Y. Freund and R.E. Schapire, “A decision-theoretic generalization of on-line learning and an application to Boosting,” Journal of Computer and System Sciences, vol. 55, no. 1, pp. 119–139, August 1997.
[5] T.K. Ho, “Random decision forests,” Proceedings of the Third International Conference on Document Analysis and Recognition, vol. 1, pp. 278-282, August 1995.
[6] R. Karchin, K. Karplus and D. Haussler, “Classifying G-protein coupled receptors with support vector machines,” Bioinformatics, vol. 18, no. 1, pp. 147-159, January 2002.
[7] S.M. Odeh, “Using an adaptive neuro-fuzzy inference system (ANFIS) algorithm for automatic diagnosis of skin cancer,” Journal of Communication and Computer, vol. 8, no. 9, pp. 751-755, September 2011.
[8] H. Mamitsuka, “Selecting features in microarray classification using ROC curves,” Pattern Recognition, vol. 39, no. 12, pp. 2393-2404, December 2006.
[9] J. Huang, X. Shao and H. Wechsler, “Face pose discrimination using support vector machines (SVM),” Proceedings of the 14th International Conference on Pattern Recognition, vol. 1, pp. 154-156, 1998.
[10] H. Peng, F. Long and C. Ding, “Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 8, pp. 1226-1238, August 2005.
[11] C. Ding and H. Peng, “Minimum redundancy feature selection from microarray gene expression data,” Journal of Bioinformatics and Computational Biology, vol. 3, no. 2, pp. 185-205, April 2005.
[12] A.L. Blum and P. Langley, “Selection of relevant features and examples in machine learning,” Artificial Intelligence, vol. 97, no. 1-2, pp. 245–271, December 1997.
[13] E.P. Xing, M.I. Jordan and R.M. Karp. “Feature selection for high-dimensional genomic microarray data,” International Conference on Machine Learning, vol. 1, pp. 601-608, 2001.
[14] J. Jäger, R. Sengupta and W.L. Ruzzo, “Improved gene selection for classification of microarrays,” Pacific Symposium on Biocomputing, vol. 8, pp. 53-64, January 2003.
[15] N. Kwak and C.H. Choi, “Input feature selection by mutual information based on Parzen window,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 12, pp. 1667-1671, December 2002.
[16] E. Youn, L. Koenig, M.K. Jeong and S.H. Baek, “Support vector-based feature selection using Fisher’s linear discriminant and support vector machine,” Expert Systems with Applications, vol. 37, no. 9, pp. 6147-6156, September 2010.
[17] J.S.R. Jang, C.T. Sun and E. Mizutani, “Neuro-fuzzy and soft computing: a computational approach to learning and machine intelligence,” Prentice Hall, 1997.
[18] D. Moses, O. Degani, H.N. Teodorescu, M. Friedman and A. Kandel, “Linguistic coordinate transformations for complex fuzzy sets,” Fuzzy Systems Conference Proceedings, vol. 3, pp. 1340-1345, August 1999.
[19] D. Ramot and M. Friedman, “Complex fuzzy sets,” IEEE Transactions on Fuzzy Systems, vol. 10, no. 2, pp. 171-186, April 2002.
[20] D. Ramot, M. Friedman, G. Langholz and A. Kandel, “Complex fuzzy logic,” IEEE Transactions on Fuzzy Systems, vol. 11, no. 4, pp. 450–461, August 2003.
[21] H. Ishibuchi, T. Nakashima and T. Morisawa, “Voting in fuzzy rule-based systems for pattern classification problems,” Fuzzy Sets and Systems, vol. 103, no. 2, pp. 223-238, April 1999.
[22] H. Ishibuchi, T. Nakashima and T. Murata, “A fuzzy classifier system that generates fuzzy if-then rules for pattern classification problems,” Proceedings of the IEEE International Conference on Evolutionary Computation, vol. 2, pp. 759-764, 1995.
[23] O. Cordon, M.J. del Jesus and F. Herrera, “A proposal on reasoning methods in fuzzy rule-based classification systems,” International Journal of Approximate Reasoning, vol. 20, no. 1, pp. 21-45, January 1999.
[24] C. Li and T.W. Chiang, “Complex fuzzy computing to time series prediction: a multi-swarm PSO learning approach,” Intelligent Information and Database Systems, Springer Berlin Heidelberg, vol. 6592, pp. 242-251, April 2011.
[25] C. Li and T.W. Chiang, “Complex neuro-fuzzy ARIMA forecasting: a new approach using complex fuzzy sets,” IEEE Transactions on Fuzzy Systems, vol. 21, no. 3, pp. 567-584, June 2013.
[26] C. Li and T.W. Chiang, “Complex neuro-fuzzy self-learning approach to function approximation,” Intelligent Information and Database Systems, Lecture Notes in Computer Science, vol. 5991, pp. 289-299, March 2010.
[27] C. Li and F.T. Chan, “Knowledge discovery by an intelligent approach using complex fuzzy sets,” Intelligent Information and Database Systems Lecture Notes in Computer Science, vol. 7196, pp. 320-329, March 2012.
[28] W.A. Farag, V.H. Quintana and G. Lambert-Torres, “A genetic-based neuro-fuzzy approach for modeling and control of dynamical systems,” IEEE Transactions on Neural Networks, vol. 9, no. 5, pp. 756-767, September 1998.
[29] M. Hall, “Correlation-based feature selection for machine learning,” PhD thesis, The University of Waikato, 1999.
[30] T.M. Cover and J.A. Thomas, “Entropy, relative entropy and mutual information,” Elements of Information Theory, pp. 12-49, 1991.
[31] G.H. John and P. Langley, “Estimating continuous distributions in Bayesian classifiers,” Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence (UAI ’95), pp. 338-345, 1995.
[32] N. Kwak and C.H. Choi, “Input feature selection for classification problems,” IEEE Transactions on Neural Networks, vol.13, no. 1, pp. 143-159, January 2002.
[33] R. Battiti, “Using mutual information for selecting features in supervised neural net learning,” IEEE Transactions on Neural Networks, vol. 5, no. 4, pp. 537-550, July 1994.
[34] A.M. Fraser and H.L. Swinney, “Independent coordinates for strange attractors from mutual information,” Physical review A, vol. 33, no. 2, pp. 1134, February 1986.
[35] C. Ding and H. Peng, “Minimum redundancy feature selection from microarray gene expression data,” Journal of Bioinformatics and Computational Biology, vol. 3, no. 2, pp. 185-205, April 2005.
[36] T.M. Cover, “The best two independent measurements are not the two best,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 4, no. 1, pp. 116-117, January 1974.
[37] A. Jain and D. Zongker, “Feature selection: Evaluation, application, and small sample performance,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 2, pp. 153-158, February 1997.
[38] H. Peng, F. Long and C. Ding, “Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 8, pp. 1226-1238, August 2005.
[39] M.T. Hagan, H.B. Demuth and M. Beale, “Neural Network Design,” PWS Publishing, ISBN 0-534-94332-2, 1996.
[40] S.L. Chiu, “Fuzzy model identification based on cluster estimation,” Journal of Intelligent and Fuzzy Systems, vol. 2, no. 3, pp. 267-278, 1994.
[41] S. Chopra, R. Mitra and V. Kumar, “Reduction of fuzzy rules and membership functions and its application to fuzzy PI and PD type controllers,” International Journal of Control, Automation and Systems, vol. 4, no. 4, pp. 438-447, August 2006.
[42] S.L. Chiu, “Extracting fuzzy rules from data for function approximation and pattern classification,” Fuzzy Information Engineering: A Guided Tour of Applications, John Wiley & Sons, pp. 149–162, 1997.
[43] A. Hinneburg and D.A. Keim, “Optimal grid-clustering: Towards breaking the curse of dimensionality in high-dimensional clustering,” Proceedings of the 25th International Conference on Very Large Databases, pp. 506-517, 1999.
[44] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann and I.H. Witten, “The WEKA Data Mining Software: An Update,” SIGKDD Explorations, vol. 11, no. 1, 2009.
[45] M. Lichman, “UCI Machine Learning Repository,” [http://archive.ics.uci.edu/ml], Irvine, CA: University of California, School of Information and Computer Science, 2013.
[46] J.J. Rodriguez, L.I. Kuncheva, and C.J. Alonso, “Rotation forest: A new classifier ensemble method,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1619-1630, October 2006.
[47] O. Pujol and D. Masip, “Geometry-based ensembles: Toward a structural characterization of the classification boundary,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 6, pp. 1140-1146, June 2009.
[48] B. Cao, D. Shen, J.T. Sun, Q. Yang and Z. Chen, “Feature selection in a kernel space,” Proceedings of the 24th International Conference on Machine Learning, pp. 121-128, 2007.
[49] J. Kennedy and R. Eberhart, “Particle swarm optimization,” Proceedings of IEEE International Conference on Neural Networks IV, pp. 1942-1948, 1995.
[50] C. Li and T. Wu, “Adaptive fuzzy approach to function approximation with PSO and RLSE,” Expert Systems with Applications, vol. 38, no. 10, pp. 13266-13273, September 2011.
[51] C. Li and J.W. Hu, “A new ARIMA-based neuro-fuzzy approach and swarm intelligence for time series forecasting,” Engineering Applications of Artificial Intelligence, vol. 25, no. 2, pp. 295-308, March 2012.
[52] S. Chopra, R. Mitra and V. Kumar, “Reduction of fuzzy rules and membership functions and its application to fuzzy PI and PD type controllers,” International Journal of Control, Automation and Systems, vol. 4, no. 4, pp. 438-447, August 2006.
[53] Y. LeCun, J.S. Denker and S.A. Solla, “Optimal brain damage,” Advances in Neural Information Processing Systems, vol. 2, pp. 598-605, 1990.
[54] D. Koller and M. Sahami, “Toward optimal feature selection,” In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 284–292, 1996.
[55] L.A. Zadeh, “Fuzzy sets,” Information and Control, vol. 8, no. 3, pp. 338-353, June 1965.
[56] J.S.R. Jang, “ANFIS: Adaptive-network-based fuzzy inference system,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, no. 3, pp. 665–685, June 1993.