References
[1] A. Arnt and S. Zilberstein, “Learning Policies for Sequential Time and Cost Sensitive Classification”, Proceedings of the 1st international workshop on Utility-based data mining, pp. 39-45, 2005.
[2] L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and Regression Trees, Wadsworth, Belmont, CA, 1984.
[3] X. Chai, L. Deng, Q. Yang, and C. X. Ling, “Test-Cost Sensitive Naïve Bayesian Classification”, Proceedings of the 2004 IEEE International Conference on Data Mining, 2004.
[4] P. Chan and S. Stolfo, “Toward Scalable Learning with Non-Uniform Class and Cost Distributions”, Proceedings of the 4th International Conference on Knowledge Discovery and Data Mining, pp. 164-168, New York, 1998.
[5] P. Domingos, “MetaCost: A General Method for Making Classifiers Cost-Sensitive”, Proceedings of the Fifth International Conference on Knowledge Discovery and Data Mining, pp. 155-164, 1999.
[6] C. Elkan, “The Foundations of Cost-Sensitive Learning”, Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence, pp. 973-978, Seattle, 2001.
[7] J. Gehrke, R. Ramakrishnan, and V. Ganti, “RainForest: A framework for fast decision tree construction of large datasets”, Proceedings of the 1998 International Conference on Very Large Data Bases, pp. 416-427, 1998.
[8] J. Gehrke, V. Ganti, R. Ramakrishnan, and W. Y. Loh, “BOAT-optimistic decision tree construction”, Proceedings of the 1999 ACM-SIGMOD International Conference on Management of Data, pp. 311-323, 1999.
[9] J. Han and M. Kamber, Data Mining: Concepts and Techniques, Morgan Kaufmann, 2006.
[10] K. M. Ting, “Inducing cost-sensitive trees via instance weighting”, Principles of Data Mining and Knowledge Discovery, Second European Symposium, pp. 23-26, Springer-Verlag, 1998.
[11] X. B. Li, “A scalable decision tree system and its application in pattern recognition and intrusion detection”, Decision Support Systems, 41, pp. 112-130, 2005.
[12] C.X. Ling, Q. Yang, J. Wang, and S. Zhang, “Decision Trees with Minimal Costs”, Proceedings of 2004 International Conference on Machine Learning, pp. 69, 2004.
[13] C. X. Ling, V. S. Sheng, and Q. Yang, “Test Strategies for Cost-Sensitive Decision Trees”, IEEE Transactions on Knowledge and Data Engineering, 18(8), pp. 1055-1067, 2006.
[14] C. X. Ling and V. S. Sheng, “Cost-Sensitive Learning and the Class Imbalance Problem”, Encyclopedia of Machine Learning, Springer, 2008.
[15] W. Y. Loh and Y. S. Shih, “Split selection methods for classification trees”, Statistica Sinica, 7(4), pp. 815-840, 1997.
[16] M. Mehta, J. Rissanen, and R. Agrawal, “MDL-based decision tree pruning”, Proceedings of the 1995 International Conference on Knowledge Discovery and Data Mining (KDD’95), pp. 216-221, 1995.
[17] M. Mehta, R. Agrawal, and J. Rissanen, “SLIQ: A fast scalable classifier for data mining”, Proceedings of the 5th International Conference on Extending Database Technology, 1996.
[18] A. Ni, X. Zhu, and C. Zhang, “Any-Cost Discovery: Learning Optimal Classification Rules”, Australian Joint Conference on Artificial Intelligence, 3809, pp. 123-132, Sydney, Australia, 2005.
[19] S. W. Norton, “Generating Better Decision Trees”, Proceedings of the Eleventh International Joint Conference on Artificial Intelligence, pp. 800-805, Detroit, Michigan, 1989.
[20] M. Núñez, “The Use of Background Knowledge in Decision Tree Induction”, Machine Learning, 6, pp. 231-250, 1991.
[21] A. Papagelis and D. Kalles, “GATree: Genetically Evolved Decision Trees”, 12th IEEE International Conference on Tools with Artificial Intelligence (ICTAI’00), pp. 203-206, 2000.
[22] F. Provost, T. Fawcett, and R. Kohavi, “The Case Against Accuracy Estimation for Comparing Induction Algorithms”, Proceedings of the 15th International Conference on Machine Learning, pp. 445-453, Madison, WI, 1998.
[23] Z. Qin, S. Zhang, and C. Zhang, “Cost-Sensitive Decision Trees with Multiple Cost Scales”, Australian Joint Conference on Artificial Intelligence, 3339, pp. 380-390, Cairns, Australia, 2004.
[24] J. R. Quinlan, “Induction of decision trees”, Machine Learning, 1, pp. 81-106, 1986.
[25] J. R. Quinlan, “Simplifying decision trees”, International Journal of Man-Machine Studies, 27, pp. 221-234, 1987.
[26] J. R. Quinlan, C4.5: Programs for Machine Learning, San Mateo, Morgan Kaufmann, 1993.
[27] R. Rastogi and K. Shim, “Public: A decision tree classifier that integrates building and pruning”, Proceedings of the 1998 International Conference on Very Large Data Bases, pp. 404-415, 1998.
[28] J. Shafer, R. Agrawal, and M. Mehta. “SPRINT: A scalable parallel classifier for data mining”, Proceedings of 1996 International Conference on Very Large Data Bases, pp. 544-555, 1996.
[29] V. S. Sheng and C. X. Ling, “Feature Value Acquisition in Testing: A Sequential Batch Test Algorithm”, Proceedings of the 23rd International Conference on Machine Learning, pp. 809-816, 2006.
[30] M. Tan and J. Schlimmer, “Cost-Sensitive Concept Learning of Sensor Use in Approach and Recognition”, Proceedings of the Sixth International Workshop on Machine Learning, pp. 392-395, Ithaca, New York, 1989.
[31] M. Tan, “Cost-Sensitive Learning of Classification Knowledge and Its Applications in Robotics”, Machine Learning, 13, pp. 7-33, 1993.
[32] K. M. Ting, “An Instance-Weighting Method to Induce Cost-Sensitive Trees”, IEEE Transactions on Knowledge and Data Engineering, 14(3), pp. 659-665, 2002.
[33] P. D. Turney, “Cost-Sensitive Classification: Empirical Evaluation of A Hybrid Genetic Decision Tree Induction Algorithm”, Journal of Artificial Intelligence Research, 2, pp. 369-409, 1995.
[34] P. D. Turney, “Types of Cost in Inductive Concept Learning”, Workshop on Cost-Sensitive Learning at the Seventeenth International Conference on Machine Learning, pp. 15-21, 2000.
[35] Q. Yang, C. Ling, X. Chai, and R. Pan, “Test-Cost Sensitive Classification on Data with Missing Values”, IEEE Transactions on Knowledge and Data Engineering, 18(5), pp. 626-638, 2006.
[36] S. Zhang, Z. Qin, C. X. Ling, and S. Sheng, “‘Missing is Useful’: Missing Values in Cost-Sensitive Decision Trees”, IEEE Transactions on Knowledge and Data Engineering, 17(12), pp. 1689-1693, 2005.
[37] S. Zhang, X. Zhu, J. Zhang, and C. Zhang, “Cost-Time Sensitive Decision Tree with Missing Values”, KSEM 2007, pp. 447-459, 2007.
[38] H. Zhao, “A Multi-Objective Genetic Programming Approach to Developing Pareto Optimal Decision Trees”, Decision Support Systems, 43(3), pp. 809-826, 2007.
[39] H. Zhao, “Instance Weighting Versus Threshold Adjusting for Cost-Sensitive Classification”, Knowledge and Information Systems, 15(3), pp. 321-334, 2008.