Graduate Thesis 994203040: Detailed Record




Author: Shau-en Chiu (邱紹恩)    Department: Information Management
Thesis Title: A Study on Classification Using Neuro-Fuzzy Ensemble Learning (合奏學習式智慧型系統在分類問題之研究)
Related theses
★ A Study on Variable Selection in Intelligent Systems and Its Applications
★ Parameter Estimation for Intelligent Systems: A New DE Method
★ A Study on Complex Fuzzy Neural Systems for Multi-Class Classification
★ Complex Fuzzy Cognitive Maps with Metacognitive Strategies for Classification Problems
★ A Study on Classification Using Complex Neuro-Fuzzy Systems
★ A Study on Intelligent ARIMA Models for Time Series Forecasting
★ A Study on Computational Intelligence and Complex Fuzzy Sets for Adaptive Image Processing
★ An Intelligent Neuro-Fuzzy Computing Model Using Complex Fuzzy Sets and ARIMA Models
★ Empirical Study on IEEE 802.11 Wireless Signal – A Case Study at the NCU Campus
★ A Study on Self-Constructing Complex Fuzzy ARIMA for Index Volatility Forecasting
★ A Study on Data Preprocessing: A Genetic Algorithm Example
★ Support-Vector-Oriented Instance Selection for Text Classification
★ A Study on Intelligent Interval Prediction Using Complex Neuro-Fuzzy Systems, Support Vector Regression, and Bootstrap Statistics
★ Complex Fuzzy Neural Networks for Multi-Objective Financial Forecasting
★ Intelligent Neuro-Fuzzy Computing Using Asymmetric Neuro-Fuzzy Networks and Spherical Complex Fuzzy Sets
★ A Study on Complex Neuro-Fuzzy Systems with Continuous Multi-Ant-Colony Evolution for Time Series Forecasting
Full text: permanently restricted; not available for viewing in the system.
Abstract (Chinese): In data mining and machine learning, classification is an important issue; it is widely applied in finance, medicine, biology, pattern recognition, and other fields. It is essential that a classification model can be built effectively and can correctly predict the classes of unseen samples. This study combines neuro-fuzzy theory with adaptive boosting (AdaBoost) to construct an ensemble classifier built on neuro-fuzzy systems (NFS) and applies it to classification problems. The proposed ensemble classifier is composed of NFS component classifiers. The modeling of the NFS ensemble classifier is divided into a structure-learning phase and a parameter-learning phase. In the structure-learning phase, the FCM-based splitting algorithm (FBSA) automatically determines the optimal structure of each NFS component classifier; in the parameter-learning phase, particle swarm optimization (PSO) is used to tune the premise parameters of each NFS component classifier, while the recursive least-squares estimator (RLSE) is used to tune its consequent parameters. To make the modeling more efficient, this study uses principal component analysis (PCA) to extract the important features, which reduces the classifier's computation time and also improves classification accuracy. Six datasets from the University of California, Irvine (UCI) machine learning repository are used to evaluate the proposed method, and its classification accuracy is compared with that of other well-known methods. The experimental results show that the proposed method achieves better classification accuracy, demonstrating that it performs well.
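To make the ensemble construction concrete, the following is a minimal Python sketch of an AdaBoost.M1-style loop for two-class problems. The names (`Stump`, `adaboost`, `predict`) and the decision stump itself are illustrative assumptions, not the thesis's implementation: in the thesis the weak learner is an NFS component classifier trained with FBSA and PSO-RLSE (sketched after the English abstract). Only the sample-reweighting loop and the confidence-weighted vote reflect the boosting scheme described here.

```python
import numpy as np

class Stump:
    """Decision stump: a stand-in weak learner. Any classifier that
    accepts sample weights (e.g., an NFS component classifier) fits."""
    def fit(self, X, y, w):
        best = (np.inf, 0, 0.0, 1)
        for j in range(X.shape[1]):                  # each feature
            for thr in np.unique(X[:, j]):           # each split point
                for sign in (1, -1):                 # each polarity
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()         # weighted error
                    if err < best[0]:
                        best = (err, j, thr, sign)
        _, self.j, self.thr, self.sign = best
        return self

    def predict(self, X):
        return self.sign * np.where(X[:, self.j] <= self.thr, 1, -1)

def adaboost(X, y, T=10):
    """AdaBoost.M1 for labels in {-1, +1}: reweight the samples so each
    new component classifier focuses on previously misclassified points."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(T):
        h = Stump().fit(X, y, w)
        pred = h.predict(X)
        err = w[pred != y].sum()
        if err >= 0.5:                        # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        ensemble.append((alpha, h))
        if err < 1e-12:                       # perfect fit: nothing to reweight
            break
        w *= np.exp(-alpha * y * pred)        # up-weight the mistakes
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    """Decision mechanism: sign of the confidence-weighted vote."""
    return np.sign(sum(a * h.predict(X) for a, h in ensemble))

# Toy check on two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
print("training accuracy:", (predict(adaboost(X, y), X) == y).mean())
```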
Abstract (English): In data mining and machine learning, classification is an important research issue. Classification has been widely applied in medicine, biology, finance, pattern recognition, and more. It is very important that a classification model can be built effectively and can predict the classes of unseen samples accurately. In this study, we present a neuro-fuzzy-system-based ensemble classifier that applies both the theory of neuro-fuzzy systems (NFS) and the adaptive boosting algorithm to the problem of classification. The proposed ensemble classifier is composed of a set of NFS component classifiers. The modeling of the proposed NFS ensemble classifier comprises a structure-learning phase and a parameter-learning phase. In the structure-learning phase, the FCM-based splitting algorithm (FBSA) is used to determine the number of If-Then rules for each NFS component classifier. In the parameter-learning phase, a hybrid PSO-RLSE method combining particle swarm optimization (PSO) and recursive least-squares estimation (RLSE) is used: PSO adjusts the premise parameters of an NFS component classifier, and RLSE updates its consequent parameters. Moreover, to improve classification performance and reduce computational time, principal component analysis is used to extract important features for the modeling. Six datasets from the University of California, Irvine (UCI) machine learning repository were used to test the proposed approach, and the results are compared with those of other well-known approaches. The experimental results show that the proposed approach performs excellently in classification and outperforms the compared approaches.
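The PSO-RLSE division of labor described in both abstracts can be illustrated on a toy function-approximation task: PSO searches the premise (Gaussian membership) parameters, and for every candidate, RLSE fits the consequent parameters of a small TSK fuzzy model recursively. This is a hedged sketch under illustrative assumptions (two rules, sin(x) data, inertia weight 0.7, acceleration coefficients 1.5), not the thesis's implementation, which additionally uses FBSA structure learning and PCA preprocessing.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy task: approximate y = sin(x) with a two-rule TSK fuzzy model.
X = np.linspace(-3, 3, 60)
y = np.sin(X)

def design_matrix(params, x):
    """Normalized Gaussian firing strengths (premise part) multiplied by
    the TSK regressors [1, x] (consequent part); params = [c1, s1, c2, s2]."""
    c, s = params[0::2], np.abs(params[1::2]) + 1e-6
    w = np.exp(-0.5 * ((x[:, None] - c) / s) ** 2)
    w = w / (w.sum(axis=1, keepdims=True) + 1e-12)
    return np.hstack([w, w * x[:, None]])     # columns: w1, w2, w1*x, w2*x

def rlse(Phi, t, alpha=1e6):
    """Recursive least-squares estimation of the consequent parameters,
    updating the estimate theta one sample at a time."""
    theta, P = np.zeros(Phi.shape[1]), alpha * np.eye(Phi.shape[1])
    for phi, tk in zip(Phi, t):
        phi = phi[:, None]
        P = P - (P @ phi @ phi.T @ P) / (1.0 + (phi.T @ P @ phi).item())
        theta = theta + (P @ phi).ravel() * (tk - phi.ravel() @ theta)
    return theta

def rmse(params):
    """Fitness of a premise-parameter candidate: RLSE fits the consequents,
    then the model error is measured."""
    Phi = design_matrix(params, X)
    return np.sqrt(np.mean((Phi @ rlse(Phi, y) - y) ** 2))

# Standard PSO over the four premise parameters.
pos = rng.uniform(-3, 3, (20, 4))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([rmse(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()
for _ in range(50):
    r1, r2 = rng.random((2, 20, 4))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([rmse(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()
print("best RMSE:", pbest_f.min())
```

Splitting the search this way keeps the PSO dimension small (premise parameters only), since the consequent parameters are linear in the output and can be solved exactly by RLSE for each candidate.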
Keywords (Chinese) ★ Principal component analysis
★ Neuro-fuzzy system
★ Hybrid learning
★ Adaptive boosting
★ Ensemble learning classification
Keywords (English) ★ adaptive boosting
★ hybrid learning
★ neuro-fuzzy system
★ principal component analysis
★ ensemble learning classification
Table of Contents
Abstract (Chinese)
Abstract (English)
Acknowledgments
Table of Contents
List of Figures
List of Tables
Chapter 1  Introduction
  1.1 Research Background and Motivation
  1.2 Problem Statement and Overview of the Approach
  1.3 Organization of the Thesis
Chapter 2  Research Methods
  2.1 Principal Component Analysis
  2.2 Fuzzy Sets
  2.3 Neuro-Fuzzy Systems
  2.4 Adaptive Boosting
  2.5 FCM-Based Splitting Algorithm
  2.6 Hybrid Learning Method
    2.6.1 Particle Swarm Optimization
    2.6.2 Recursive Least-Squares Estimation
Chapter 3  System Design and Architecture
  3.1 Architecture and Design of the NFS Ensemble Classifier
  3.2 Combining the Neuro-Fuzzy System with the Ensemble Learning Algorithm
  3.3 Classification Decision Mechanism
  3.4 Structure of the Ensemble Classifier
  3.5 Structure Learning
  3.6 Parameter Learning
Chapter 4  Experiments
  4.1 Experiment 1: Original Wisconsin Breast Cancer Dataset
  4.2 Experiment 2: Congressional Voting Records Dataset
  4.3 Experiment 3: Australian Credit Approval Dataset
  4.4 Experiment 4: Iris Plants Dataset
  4.5 Experiment 5: Wine Recognition Dataset
  4.6 Experiment 6: Zoo Dataset
Chapter 5  Discussion
Chapter 6  Conclusion and Future Work
  6.1 Conclusion
  6.2 Directions for Future Research
References
參考文獻 [1]S. B. Kotsiantis, “Supervised machine learning: a review of classification techniques,” Informatica, vol.31, no. 3, pp. 249-268, 2007.
[2]G. P. Zhang, “Neural networks for classification: a survey,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 30, no. 4, pp. 451-462, Nov. 2000.
[3]R. Agrawal, T. Imielinski, and A. Swami, “Database mining: a performance perspective,” IEEE Transactions on Knowledge and Data Engineering, vol. 5, no. 6, pp. 914-925, 1993.
[4]H. Jiawei and M. Kamber, Data mining: concepts and techniques, Morgan Kaufmann Press, New York, 2001.
[5]C. Zhou, W. Xiao, T. M. Tirpak, and P. C. Nelson, “Evolving accurate and compact classification rules with gene expression programming,” IEEE Transactions on Evolutionary Computation, vol. 7, no. 6, pp. 519-531, 2003.
[6]W. H. Au, K. C. C. Chan, and X. Yao, “A novel evolutionary data mining algorithm with applications to churn prediction,” IEEE Transactions on Evolutionary Computation, vol. 7, no.6, pp. 532-545, 2003.
[7]J. A. Abutridy, C. Mellish, and S. Aitken, “A semantically guided and domain-independent evolutionary model for knowledge discovery from texts,” IEEE Transactions on Evolutionary Computation, vol. 7, no.6, pp. 546-560, 2003.
[8]J. R. Cano, F. Herrera, and M. Lozano, “Using evolutionary algorithms as instance selection for data reduction in KDD: an experimental study,” IEEE Transactions on Evolutionary Computation, vol. 7, no. 6, pp. 561-575, 2003.
[9]A. Lorenz, M. Blum, H. Ermert, and T. Senge, “Comparison of different neuro-fuzzy classification systems for the detection of prostate cancer in ultrasonic images,” IEEE Proceedings in Ultrasonics Symposium, vol. 2, pp. 1201-1204, 1997.
[10]S. M. Odeh, “Using an adaptive neuro-fuzzy inference system (anfis) algorithm for automatic diagnosis of skin cancer,” in European, Mediterranean & Middle Eastern Conference on Information Systems, 2010.
[11]A. Das and M. Bhattacharya, “A study on prognosis of brain tumors using fuzzy logic and genetic algorithm based techniques,” in 2009 International Joint Conference on Bioinformatics, Systems Biology and Intelligent Computing, pp. 348-351, 2009.
[12]A. Das and M. Bhattacharya, “GA based neuro fuzzy techniques for breast cancer identification,” in International Machine Vision and Image Processing Conference, pp. 136-141, 2008.
[13]C. S. Leslie, E. Eskin, A. Cohen, J. Weston, and W. S. Noble, “Mismatch string kernels for discriminative protein classification,” Bioinformatics, vol. 20, no.4 , pp. 467-476, 2004.
[14]C. Leslie, E. Eskin, and W. Noble, “The spectrum kernel: a string kernel for SVM protein classification,” in Proceedings of the Pacific Symposium on Biocomputing, vol. 7, pp. 566-575, 2002.
[15]K. Tsuda, H. Shin, and B. Scholkopf, “Fast protein classification with multiple networks,” Bioinformatics, vol. 21, no. 2, pp. 59-61, 2005.
[16]R. Karchin, K. Karplus, and D. Haussler, “Classifying G-protein coupled receptors with support vector machines,” Bioinformatics, vol. 18, no. 1, pp.147-159, 2002.
[17]K. Schierholt and C. H. Dagli, “Stock market prediction using different neural network classification architectures,” in Processing IEEE/IAFE 1996 Conference Computing Intelligent Financial Engineering, pp. 72-78, 1996.
[18]A. U. Khan, T. K. Bandopadhyaya, and S. Sharma, “Classification of stocks using self-organizing map,” International Journal of Soft Computing Applications, vol. 4, pp. 19-24, 2009.
[19]C. J. Huang, D. X. Yang, and Y. T. Chuang, “Application of wrapper approach and composite classifier to the stock trend prediction,” Expert Systems with Applications, vol. 34, no. 4, pp. 2870-2878, 2008.
[20]D. Chen, H. Bourlard, and J-Ph. Thiran, “Text identification in complex background using svm,” In International Conference on Computer Vision and Pattern Recognition, pp. 621–626, 2001.
[21]M. Salehpour and A. Behrad, “Cluster Based Weighted SVM for the Recognition of Farsi Handwritten Digits,” in 10th Symposium on Neural Network Applications in Electrical Engineering, pp.219-223, 2010.
[22]B. Zhu, X. D. Zhou, C. L. Liu, and M. Nakagawa, “A robust model for on-line handwritten Japanese text recognition,” International journal on document analysis and recognition, vol. 13, pp. 121-131, 2010.
[23]Y. Lee, H. Song, U. Yang, H. Shin, and K. Sohn. “Local feature based 3d face recognition,” in 2005 International Conference on Audio- and Video-based Biometric Person Authentication, LNCS, vol. 3546, pp. 909–918, 2005.
[24]J. Huang, X. Shao, and H. Wechsler, “Face Pose Discrimination Using Support Vector Machines (SVM),” Proceedings of 14-th International Conference Pattern Recognition, 1998.
[25]M. S. Bartlett, G. Littlewort, C. Lainscsek, I. Fasel, and J. R. Movellan,“Machine learning methods for fully automatic recognition of facial expressions and facial actions,” in Proceedings IEEE International Conference Systems, Man, and Cybernetics, pp. 592–597, 2004.
[26]V. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 1995.
[27]P. N. Tan, M. Steinbach, and V. Kumar, Introduction to data mining, Addison-Wesley, 2006.
[28]L. Jiang, Z. Cai, D. Wang and S. Jiang, “Survey of improving k-nearest-neighbor for classification,” Proceedings of the Fourth International Conference on Fuzzy Systems and Knowledge Discovery, vol. 1, pp. 679-683, 2007.
[29]J. R. Quinlan, “Induction of decision tree,” Machine Learning, vol. 1, pp. 81-106, 1986.
[30]P. Domingos and M. Pazzani “On the optimality of the simple Bayesian classifier under zero-one loss,” Machine Learning, vol. 29, no. 103–137, 1997.
[31]R. M. Balabin, R. Z. Safieva, E. I. Lomakina, “Gasoline classification using near infrared (NIR) spectroscopy data: Comparison of multivariate techniques,” Analytica Chimica Acta, vol. 671, pp. 27–35, 2010.
[32]M. A. Acevedo, C. J. Corrada-Bravo, H. Corrada-Bravo, L. J. Villanueva-Rivera and T. M. Aide, “Automated classification of bird and amphibian calls using machine learning: A comparison of methods,” Ecological Informatics, vol. 4, pp. 206-214, 2009.
[33]L. Ma, M. M. Crawford, and J. Tian, “Local manifold learning-based k-nearest-neighbor for hyperspectral image classification,” IEEE Transactions on Geoscience and Remote Sensing,” vol. 48, no. 11, pp. 4099–4109, 2010.
[34]J.R. Quinlan, C4.5: Programs for machine learning, Morgan Kaufman, 1993.
[35]D. Steinberg, P. Colla, CART: Classification and Regression Trees, Salford Systems, San Diego, CA, 1997.
[36]H. Liu, F. Hussain, C. L. Tan, and M. Dash, “Discretization: An enabling technique,” Data mining and knowledge discovery, vol. 6, no. 4, pp. 393-423, 2002.
[37]G.H. John and P. Langley, “Estimating continuous distributions in Bayesian Classifiers,” in Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, Morgan Kaufmann Publishers, San Mateo, pp. 338–345, 1995.
[38]O. C. Hamsici and A. M. Martinez, “Spherical-homoscedastic distributions: the equivalency of spherical and Normal distributions in classification,” Journal of Machine Learning Research, vol. 8, pp. 1583-1623, 2007.
[39]L. Jiang, H. Zhang, and Z. Cai, “A Novel Bayes Model: Hidden Naive Bayes,” IEEE Transactions on Knowledge and Data Engineering, vol. 21, no. 10, pp. 1361-1371, 2009.
[40]O. Pujol and D. Masip, “Geometry-based ensembles: Toward a structural characterization of the classification boundary,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 6, pp. 1140-1146, 2009.
[41]C. M. Bishop and M. E. Tipping, “Variational relevance vector machines,” Uncertainty in Artificial Intelligence Proceedings, pp. 46-53, 2000.
[42]R. S. Parpinelli, H. S. Lopes and A. A. Freitas, “Data mining with an ant colony optimization algorithm,” IEEE Transactions on Evolutionary Computing, vol. 6, no. 4, pp. 321–332, 2002.
[43]J. Bacardit and J.M. Garrell, “Bloat control and generalization pressure using the minimum description length principle for a pittsburgh approach learning classifier system,” in Proceedings of the 6th International Workshop on Learning Classifier Systems, Lecture Notes in Artificial Intelligence, Springer, Berlin, 2003.
[44]H. Su, Y. Yang and L. Zhao, “Classification rule discovery with DE/QDE algorithm,” Expert system win applications, vol. 37, no. 2, pp. 1216-1222, 2010.
[45]T. C. Lin and C. S. Lee, “Neural network based fuzzy logic control and decision system”, IEEE Transactions on Computers, vol. 40, no. 12, pp. 1320-1336, 1991.
[46]S. R. Jang, “ANFIS: adaptive-network-based fuzzy inference system,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, no. 3, pp. 665-685, 1993.
[47]M. L. Huang, H. Y. Chen, and J. J. Huang, “Glaucoma detection using adaptive neuro-fuzzy inference system,” Expert Systems with Applications, vol. 32, no. 2, pp. 458-468, 2007.
[48]J.-S. Wang and C. S. G. Lee, “Self-Adaptive Neuro-Fuzzy Inference Systems for Classification Applications,” IEEE Transactions on Fuzzy Systems, vol. 10, no. 6, pp. 790-802, 2002.
[49]R. Nowicki, “On Combining Neuro-Fuzzy Architectures with the Rough Set Theory to Solve Classification Problems with Incomplete Data,” IEEE Transactions on Knowledge and Data Engineering, vol. 20, no. 9, pp. 1239-1253,2008.
[50]L. Kaki , M. Teshnelab and M. A. Shooredeli, “ Classification of Multi-Class Datasets Using 2D Membership Functions in TSK Fuzzy System,” International Journal of Advancements in Computing Technology, vol. 2, no. 1, pp. 33-40, 2010.
[51]H. Mohamadi, J. Habibi, M. S. Abadeh and H. Saadi, “Data mining with a simulated annealing based fuzzy classification system,” Pattern Recognition, vol. 41, no.5, pp. 1824-1833, 2008.
[52]T. G. Dietterich, “Ensemble methods in machine learning,” Multiple Classifier Systems, vol. 1857, pp. 1-15, 2000.
[53]L. Breiman, “Bagging predictors,” Machine learning, vol. 24, pp. 123-140, 1996.
[54]R. E. Schapire, “The boosting approach to machine learning: An overview,” in Processing MSRI Workshop Nonlinear Estimation and Classification, pp. 149-172, 2003.
[55]Y. Freund and R. E. Schapire, “Experiments with a new boosting algorithm,” in Machine Learning: Proceedings of the Thirteenth International Conference, pp. 148-156, 1996.
[56]R. E. Schapire, Y. Freund, P. Bartlett, and W. S. Lee, “Boosting the margin: A new explanation for the effectiveness of voting methods,” The Annals of Statistics, vol. 26, no. 5, pp. 1651-1686, 1998.
[57]Y. Freund, and R. Schapire, “A decision-theoretic generalization of on-line learning and an application to boosting,” Journal of Computer and System Sciences, vol. 55, no. 1, pp. 119-139, 1997.
[58]S.-J. Wang, A. Mathew, Y. Chen, L.-F. Xi, L. Ma and J. Lee, “Empirical analysis of support vector machine ensemble classifiers,” Expert Systems with Applications, vol. 36, no. 3, pp. 6466-6476, 2009.
[59]D. Opitz and R. Maclin, “Popular ensemble methods: an empirical study,” Journal of Artificial Intelligence Research, vol. 11, pp. 169-198, 1999.
[60]J. Friedman, T. Hastie, and R. Tibshirani, “Additive logistic regression: a statistical view of boosting,” The Annals of Statistics, vol. 28, pp. 337-407, 2000.
[61]J. J. Rodrı´guez and L. I. Kuncheva, “Rotation Forest:A New Classifier Ensemble Method,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1619-1630, 2006.
[62]H. Hotelling, “Analysis of a complex of statistical variables into principal components,” Journal of educational psychology, vol. 24, pp. 417-441, 1933.
[63]L. I. Smith, A tutorial on principal component analysis, 2002. Available: http://www.sccg.sk/~haladova/principal_components.pdf. Accessed May 17, 2012.
[64]A. Hyvrinen, “Survey on independent component analysis,” Neural Computing Surveys, vol. 2, no. 4, pp. 94-128, 1999.
[65]K. V. Mardia, J. T. Kent, and J. M. Bibby, Multivariate analysis, Academic Press, Padstow, Cornwall, 1995.
[66]S. T. Roweis and L. K. Saul, “Nonlinear dimensionality reduction by locally linear embedding,” Science, vol. 290, no. 5500, pp. 2323-2326, 2000.
[67]M. Belkin and P. Niyogi, “Laplacian eigenmaps and spectral techniques for embedding and clustering,” Advances in neural information processing systems, vol. 14, pp. 585-591, 2001.
[68]J. B. Tenenbaum, V. De Silva, and J. C. Langford, “A global geometric framework for nonlinear dimensionality reduction,” Science, vol. 290, no. 5500, pp. 2319-2323, 2000.
[69]H. J. Sun, S. R. Wang, and Q. S. Jiang, “FCM-based model selection algorithms for determining the number of clusters,” Pattern recognition, vol. 37, no. 10, pp. 2027-2037, 2004.
[70]C.-S. Li and T.-H. Wu, “Adaptive fuzzy approach to function approximation with PSO and RLSE,” Expert Systems with Applications, vol. 38, no. 10, pp. 13266-13273, 2011.
[71]J. Kennedy and R. Eberhart, “Particle swarm optimization,” IEEE International Conference on Neural Networks Proceedings, vol. 4, pp. 1942-1948, 1995.
[72]J.-S. R. Jang, C.-T. Sun, and E. Mizutani, Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence, Prentice-Hall, Upper Saddle River, NJ, 1997.
[73]P. M. Murphy and D. W. Aha, UCI Repository of Machine Learning Databases, University of California, Department of Information and Computer Science, 1994. Available: http://archive.ics.uci.edu/ml/datasets.html. Accessed June 14, 2012.
[74]K. Pearson, “On lines and planes of closest fit to systems of points in space,” Philosophical Magazine, vol. 2, no. 6, pp. 559–572, 1901.
[75]I. T. Jolliffe, Principal component analysis, 2nd Edition, Springer, New York, 2002.
[76]D. Nauck and R. Kruse, “Neuro-fuzzy systems for function approximation,” Fuzzy Sets and Systems, vol. 101, no. 2, pp. 261–271, 1999.
[77]Z.-H. Xiu and G. Ren, “Stability analysis and systematic design of Takagi-Sugeno fuzzy control systems,” Fuzzy Sets and Systems, vol. 151, no. 1, pp. 119-138, 2005.
[78]R. E. Schapire and Y. Singer, “Improved boosting algorithms using confidence-rated predictions,” Machine learning, vol. 37, no. 3, pp. 297-336, 1999.
[79]J. Kittler, M. Hatef, R. P. W. Duin, and J. Matas, “On combining classifiers,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 3, pp. 226-239, 1998.
[80]D. Bratton and J. Kennedy, “Defining a standard for particle swarm optimization,” Proceedings of the 2007 IEEE Swarm Intelligence Symposium, pp. 120-127, 2007.
[81]A. Banks, J. Vincent, C. Anyakoha, “A review of particle swarm optimization. Part I: background and development,” National Computing, vol. 6, no. 4, pp. 467-484, 2007.
[82]G.H. John and P. Langley, “Estimating continuous distributions in Bayesian classifiers,” in Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, Morgan Kaufmann Publishers, San Mateo, pp. 338–345, 1995.
Advisor: Chunshien Li (李俊賢)    Date of Approval: 2012-07-22