Master's/Doctoral Thesis 102423056 — Detailed Record




Name  Kit-weng Leong (梁杰榮)    Graduate Department  Department of Information Management
Thesis Title  A Study on Classification Problem using Complex Neuro-Fuzzy Approach
(Chinese title: 分類問題之研究-以複數型模糊類神經系統為方法)
Related Theses
★ A Study on Variable Selection in Intelligent Systems and Applications
★ A Study on Parameter Estimation for Intelligent Systems: A New DE Method
★ A Study on Ensemble-Learning Intelligent Systems for Classification Problems
★ A Study on Complex Neuro-Fuzzy Systems for Multi-Class Classification Problems
★ A Study on Complex Fuzzy Cognitive Maps with Metacognitive Strategies for Classification Problems
★ A Study on Intelligent ARIMA Models for Time Series Forecasting
★ A Study on Computational Intelligence and Complex Fuzzy Sets for Adaptive Image Processing
★ An Intelligent Neuro-Fuzzy Computing Model Using Complex Fuzzy Sets and the ARIMA Model
★ Empirical Study on IEEE 802.11 Wireless Signal – A Case Study at the NCU Campus
★ A Study on Self-Constructing Complex Fuzzy ARIMA for Index Volatility Forecasting
★ A Study on Data Preprocessing: A Genetic Algorithm Approach
★ Support-Vector-Oriented Sample Selection for Text Classification
★ A Study on Intelligent Interval Prediction Using Complex Neuro-Fuzzy Systems, Support Vector Regression, and Bootstrap Statistics
★ Complex Neuro-Fuzzy Networks for Multi-Objective Financial Forecasting
★ Intelligent Neuro-Fuzzy Computing Using Asymmetric Neuro-Fuzzy Network Systems and Spherical Complex Fuzzy Sets
★ A Study on Complex Neuro-Fuzzy Systems and Continuous Multiple Ant Colony Evolution for Time Series Forecasting
Files  [EndNote RIS format]  [BibTeX format]  [Related Articles]  [Cited By]  [Full Record]  [Library Catalog]  Full text: permanently restricted (not available for online viewing)
Abstract (Chinese)  This study proposes a complex neuro-fuzzy system (CNFS) together with an information-theory-based feature selection method for classification problems. For feature selection, the concepts of minimal redundancy and maximal relevance are combined, on an information-theoretic basis, to search for the best feature subset. The modeling of the CNFS classifier is divided into a structure learning phase and a parameter learning phase. In the structure learning phase, a grid partitioning method selects the important fuzzy rules for the CNFS classifier. In the parameter learning phase, particle swarm optimization (PSO) and the recursive least squares estimator (RLSE) adjust the premise parameters and the consequent parameters of the model, respectively; this combination, called the PSO-RLSE hybrid learning algorithm, lets the model converge rapidly during training and achieves fast learning. The proposed CNFS classifier combines complex fuzzy sets (CFSs) with the architecture of the adaptive neuro-fuzzy inference system (ANFIS), which increases the model's nonlinear mapping ability and provides a more flexible architecture. Ten data sets from different domains in the machine learning repository of the University of California, Irvine are used to validate the proposed method, and the results are compared with other classifiers. The experimental results show that the proposed method performs well on classification problems across different domains.
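To make the selection criterion concrete, below is a minimal Python sketch of the greedy mRMR procedure in its difference (MID) form, assuming discrete-valued features (continuous features would first be discretized, as in Peng et al.'s formulation); the function names are illustrative and not taken from the thesis.

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information I(X;Y) between two discrete 1-D arrays."""
    mi = 0.0
    for xv in np.unique(x):
        px = np.mean(x == xv)
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0.0:
                mi += pxy * np.log2(pxy / (px * np.mean(y == yv)))
    return mi

def mrmr_select(X, y, k):
    """Greedy mRMR: repeatedly add the feature whose relevance I(f; y)
    minus its mean redundancy with the already-selected set is largest."""
    n_features = X.shape[1]
    relevance = np.array([mutual_information(X[:, j], y)
                          for j in range(n_features)])
    selected = [int(np.argmax(relevance))]  # seed with the most relevant feature
    while len(selected) < k:
        rest = [j for j in range(n_features) if j not in selected]
        scores = [relevance[j] - np.mean([mutual_information(X[:, j], X[:, s])
                                          for s in selected])
                  for j in rest]
        selected.append(rest[int(np.argmax(scores))])
    return selected
```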
Abstract (English)  We present a complex neuro-fuzzy system (CNFS) as a pattern classifier that utilizes complex fuzzy sets. For feature selection on the training samples, we remove redundant and irrelevant features, aiming to improve the predictive accuracy of the classifier. Based on information theory, we employ a well-known feature selection method that combines minimal redundancy with maximal relevance. A crucial problem in constructing fuzzy-rule-based models is that, under grid partitioning, the number of fuzzy rules, and hence the number of consequent parameters in the rule base, grows exponentially with the number of input features. To address this, we employ a modified grid partitioning method that keeps a partitioned region of the input space only when a rule-firing-strength threshold is satisfied. For parameter learning, the particle swarm optimization (PSO) algorithm and the recursive least squares estimator (RLSE) are integrated as a hybrid learning method that effectively adjusts the free parameters of the CNFS. We conducted experiments on ten data sets from various fields and compared performance with other classifiers. The experimental results demonstrate that our approach finds smaller feature subsets while achieving high classification accuracy.
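As a companion to the abstract, here is a hedged Python sketch of the two-stage learning scheme under simplifying assumptions: Gaussian-amplitude complex membership functions of the form r(x)e^{jωx}, first-order TSK consequents solved by RLSE, and a PSO outer loop over the premise parameters. The function names, membership form, and MSE cost are illustrative assumptions, not the thesis's exact implementation; for classification, the real-valued output would be thresholded or one model trained per class.

```python
import numpy as np

def complex_gaussian_mf(x, m, sigma, omega):
    """Complex fuzzy membership r(x)*exp(j*omega*x): a Gaussian amplitude
    in [0, 1] modulated by a linear phase term (assumed form)."""
    r = np.exp(-0.5 * ((x - m) / sigma) ** 2)
    return r * np.exp(1j * omega * x)

def firing_strengths(premise, x):
    """Normalized complex firing strength per rule: product of per-input
    memberships. premise has shape (n_rules, n_inputs, 3) holding
    (m, sigma, omega) for every rule/input pair."""
    mu = complex_gaussian_mf(x[None, :], premise[:, :, 0],
                             premise[:, :, 1], premise[:, :, 2])
    w = np.prod(mu, axis=1)
    return w / (np.abs(w).sum() + 1e-12)

def cnfs_output(premise, theta, x):
    """TSK-style output: real part of the firing-strength-weighted sum of
    first-order consequents theta_r . [1, x]."""
    w = firing_strengths(premise, x)
    basis = np.concatenate(([1.0], x))
    return float(np.real(np.sum(w * (theta.reshape(len(w), -1) @ basis))))

def rlse_fit(premise, X, y, lam=1e3):
    """One recursive least-squares pass: with premises fixed, the consequent
    parameters enter linearly, so RLSE estimates them directly."""
    dim = premise.shape[0] * (X.shape[1] + 1)
    theta, P = np.zeros(dim), lam * np.eye(dim)
    for x, t in zip(X, y):
        w = firing_strengths(premise, x)
        phi = np.real(np.outer(w, np.concatenate(([1.0], x)))).reshape(-1, 1)
        gain = P @ phi / (1.0 + (phi.T @ P @ phi).item())
        theta = theta + gain.ravel() * (t - (phi.T @ theta).item())
        P = P - gain @ phi.T @ P
    return theta

def pso_rlse_train(X, y, n_rules=4, n_particles=15, n_iter=50, seed=0):
    """Hybrid learning loop: PSO explores the premise parameters, and every
    candidate premise is scored after an exact RLSE fit of its consequents."""
    rng = np.random.default_rng(seed)
    dim = n_rules * X.shape[1] * 3
    pos = rng.normal(size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.full(n_particles, np.inf)
    gbest, gbest_cost = pos[0].copy(), np.inf
    for _ in range(n_iter):
        for i in range(n_particles):
            premise = pos[i].reshape(n_rules, X.shape[1], 3).copy()
            premise[:, :, 1] = np.abs(premise[:, :, 1]) + 1e-3  # keep sigma > 0
            theta = rlse_fit(premise, X, y)
            pred = np.array([cnfs_output(premise, theta, x) for x in X])
            cost = float(np.mean((pred - y) ** 2))
            if cost < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i].copy(), cost
            if cost < gbest_cost:
                gbest, gbest_cost = pos[i].copy(), cost
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
    return gbest, gbest_cost
```

A call like `pso_rlse_train(X_train, y_train)` would return the best premise vector found and its training error. Note that the number of rules here is fixed in advance, whereas the thesis selects rules through its threshold-based grid partitioning.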
Keywords (Chinese) ★ feature selection
★ classification
★ information theory
★ complex fuzzy sets
Keywords (English) ★ feature selection
★ classification
★ information theory
★ complex fuzzy
Table of Contents
Chinese Abstract I
English Abstract II
Acknowledgments III
Table of Contents IV
List of Figures VI
List of Tables VIII
List of Symbols X
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation and Objectives 1
1.3 Overview of the Research Method 3
1.4 Thesis Organization 4
Chapter 2 Literature Review 5
2.1 Feature Selection 5
2.1.1 Entropy 5
2.1.2 Conditional Entropy 5
2.1.3 Mutual Information 6
2.2 Feature Selection Based on Minimal Redundancy and Maximal Relevance 7
2.2.1 Maximal Relevance 7
2.2.2 Minimal Redundancy 7
2.2.3 The mRMR Feature Selection Method 8
2.3 Classifier Model 9
2.3.1 Fuzzy Sets 9
2.3.2 Rationale for Fuzzy Sets 9
2.3.3 Definition of Fuzzy Sets 10
2.3.4 Complex Fuzzy Sets 11
2.3.5 Complex Neuro-Fuzzy Inference System 13
Chapter 3 System Learning Strategy 17
3.1 Structure Learning Phase 17
3.2 Parameter Learning Phase 19
3.2.1 Particle Swarm Optimization 19
3.2.2 Recursive Least Squares Estimator 21
3.2.3 The PSO-RLSE Hybrid Learning Algorithm 22
Chapter 4 Experiments 25
4.1 UCI Data Sets 25
4.2 Experimental Environment and Initial Parameter Settings 34
4.3 Evaluating the CNFS Classifier with Different Numbers of Selected Features 35
4.4 Comparison of the CNFS Classifier with Other Classifiers 42
Chapter 5 Discussion and Conclusion 45
5.1 Discussion of the mRMR Feature Selection Method 45
5.2 Complex Fuzzy Sets for Classification Problems 45
5.3 The Complex Neuro-Fuzzy System for Classification Problems 46
5.4 Application of the PSO-RLSE Hybrid Learning Algorithm 46
5.5 Conclusion 46
Chapter 6 Future Research Directions 48
6.1 Improving the Feature Selection Method 48
6.2 Improving the Grid Partitioning Method 48
6.3 Improving the PSO-RLSE Hybrid Learning Algorithm 48
References 50
Appendix 1 58
Appendix 2 71
Advisor  ChunShien Li (李俊賢)    Approval Date  2015-07-15