Thesis 965201102 — Detailed Record


Name: Yan-Fong Kuo (郭彥鋒)    Department: Electrical Engineering
Thesis title: Comparisons of Neural Network Classifiers Based on Learning Algorithms with Different Structures
  1. The author has agreed to make this electronic thesis available immediately.
  2. The open-access full text is licensed only for personal, non-profit retrieval, reading, and printing for academic research purposes.
  3. Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.

Abstract (Chinese): This thesis investigates and evaluates neural network classifiers, analyzing two classifiers of different forms: a static neural network and a dynamic neural network. The two differ fundamentally in structure and in learning algorithm. The static neural network has a fixed structure: its number of neurons is set manually, by rules of thumb. The dynamic fuzzy neural network, by contrast, adjusts its structure dynamically: its neurons are generated by a set of learning rules. For the back-propagation network, the Levenberg-Marquardt method is adopted to improve the convergence speed of the learning algorithm. The treatment of the dynamic fuzzy neural network is divided into two parts, structure learning and parameter learning; a pruning technique is applied to the structure to make it more compact and easier to implement. Finally, classification experiments on data sets from the UCI repository are used to evaluate the accuracy of both classifiers.
Abstract (English): This thesis aims to investigate and evaluate neural network classifiers, focusing on the back-propagation neural network and the dynamic fuzzy neural network. We further analyze and improve both classifiers to ensure high classification accuracy. For the back-propagation neural network, we focus on the learning algorithm and adopt the Levenberg-Marquardt method to improve convergence performance. The treatment of the dynamic fuzzy neural network is divided into two parts, structure learning and parameter learning, with optimal parameter learning as the main work of this study. A pruning technique is applied to the dynamic fuzzy neural network structure, simplifying the network and facilitating its implementation. Finally, in the experiments, classification is performed on data sets from the UCI repository to evaluate the accuracy of both the back-propagation and dynamic fuzzy neural network classifiers.
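The Levenberg-Marquardt update mentioned in the abstract can be sketched as follows. This is a minimal illustration of the general method, not the thesis implementation; the function name and interface are assumptions. Each step solves a damped Gauss-Newton system, w ← w − (JᵀJ + μI)⁻¹Jᵀe, where J is the Jacobian of the residuals e with respect to the weights and μ is a damping factor that blends between gradient descent (large μ) and Gauss-Newton (small μ).

```python
import numpy as np

def levenberg_marquardt_step(J, e, w, mu):
    """One Levenberg-Marquardt update: w <- w - (J^T J + mu*I)^-1 J^T e.

    J  : Jacobian of residuals w.r.t. weights, shape (n_samples, n_weights)
    e  : residual vector, shape (n_samples,)
    w  : flattened weight vector, shape (n_weights,)
    mu : damping factor (larger values behave like gradient descent)
    """
    # Damped Gauss-Newton approximation of the Hessian
    H = J.T @ J + mu * np.eye(J.shape[1])
    step = np.linalg.solve(H, J.T @ e)
    return w - step
```

In practice μ is adapted during training: it is decreased when a step lowers the error and increased when it does not, which is what gives the method its fast convergence near a minimum.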
Keywords: ★ back-propagation neural network
★ dynamic fuzzy neural network
★ pruning technique
Table of contents:
Chapter 1  Introduction
  1.1 Research background
  1.2 Research motivation and objectives
  1.3 Main contributions
  1.4 Thesis organization
Chapter 2  Architecture and characteristics of neural networks
  2.1 The human nervous system
    2.1.1 Artificial neuron architecture
    2.1.2 Artificial neuron processing model
  2.2 Learning rules of neural networks
    2.2.1 Classification of neural network learning algorithms
Chapter 3  Neural-network-based classifiers
  3.1 Introduction
  3.2 Linear classifiers
    3.2.1 The perceptron
    3.2.2 Activation functions of the perceptron
    3.2.3 The perceptron algorithm
    3.2.4 Optimization of the perceptron
Chapter 4  Static back-propagation neural networks
  4.1 Introduction
  4.2 Back-propagation network architecture
    4.2.1 Activation functions of the back-propagation network
    4.2.2 The back-propagation algorithm
    4.2.3 Design of the back-propagation network
  4.3 Improvements to the back-propagation network
    4.3.1 The momentum method
    4.3.2 The Levenberg-Marquardt method
    4.3.3 Comparison of training convergence speed
Chapter 5  Dynamic fuzzy neural networks
  5.1 Introduction
  5.2 Fuzzy systems
    5.2.1 Fuzzy sets
    5.2.2 Fuzzy rules
    5.2.3 Fuzzy inference systems
  5.3 Radial basis function neural networks
  5.4 Dynamic fuzzy neural networks
  5.5 Learning algorithm of the dynamic fuzzy neural network
    5.5.1 Rule generation criteria
    5.5.2 Hierarchical learning
    5.5.3 Premise parameter allocation
    5.5.4 Determination of consequent parameters
    5.5.5 Improved network pruning technique
Chapter 6  Experimental results and discussion
  6.1 Evaluation of recognition rate
    6.1.1 Experiment 1
    6.1.2 Experiment 2
    6.1.3 Experiment 3
  6.2 Discussion
Chapter 7  Conclusions and future work
  7.1 Conclusions
  7.2 Future work
References
Advisor: Hung-Yuan Chung (鍾鴻源)    Approval date: 2009-10-15
