Name: Yu-li Lin (林育利)
Department: Graduate Institute of Opto-Mechatronics Engineering
Thesis Title: A Study of Classifiers Using Neural Networks Combined with Support Vector Machines (使用類神經網路結合支持向量機之分類器研究)
- This electronic thesis is approved for immediate open access.
- The released full text is licensed for academic research only; users may search, read, and print individual copies for personal, non-profit purposes.
- Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.
Abstract (Chinese): Studies have shown that the characteristics of the data themselves directly affect classification performance. We therefore design a data-analysis method that makes the best use of the available features and improves pattern-recognition performance so as to guarantee class separability. This thesis incorporates neural networks (NN) into feature extraction, and we propose the NN-SVM algorithm as our classification tool. In the NN-SVM algorithm, a neural network maps the original data onto an augmented data set, and the same transformation is applied to any validation and test data before classification. Experimental results show that the classification error is reduced when the representative indicator data are used, confirming that these representative indicators provide additional valuable information for feature extraction.

Abstract (English): Several studies have reported that the characteristics of a data set are directly correlated with the capability of a classifier. We therefore conceive a study of this relationship and propose feature optimization to guarantee class separability. We show that feature-extraction concepts from neural networks (NN) can be applied to the feature-optimization problem, and we propose the NN-SVM algorithm to construct a sufficient number of features to compensate for the lack of information. In the NN-SVM algorithm, a neural network transforms the data sets to extract features for support vector machine (SVM) classification; in this way, any validation set and test set is subjected to the same transformation before it is classified. Experiments on several existing data sets show that the estimated classification errors are reduced when the augmented data are utilized, which implies that the class labels can be used as extra helpful information for feature extraction.
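A minimal sketch of the NN-SVM pipeline described in the abstract, written in Python with scikit-learn. The thesis does not specify an implementation, so the library, the Iris data set, the network size, and the helper nn_features are illustrative assumptions: a multilayer perceptron is trained on the labeled data, its hidden-layer activations are taken as the NN-extracted features, the original inputs are augmented with them, and an SVM is trained on the augmented set; the identical transformation is applied to the test data before classification.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Illustrative data set; the thesis reports results on several existing data sets.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Step 1: train a neural network on the labeled training data; the class
# labels guide the representation it learns.
nn = MLPClassifier(hidden_layer_sizes=(10,), activation="tanh",
                   max_iter=2000, random_state=0).fit(X_train, y_train)

def nn_features(model, X):
    # Hidden-layer activations of the trained MLP, used here as the
    # NN-extracted features (an assumed mapping, not the thesis's exact one).
    return np.tanh(X @ model.coefs_[0] + model.intercepts_[0])

# Step 2: augment the original inputs with the NN-derived features.
Z_train = np.hstack([X_train, nn_features(nn, X_train)])
Z_test = np.hstack([X_test, nn_features(nn, X_test)])  # same transformation

# Step 3: classify the augmented data with a support vector machine.
svm = SVC(kernel="rbf", C=1.0).fit(Z_train, y_train)
print("test accuracy:", svm.score(Z_test, y_test))
```

Other mappings, such as augmenting with the MLP's output probabilities instead of hidden activations, would fit the same augment-then-classify structure.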
Keywords (Chinese): ★ 類神經網路 (Neural Networks) ★ 支持向量機 (Support Vector Machines)
Keywords (English): ★ Support Vector Machines ★ Neural Networks
Table of Contents
Abstract (Chinese) ..................................... I
Abstract (English) .................................... II
Table of Contents .................................... III
List of Figures ........................................ V
List of Tables ...................................... VIII
Chapter 1  Introduction ................................ 1
    1.1  Preface ....................................... 1
    1.2  Research Motivation ........................... 2
    1.3  Overview ...................................... 4
    1.4  Thesis Organization ........................... 7
Chapter 2  Neural Networks ............................. 9
    2.1  Neural Networks ............................... 9
        2.1.1  Biological Neuron Model ................. 9
        2.1.2  Artificial Neuron Model ................ 10
        2.1.3  Neural Network Architectures ........... 13
    2.2  Multilayer Perceptron ........................ 17
        2.2.1  Back-Propagation Network Architecture .. 17
        2.2.2  Back-Propagation Algorithm ............. 18
    2.3  Generalization Ability of Neural Networks .... 23
Chapter 3  Support Vector Machines .................... 25
    3.1  Support Vector Machines ...................... 25
    3.2  Linear SVM for the Separable Two-Class Problem ....... 25
    3.3  Linear SVM for the Non-Separable Two-Class Problem ... 29
    3.4  Nonlinear Support Vector Machines ............ 31
    3.5  k-Fold Cross-Validation ...................... 33
Chapter 4  Combining Neural Networks with Support Vector Machines ... 34
    4.1  Basic Idea ................................... 34
    4.2  NN-SVM ....................................... 35
    4.3  Operating Procedure .......................... 37
Chapter 5  Experimental Results and Discussion ........ 39
    5.1  Data Sets and Parameter Settings ............. 39
    5.2  Experimental Results and Discussion .......... 41
Chapter 6  Conclusions and Future Work ................ 61
Advisor: Chiang-Nan Chang (張江南)    Approval Date: 2008-06-26