Thesis Record 965201104: Detailed Information




Author: 何誌祥 (Chih-Hsiang Ho)    Graduate Program: Department of Electrical Engineering
Thesis Title: Design of Bayesian-based Knowledge Extraction for SVMs in Unbalanced Classifications
(基於貝氏資訊之萃取應用於支持向量機之類不平衡分類問題)
Related Theses
★ A Study of Image Processing for Home Burglary Security
★ Design and Implementation of a Fingerprint Recognition System for Regional Applications
★ Head-Pose Recognition Applied to Cursor and Robot Control
★ Path Planning Using Rapidly-Exploring Random Trees, the Artificial Fish Swarm Algorithm, and Hazard Degrees
★ A Study on the Localization and Control of Intelligent Robots
★ Object Tracking Based on the Artificial Bee Colony Algorithm
★ A Real-Time Face Detection, Pose Recognition, and Tracking System for Complex Environments
★ A Face Recognition System Based on Circularly Symmetric Gabor Filters and SVMs
★ An Improved Agglomerative Hierarchical Algorithm and Improved Color-Space Image Techniques for Path Tracking of a Wirelessly Monitored Autonomous Vehicle
★ Fuzzy Neural Networks Applied to Wall-Following Control, Gait Motion, and Posture Balance of a Hexapod Robot
★ Detection Applications of a Quadrotor and a Study of Its Wireless Charging System
★ Single-Image Dehazing Combining White-Patch Retinex Theory with an Improved Dark Channel Prior
★ Hand Gesture Recognition Based on Deep Neural Networks
★ A Posture-Correcting Necklace with Image-Recognition Auto-Calibration and a Mobile-Phone Warning-Receiver System
★ Analysis of Fuzzy Control and Grey Prediction Applied to a Tunnel-Type Robotic Arm
★ Design of a Fuzzy Sliding-Mode Controller and Its Application to Nonlinear Systems
  1. The author has consented to immediate open access for this electronic thesis.
  2. The open-access full text is licensed only for personal, non-profit retrieval, reading, and printing for the purpose of academic research.
  3. Please observe the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract (Chinese) In a typical classification problem, only vague and general information is available, together with samples from particular subsets of the data to be classified, so an effective method must be found to design an accurate classifier from this limited information. Support vector machines (SVMs) offer excellent classification performance in the presence of class overlap, but information about class imbalance in the data is not available within the SVM itself. Because Bayesian decision theory carries the statistically relevant information about the samples, this thesis combines it with SVMs and, via nonlinear programming, derives the Bayesian support vector machine (BSVM). The Bayesian decision surface provides a useful piece of information, the passing-through point x0, which clearly reflects the imbalance in sample sizes between classes. To apply the information in x0 fully to the SVM, the form of the original SVM hyperplane equation is modified, which leads to the BSVM. The passing-through point x0 from Bayesian decision theory expresses the class imbalance, while the nonlinear-programming formulation of the SVM resolves the class overlap, so the resulting classifier draws on more complete information.
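For reference, the textbook form of this passing-through point for two Gaussian classes with equal isotropic covariance σ²I (as in Duda, Hart, and Stork's Pattern Classification, the setting of Case 1 in Chapter 2) is sketched below; the thesis's exact derivation may differ from this standard result.

```latex
% Bayes decision surface for two classes \omega_1, \omega_2 with
% p(x \mid \omega_i) = N(\mu_i, \sigma^2 I): the boundary is the hyperplane
%   w^\top (x - x_0) = 0,
% where
\[
  w = \mu_1 - \mu_2, \qquad
  x_0 = \frac{1}{2}(\mu_1 + \mu_2)
      - \frac{\sigma^2}{\lVert \mu_1 - \mu_2 \rVert^2}
        \ln\frac{P(\omega_1)}{P(\omega_2)}\,(\mu_1 - \mu_2).
\]
% With equal priors, x_0 is the midpoint of the two means; under class
% imbalance the log-prior term shifts x_0 toward the mean of the rarer
% class, enlarging the decision region of the majority class.
```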
In the simulations, the passing-through point x0 of the BSVM is divided into three types, and each is compared against the standard SVM. The results show that the BSVM reaches its minimum test error with a smaller penalty parameter C, that is, with a smoother hyperplane, which also avoids overfitting; the BSVM therefore has better classification ability.
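To make the role of the penalty parameter C concrete, the sketch below sweeps C for a standard soft-margin SVM (not the thesis's BSVM) on a synthetic imbalanced data set: small C yields a wider, smoother margin, while large C fits the training data more tightly and risks overfitting. The data set, parameter grid, and use of scikit-learn are illustrative assumptions, not taken from the thesis.

```python
# Illustrative sweep of the SVM penalty parameter C on imbalanced data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Imbalanced two-class problem: roughly 90% majority, 10% minority.
X, y = make_classification(n_samples=600, n_features=2, n_informative=2,
                           n_redundant=0, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

for C in [0.01, 0.1, 1, 10, 100]:
    clf = SVC(kernel='rbf', C=C).fit(X_tr, y_tr)
    print(f"C={C:<6} train acc={clf.score(X_tr, y_tr):.3f} "
          f"test acc={clf.score(X_te, y_te):.3f}")
```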
Abstract (English) Classification approaches usually show poor generalization performance when an apparent class imbalance problem is present. Many studies in the literature report that the characteristics of a data set strongly influence the performance of different classifiers. A study of this effect is therefore conceived, and Bayesian methods naturally suggest themselves: they readily capture valuable quantitative features of a data set, which can then be fed back into the classification problem to guarantee good class separability. The purpose of this learning method is to exploit an attractive pragmatic feature of the Bayesian approach, namely its quantitative description of the class imbalance problem. Accordingly, a programming problem that mixes in probability information, the Bayesian support vector machine (BSVM), is discussed. In this framework, some of the objectives and constraints of the original programming problem are changed, and the consequences of the change are studied. Experiments on several existing data sets show that when prior distributions are assigned to the programming problem, the estimated classification errors are reduced.
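The abstracts do not state the modified programming problem itself, but one plausible reading is sketched below: estimate x0 from maximum-likelihood estimates of the class means, priors, and a pooled isotropic variance, then express the SVM hyperplane relative to x0 by centering the data there. Every name here (bayes_passing_point, the synthetic data, the use of LinearSVC) is a hypothetical illustration, not the thesis's formulation.

```python
# Hedged sketch: estimate the Bayes passing-through point x0 from data and
# train a linear SVM on data centered at x0, so the learned hyperplane is
# expressed relative to x0. Illustrative only; not the thesis's actual BSVM.
import numpy as np
from sklearn.svm import LinearSVC

def bayes_passing_point(X, y):
    """x0 for two Gaussian classes with covariance sigma^2 * I (ML estimates)."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    p0, p1 = len(X0) / len(X), len(X1) / len(X)   # ML class priors
    sigma2 = np.mean([X0.var(axis=0).mean(), X1.var(axis=0).mean()])  # pooled variance
    d = mu0 - mu1
    return 0.5 * (mu0 + mu1) - sigma2 / (d @ d) * np.log(p0 / p1) * d

# Imbalanced synthetic data: 180 majority vs. 20 minority samples.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.0, 0.0], 1.0, (180, 2)),
               rng.normal([3.0, 3.0], 1.0, (20, 2))])
y = np.array([0] * 180 + [1] * 20)

x0 = bayes_passing_point(X, y)
clf = LinearSVC(C=1.0, max_iter=10000).fit(X - x0, y)  # hyperplane relative to x0
print("x0 =", x0, " training accuracy:", clf.score(X - x0, y))
```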
Keywords (Chinese) ★ Class imbalance
★ Classification
★ Support vector machines
★ Bayesian theory
Keywords (English) ★ Support Vector Machines
★ Bayesian Decision Theory
★ Classification
★ Imbalance
Table of Contents
CONTENTS
LIST OF FIGURES
LIST OF TABLES
CHAPTER 1 INTRODUCTION
1.1 Background
1.2 Literature Review
1.3 Motivation
1.4 Problem Formulation
1.5 Organization
CHAPTER 2 BAYESIAN DECISION THEORY AND MAXIMUM LIKELIHOOD ESTIMATION
2.1 A Brief Review of Bayes' Theorem
2.2 Bayesian Decision Theory
2.2.1 Discriminant Functions and Decision Boundaries
2.2.2 Case 1: Diagonal Covariance Matrix with Equal Elements
2.2.3 Case 2: Nondiagonal and Equal Covariance Matrix
2.3 Estimation of Unknown Probability Density Functions
2.3.1 Maximum Likelihood Estimation
2.3.2 The Case of a Gaussian Distribution with Unknown μ and Σ
CHAPTER 3 SUPPORT VECTOR MACHINE FRAMEWORKS
3.1 Linear Support Vector Machines: Separable Case
3.2 Linear Support Vector Machines: Non-Separable Case
3.3 Nonlinear Support Vector Machines
3.3.1 The Kernel Trick
3.3.2 The Nonlinear Model of SVMs
3.4 Learning Curves
CHAPTER 4 BAYESIAN SUPPORT VECTOR MACHINES
4.1 Concepts of the Classifier
4.2 Derivation of Bayesian SVMs
CHAPTER 5 EXPERIMENTAL RESULTS AND DISCUSSIONS
5.1 Datasets
5.2 Comparison between BSVMs and SVMs
5.2.1 Training and Testing Phases
5.2.2 Experimental Data
5.2.3 Performance on One-against-All Problems
5.3 Discussions
CHAPTER 6 CONCLUSIONS AND RECOMMENDATIONS
REFERENCES
APPENDIX I DATA SET
LIST OF PUBLICATIONS
Advisor: 鍾鴻源 (Hung-Yuan Chung)    Date of Approval: 2009-6-27