Master's Thesis 110221005 — Detailed Record




Author: Tz-Yin Shiau (蕭子胤)    Department: Mathematics, National Central University
Title: A Study on Distributed Consensus Support Vector Machine
(分散共識支持向量機之研究)
Related theses
★ A Study of Camel-like Traveling Waves in Delayed Cellular Neural Networks
★ Piecewise Linear Numerical Solutions of the Least-Squares Finite Element Method for the Stationary Incompressible Navier-Stokes Problem
★ Global Exponential Stability of Modified RTD-based Two-Neuron Networks with Discrete Time Delays
★ Numerical Computations of the Iterative Least-Squares Finite Element Method for Two-Dimensional Stationary Incompressible Magnetohydrodynamics Problems
★ A Study of Two Iterative Least-Squares Finite Element Methods for Solving the Incompressible Navier-Stokes Equations
★ Synchronization Analysis of Nonlinearly Coupled Dynamical Networks
★ Stabilized Finite Element Methods for Boundary-Layer and Interior-Layer Problems
★ Numerical Studies of Several Discontinuous Finite Element Methods for Solving Convection-Dominated Problems
★ Finite Element Simulations of a Fluid-Structure Interaction Problem
★ High-Order Projection Methods for Solving the Navier-Stokes Equations
★ High-Order Compact Finite Difference Schemes for Unsteady Reaction-Convection-Diffusion Equations
★ Lax-Wendroff Difference Numerical Solutions of the Two-Dimensional Nonlinear Shallow Water Equations
★ Numerical Computation of a Direct-Forcing Immersed Boundary Method for Simulating the Interaction of Fluid with Moving Solid Objects
★ On Two Immersed Boundary Methods for Simulating the Dynamics of Fluid-Structure Interaction Problems
★ Applications of Generative Adversarial Networks to Image Inpainting
★ Numerical Simulations of Unsteady Complex Fluids by an Artificial Compressibility Direct-Forcing Immersed Boundary Method
  1. The electronic full text of this thesis is authorized for immediate open access.
  2. Users of the open-access electronic full text are authorized only to search, read, and print it for personal, non-profit academic research purposes.
  3. Please comply with the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast the content without authorization.

Abstract The support vector machine (SVM) is an effective binary classifier with excellent performance across a wide range of applications. However, limited computational resources and data-privacy concerns can hinder the SVM's performance when handling large datasets or data from distributed sources. In this thesis, we study the distributed consensus SVM, which offers two primary advantages: it preserves data privacy by allowing each worker to derive a more generalized hyperplane without sharing its data, and it decomposes large-scale problems into manageable sub-problems whose distributed solution improves processing speed. Nonetheless, the distributed consensus SVM faces challenges due to the non-differentiability of its objective function. To address this issue, we adopt the smoothing-SVM approach, incorporating a smoothing function to make the objective differentiable. This leads to the so-called distributed consensus smoothing SVM, which uses the 1-norm for the penalty term, balancing efficiency and accuracy. Finally, we validate the performance of the algorithm through several numerical experiments.
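The abstract describes the method only at a high level. As a rough illustration (not the thesis's actual implementation), the sketch below combines the two ingredients the abstract names: the Chen-Mangasarian smooth plus function p(x, beta) = x + (1/beta) log(1 + exp(-beta x)), whose derivative replaces the non-differentiable hinge term, and an ADMM consensus loop in which each worker fits a local model on its own data shard and only model vectors, never raw data, are exchanged. The gradient-descent local solver, step sizes, and toy data are all illustrative assumptions, and a standard squared ADMM penalty is used here rather than the thesis's 1-norm variant.

```python
import numpy as np

def smooth_plus_grad(x, beta):
    # Derivative of the smooth plus function p(x, beta): the sigmoid 1 / (1 + exp(-beta x)).
    return 1.0 / (1.0 + np.exp(-beta * x))

def local_w_update(X, y, z, u, rho, C, beta, steps=60, lr=0.02):
    # Approximately solve the local subproblem on one worker's shard:
    #   min_w  C * sum_j p(1 - y_j * x_j . w, beta) + (rho/2) * ||w - z + u||^2
    # using plain gradient descent (a simple stand-in for a Newton-type solver).
    w = z.copy()
    for _ in range(steps):
        margins = 1.0 - y * (X @ w)
        grad = -C * X.T @ (y * smooth_plus_grad(margins, beta)) + rho * (w - z + u)
        w -= lr * grad
    return w

def consensus_smooth_svm(shards, rho=1.0, C=1.0, beta=5.0, iters=40):
    # ADMM consensus loop: each worker updates a local w_i on its own shard,
    # then all workers agree on the global model z without sharing raw data.
    d = shards[0][0].shape[1]
    ws = [np.zeros(d) for _ in shards]
    us = [np.zeros(d) for _ in shards]
    z = np.zeros(d)
    for _ in range(iters):
        ws = [local_w_update(X, y, z, u, rho, C, beta)
              for (X, y), u in zip(shards, us)]
        z = np.mean([w + u for w, u in zip(ws, us)], axis=0)   # consensus step
        us = [u + w - z for w, u in zip(ws, us)]               # dual update
    return z

# Toy demo: two well-separated Gaussian classes, split across three "workers".
rng = np.random.default_rng(0)
Xpos = rng.normal(loc=+2.0, scale=0.6, size=(60, 2))
Xneg = rng.normal(loc=-2.0, scale=0.6, size=(60, 2))
X = np.vstack([Xpos, Xneg])
X = np.hstack([X, np.ones((X.shape[0], 1))])   # append 1 to absorb the bias term
y = np.concatenate([np.ones(60), -np.ones(60)])
perm = rng.permutation(120)
X, y = X[perm], y[perm]
shards = [(X[i::3], y[i::3]) for i in range(3)]  # three disjoint data shards

z = consensus_smooth_svm(shards)
acc = np.mean(np.sign(X @ z) == y)
print(f"consensus model accuracy: {acc:.2f}")
```

Each worker only ever transmits its current model vector w_i + u_i, which is the privacy property the abstract highlights; the kernel and reduced-kernel variants treated in Chapters 4-5 follow the same consensus structure with transformed feature matrices.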
Keywords ★ support vector machine
★ kernel method
★ distributed consensus problem
★ alternating direction method of multipliers
★ privacy preserving
Table of Contents
1 Introduction 1
2 Support Vector Machine 4
2.1 Hard Margin Linear SVM 4
2.2 Soft Margin Linear SVM 6
2.3 Smoothing SVM 9
2.4 Kernel Method 11
2.5 Kernel SVM 13
3 Distributed Consensus Optimization Problem 15
3.1 Alternating Direction Method of Multipliers 15
3.2 Distributed Consensus Optimization Problem 17
4 Distributed Consensus Support Vector Machine 19
4.1 Distributed Consensus Linear SVM 19
4.2 Distributed Consensus Kernel SVM 21
4.3 Distributed Consensus Reduced Kernel SVM 22
5 Numerical Experiments 24
5.1 Distributed Consensus Linear SVM 24
5.2 Distributed Consensus Kernel SVM 27
5.3 Distributed Consensus Reduced Kernel SVM 32
6 Conclusion 34
References 36
Advisor: Suh-Yuh Yang (楊肅煜)    Approval date: 2024-07-23

For thesis-related questions, please contact the Promotion Services Division, National Central University Library, TEL: (03)422-7151 ext. 57407, or by e-mail.