Thesis 110221025: Detailed Record




Author: Xian-Hong Wang (王先弘)    Department: Department of Mathematics
Thesis title: On Some Nonlinear Dimensionality Reduction Methods and Improvements (關於一些非線性降維的方法與改進)
Related theses:
★ A Study of Camel-Like Traveling Waves in Delayed Cellular Neural Networks
★ Piecewise Linear Numerical Solutions of the Least-Squares Finite Element Method for the Stationary Incompressible Navier-Stokes Problem
★ Global Exponential Stability of Modified RTD-based Two-Neuron Networks with Discrete Time Delays
★ Numerical Computation of Iterative Least-Squares Finite Element Methods for the Two-Dimensional Stationary Incompressible Magnetohydrodynamics Problem
★ A Study of Two Iterative Least-Squares Finite Element Methods for Solving the Incompressible Navier-Stokes Equations
★ Analysis of Synchronization Phenomena in Nonlinearly Coupled Dynamical Networks
★ Stabilized Finite Element Methods for Boundary-Layer and Interior-Layer Problems
★ Numerical Studies of Several Discontinuous Finite Element Methods for Convection-Dominated Problems
★ Finite Element Simulation of a Fluid-Structure Interaction Problem
★ High-Order Projection Methods for Solving the Navier-Stokes Equations
★ High-Order Compact Finite Difference Methods for Unsteady Reaction-Convection-Diffusion Equations
★ Lax-Wendroff Finite Difference Solutions of the Two-Dimensional Nonlinear Shallow Water Equations
★ Numerical Computation of a Direct-Forcing Immersed Boundary Method for Simulating the Interaction of Fluid with Moving Solid Objects
★ On Two Immersed Boundary Methods for Simulating the Dynamics of Fluid-Structure Interaction Problems
★ Applications of Generative Adversarial Networks to Image Inpainting
★ Numerical Simulation of Unsteady Complex Fluids by an Artificial Compressibility Direct-Forcing Immersed Boundary Method
Electronic full text:
  1. The author has agreed to make the electronic full text open access immediately.
  2. Once open access takes effect, the electronic full text is licensed to users solely for personal, non-profit retrieval, reading, and printing for purposes of academic research.
  3. Please comply with the relevant provisions of the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract: Dimensionality reduction is the process of reducing the number of variables in a dataset, ideally to near its intrinsic dimension, while retaining meaningful properties of the original data. It is usually a data preprocessing step before training models in data science. Specifically, it can be used for data visualization, cluster analysis, and noise reduction, or as an intermediate step to facilitate other studies. In this thesis, we briefly present the derivations of two linear dimensionality reduction methods, principal component analysis and linear discriminant analysis, and of several nonlinear dimensionality reduction methods, including multidimensional scaling, isometric mapping, diffusion maps, the Laplacian eigenmap, locally linear embedding, and kernel PCA. Furthermore, we propose modifications to the Laplacian eigenmap and diffusion maps with the help of geodesic distance. We also present a method for selecting the dimension for dimensionality reduction. Finally, we perform numerical experiments and compare the various dimensionality reduction techniques.
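Most of the methods named in the abstract have standard reference implementations, so a minimal sketch (not the author's code) can illustrate the comparison: the snippet below embeds the three-dimensional S-curve dataset used in chapter 6 into d = 2 with scikit-learn. Here SpectralEmbedding is scikit-learn's implementation of the Laplacian eigenmap; diffusion maps are omitted because scikit-learn does not ship them, and linear discriminant analysis is omitted because it requires class labels. The n_neighbors and gamma values are illustrative choices, not values taken from the thesis.

import matplotlib.pyplot as plt
from sklearn.datasets import make_s_curve
from sklearn.decomposition import PCA, KernelPCA
from sklearn.manifold import MDS, Isomap, SpectralEmbedding, LocallyLinearEmbedding

# 1000 points on the 3-D S-curve; `color` parametrizes the curve and is
# used only to color the embedded points.
X, color = make_s_curve(n_samples=1000, random_state=0)

methods = {
    "PCA": PCA(n_components=2),
    "MDS": MDS(n_components=2, random_state=0),
    "Isomap": Isomap(n_neighbors=10, n_components=2),
    "Laplacian eigenmap": SpectralEmbedding(n_components=2, n_neighbors=10),
    "LLE": LocallyLinearEmbedding(n_neighbors=10, n_components=2),
    "Kernel PCA": KernelPCA(n_components=2, kernel="rbf", gamma=0.1),
}

fig, axes = plt.subplots(2, 3, figsize=(12, 7))
for ax, (name, method) in zip(axes.ravel(), methods.items()):
    Y = method.fit_transform(X)  # n x 2 embedding
    ax.scatter(Y[:, 0], Y[:, 1], c=color, s=5)
    ax.set_title(name)
plt.show()

With settings like these, the neighborhood-graph methods (Isomap, the Laplacian eigenmap, LLE) typically unroll the S-curve, while PCA and classical MDS, being essentially linear here, merely project it.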
Keywords:
★ dimensionality reduction
★ principal component analysis
★ linear discriminant analysis
★ multidimensional scaling
★ isometric mapping (Isomap)
★ diffusion maps
★ Laplacian eigenmap
★ locally linear embedding
★ kernel PCA
Table of contents:
1 Introduction
2 Linear dimensionality reduction methods
  2.1 Principal component analysis
    2.1.1 Definition of spread
    2.1.2 Optimization problem
  2.2 Linear discriminant analysis
    2.2.1 Scatter matrix and spread
    2.2.2 Optimization problem
3 Nonlinear dimensionality reduction methods
  3.1 Multidimensional scaling
    3.1.1 Centering matrix
    3.1.2 Minimization problem of MDS
  3.2 Isometric mapping
    3.2.1 Minimization problem of Isomap
    3.2.2 Kernel Isomap
  3.3 Diffusion maps
    3.3.1 Diffusion process
    3.3.2 Diffusion map
  3.4 Laplacian eigenmap
    3.4.1 Optimization problem
    3.4.2 Eigenvalue problem
  3.5 Locally linear embedding
    3.5.1 LLE algorithm
    3.5.2 Optimizing the weights
    3.5.3 Mapping to lower dimensional space R^d
    3.5.4 Eigenvalue problem
  3.6 Kernel PCA
    3.6.1 Kernel function
    3.6.2 Projection direction
4 Two modified nonlinear methods
  4.1 Computing the geodesic distance
  4.2 Modified Laplacian eigenmap
  4.3 Modified diffusion maps
5 Approximating the intrinsic dimension d
6 Numerical experiments
  6.1 S-curve
  6.2 Swiss roll
  6.3 Swiss roll with a hole
  6.4 Iris plants dataset
  6.5 Handwritten digits dataset
7 Conclusions
References
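Chapter 4 and section 4.1 above indicate that the proposed modifications replace Euclidean distances with geodesic ones, but this record does not reproduce the construction. Under the assumption, suggested by references [4] and [7] below, that the geodesic distances are approximated by shortest paths in a k-nearest-neighbor graph computed with Dijkstra's algorithm, a sketch looks as follows; the final heat-kernel affinity and the bandwidth epsilon are illustrative guesses, not formulas from the thesis.

import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.datasets import make_swiss_roll
from sklearn.neighbors import kneighbors_graph

# Swiss roll test set, as in section 6.2.
X, _ = make_swiss_roll(n_samples=800, random_state=0)

# Sparse k-NN graph whose edge weights are Euclidean lengths; the sketch
# assumes k is large enough that the graph is connected.
G = kneighbors_graph(X, n_neighbors=10, mode="distance")

# All-pairs geodesic distances: Dijkstra ("D") on the symmetrized graph.
D_geo = shortest_path(G, method="D", directed=False)

# One plausible use: a heat-kernel affinity built from geodesic rather than
# Euclidean distances, which could then replace the usual weight matrix of
# the Laplacian eigenmap or diffusion maps. `epsilon` is a hypothetical
# bandwidth choice, not a value from the thesis.
epsilon = np.median(D_geo) ** 2
W = np.exp(-(D_geo ** 2) / epsilon)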
References:
[1] D. Calvetti and E. Somersalo, Mathematics of Data Science: A Computational Approach to Clustering and Classification, SIAM, Philadelphia, PA, 2021.
[2] A. M. Martinez and A. C. Kak, PCA versus LDA, IEEE Transactions on Pattern Analysis and Machine Intelligence, 23 (2001), pp. 228-233.
[3] J. D. Carroll and J. J. Chang, Analysis of individual differences in multidimensional scaling via an n-way generalization of "Eckart-Young" decomposition, Psychometrika, 35 (1970), pp. 283-319.
[4] J. B. Tenenbaum, V. de Silva, and J. C. Langford, A global geometric framework for nonlinear dimensionality reduction, Science, 290 (2000), pp. 2319-2322.
[5] H. Choi and S. Choi, Robust kernel Isomap, Pattern Recognition, 40 (2007), pp. 853-862.
[6] F. Cailliez, The analytical solution of the additive constant problem, Psychometrika, 48 (1983), pp. 305-308.
[7] E. W. Dijkstra, A note on two problems in connexion with graphs, Numerische Mathematik, 1 (1959), pp. 269-271.
[8] R. R. Coifman and S. Lafon, Diffusion maps, Applied and Computational Harmonic Analysis, 21 (2006), pp. 5-30.
[9] J. de la Porte, B. M. Herbst, W. Hereman, and S. J. van der Walt, An introduction to diffusion maps, Proceedings of the Nineteenth Annual Symposium of the Pattern Recognition Association of South Africa (PRASA), 2008, pp. 15-25.
[10] M. Belkin and P. Niyogi, Laplacian eigenmaps and spectral techniques for embedding and clustering, Proceedings of the 14th International Conference on Neural Information Processing Systems: Natural and Synthetic, 2001, pp. 585-591.
[11] M. Belkin and P. Niyogi, Laplacian eigenmaps for dimensionality reduction and data representation, Neural Computation, 15 (2003), pp. 1373-1396.
[12] L. K. Saul and S. T. Roweis, An introduction to locally linear embedding, Journal of Machine Learning Research, 7 (2001).
[13] S. T. Roweis and L. K. Saul, Nonlinear dimensionality reduction by locally linear embedding, Science, 290 (2000), pp. 2323-2326.
[14] R. Rosipal, M. Girolami, L. J. Trejo, and A. Cichocki, Kernel PCA for feature extraction and de-noising in nonlinear regression, Neural Computing & Applications, 10 (2001), pp. 231-243.
[15] B. Schölkopf, A. Smola, and K.-R. Müller, Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation, 10 (1998), pp. 1299-1319.
[16] E. J. Candès, X. Li, Y. Ma, and J. Wright, Robust principal component analysis?, Journal of the ACM, 58 (2011), Article 11.
Advisor: Suh-Yuh Yang (楊肅煜)    Date of approval: 2023-07-25
