Master's/Doctoral Thesis 109281602: Detailed Record




Name: Huynh Duc Quoc (黃德國)    Department: Mathematics
Thesis title: A family of quasi-Newton methods for solving nonlinear least-squares, symmetric nonlinear equations, and the nearest correlation matrix problems
(Chinese title: 擬牛頓法在非線性最小平方、對稱非線性方程組和最近似相關矩陣問題的應用)
Related theses
★ Nonlinear block Gauss elimination Newton algorithm with application to nozzle flows
★ Finite element solution of the Poisson-Boltzmann equation using a parallel Newton-Krylov-Schwarz algorithm with applications in colloid science
★ Least-squares finite element methods for the convection-diffusion equation and an improvement using bubble functions
★ Bifurcation Analysis of Incompressible Sudden Expansion Flows Using Parallel Computing
★ Parallel Jacobi-Davidson Algorithms and Software Developments for Polynomial Eigenvalue Problems in Quantum Dot Simulation
★ An Inexact Newton Method for Drift-Diffusion Model in Semiconductor Device Simulations
★ Numerical Simulation of Three-dimensional Blood Flows in Arteries Using Domain Decomposition Based Scientific Software Packages in Parallel Computers
★ A Parallel Fully Coupled Implicit Domain Decomposition Method for the Stabilized Finite Element Solution of Three-dimensional Unsteady Incompressible Navier-Stokes Equations
★ A Study for Linear Stability Analysis of Incompressible Flows on Parallel Computers
★ Parallel Computation of Acoustic Eigenvalue Problems Using a Polynomial Jacobi-Davidson Method
★ Numerical Study of Algebraic Multigrid Methods for Solving Linear/Nonlinear Elliptic Problems on Sequential and Parallel Computers
★ A Parallel Multilevel Semi-implicit Scheme of Fluid Modeling for Numerical Low-Temperature Plasma Simulation
★ Performance Comparison of Two PETSc-based Eigensolvers for Quadratic PDE Problems
★ A Parallel Two-level Polynomial Jacobi-Davidson Algorithm for Large Sparse Dissipative Acoustic Eigenvalue Problems
★ A Full Space Lagrange-Newton-Krylov Algorithm for Minimum Time Trajectory Optimization
★ Parallel Two-level Patient-specific Numerical Simulation of Three-dimensional Rheological Blood Flows in Branching Arteries
Full text: available in the system after 2026-7-31
Abstract (Chinese) This thesis investigates unconstrained optimization problems and their special forms, such as nonlinear least-squares problems and the solution of symmetric nonlinear equations; these problems play an important role in numerical optimization and have many real-world applications.

Various methods exist for addressing these optimization challenges. This thesis studies a family of quasi-Newton methods, a class of iterative techniques for solving such problems, and demonstrates their advantages over traditional optimization methods such as steepest descent and Newton's method. Quasi-Newton methods use only gradient evaluations to approximate the Hessian matrix, which contains second-order information about the objective function. This approximation significantly reduces computational cost and complexity, making quasi-Newton methods particularly attractive for large-scale problems where computing the exact Hessian is infeasible.

Using secant-like diagonal matrix approximations, quasi-Newton methods provide efficient solutions for a variety of optimization problems, demonstrating their effectiveness across diverse scenarios. Under suitable conditions, these methods also possess global convergence properties.
Abstract (English) This thesis aims to develop an efficient solution algorithm for the unconstrained optimization problem and its special cases, including nonlinear least-squares and nonlinear equations, which hold substantial importance in numerical optimization with numerous practical applications. Additionally, by exploring their use in addressing the nearest correlation matrix problems through numerical experiments, this study establishes a foundation for further research and practical implementations in real-world applications.
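The nearest correlation matrix problem mentioned above asks for the symmetric positive semidefinite matrix with unit diagonal that is closest, in the Frobenius norm, to a given symmetric matrix. As a rough illustration of the problem setting only (not the accelerated solver developed in the thesis), the classical alternating-projections method with Dykstra's correction can be sketched as follows; the function name and the 3x3 test matrix are illustrative choices:

```python
import numpy as np

def nearest_correlation(A, iters=200, tol=1e-8):
    """Alternating projections with Dykstra's correction (Higham-style
    sketch): alternately project onto the PSD cone and onto the set of
    matrices with unit diagonal."""
    Y = A.copy()
    dS = np.zeros_like(A)                # Dykstra correction for the PSD step
    for _ in range(iters):
        R = Y - dS
        # Project onto the PSD cone: clip negative eigenvalues.
        w, Q = np.linalg.eigh((R + R.T) / 2)
        X = Q @ np.diag(np.maximum(w, 0)) @ Q.T
        dS = X - R
        # Project onto matrices with unit diagonal.
        Y_new = X.copy()
        np.fill_diagonal(Y_new, 1.0)
        if np.linalg.norm(Y_new - Y, 'fro') < tol:
            Y = Y_new
            break
        Y = Y_new
    return Y

# Example: repair an indefinite "correlation" matrix (its smallest
# eigenvalue is 1 - sqrt(2) < 0).
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])
C = nearest_correlation(A)
```

Each sweep clips negative eigenvalues and then restores the unit diagonal; the dual-problem quasi-Newton acceleration studied in the thesis targets the same problem from a different angle.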

Various methods are available to handle these computational challenges. We focus on a family of quasi-Newton methods, a class of iterative techniques for solving these problems. We demonstrate their advantages over traditional optimization methods, such as the steepest descent and Newton's method. Quasi-Newton methods only use gradient evaluations to approximate the Hessian matrix, which encodes second-order information about the objective function. This approximation significantly reduces computational cost and complexity, making quasi-Newton methods particularly appealing for large-scale problems where calculating the exact Hessian is impractical or impossible.
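The gradient-only Hessian approximation described above can be illustrated with a minimal BFGS-type iteration (a generic sketch, not the specific family proposed in the thesis); the function names and the quadratic test problem are invented for illustration:

```python
import numpy as np

def quasi_newton(f, grad, x0, iters=100, tol=1e-8):
    """Minimal BFGS sketch: only f and its gradient are evaluated;
    H approximates the *inverse* Hessian."""
    n = x0.size
    H = np.eye(n)                        # initial inverse-Hessian guess
    x, g = x0.astype(float), grad(x0)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                       # quasi-Newton search direction
        # Armijo backtracking line search
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                   # curvature guard keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Illustrative test problem: minimize 0.5 x^T A x - b^T x, whose
# gradient is A x - b, so the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = quasi_newton(lambda x: 0.5 * x @ A @ x - b @ x,
                      lambda x: A @ x - b,
                      np.zeros(2))
```

Note that the loop never evaluates a Hessian: the pairs (s, y) of step and gradient-change alone drive the curvature update, which is the point the abstract makes about reduced cost.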

Using secant-like diagonal matrix approximations, quasi-Newton methods provide efficient solutions for various optimization problems, demonstrating their effectiveness across diverse scenarios. These methods also exhibit global convergence properties under suitable conditions.
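A secant-like diagonal approximation, as mentioned above, can be illustrated by the least-change diagonal update that enforces the weak secant condition s^T D s = s^T y; this is one simple variant from the diagonal quasi-Newton literature, not necessarily the exact SLDA/SDAJ update developed in the thesis:

```python
import numpy as np

def diagonal_update(d, s, y):
    """Least-change update of a diagonal Hessian approximation D = diag(d)
    so that the weak secant condition s^T D s = s^T y holds exactly."""
    s2 = s * s
    denom = s2 @ s2                      # sum of s_i^4
    if denom < 1e-16:                    # degenerate step: keep old diagonal
        return d
    return d + ((s @ y) - (s2 @ d)) / denom * s2

# Check the weak secant condition on arbitrary (illustrative) data.
s = np.array([1.0, 2.0, -1.0])
y = np.array([3.0, 1.0, 0.5])
d_new = diagonal_update(np.ones(3), s, y)
```

Storing only a diagonal keeps the cost per iteration at O(n), which is what makes such approximations attractive for large-scale problems.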
Keywords (Chinese) ★ nonlinear
★ problem
★ method
★ Newton method
★ role
Keywords (English) ★ Newton method
★ nonlinear
★ least squares
★ matrix
★ correlation
Table of contents
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Introduction of the problems . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Contribution and organization of the thesis . . . . . . . . . . . . . . . . . 3
2 Review of unconstrained optimization and algorithms . . . . . . . . . . . . 6
2.1 Definition of unconstrained optimization problem . . . . . . . . . . . . . 6
2.2 Gradient, Hessian, and Jacobian . . . . . . . . . . . . . . . . . . . . . . 7
2.3 Taylor’s theorem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.4 Conditions for optimality . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.5 Algorithms for unconstrained optimization problem: Linesearch-type methods . . . . . . 9
2.5.1 Search direction choices . . . . . . . . . . . . . . . . . . . . . . 10
2.5.2 Stepsize selections . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.5.3 Armijo-backtracking linesearch . . . . . . . . . . . . . . . . . . 12
2.5.4 Convergence of linesearch-type methods . . . . . . . . . . . . . 12
2.5.5 Rate of convergence . . . . . . . . . . . . . . . . . . . . . . . . 14
3 A new structured quasi-Newton method for nonlinear least squares problems 16
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.2 A SQN method-based SLDA strategy . . . . . . . . . . . . . . . . . . . 18
3.3 Convergence analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
3.4 Numerical results and discussions . . . . . . . . . . . . . . . . . . . . . 25
3.4.1 Numerical experimental setup . . . . . . . . . . . . . . . . . . . 25
3.4.2 Comparison metric: Performance profiles . . . . . . . . . . . . . 27
3.4.3 Discussions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
4 Nonmonotone quasi-Newton method for symmetric nonlinear equations . . 33
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
4.2 A QN-SDAJ strategy . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.3 Convergence analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
4.4 Numerical results and discussions . . . . . . . . . . . . . . . . . . . . . 44
4.4.1 Baseline methods . . . . . . . . . . . . . . . . . . . . . . . . . . 45
4.4.2 Test problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
4.4.3 Numerical experimental setup . . . . . . . . . . . . . . . . . . . 47
4.4.4 Nonmonotone vs. monotone linesearch for QN-SDAJ . . . . . . 48
4.4.5 Performance comparison of four methods . . . . . . . . . . . . . 48
5 A new quasi-Newton acceleration for the nearest correlation matrix problems 52
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
5.2 A review of dual problem-based approaches . . . . . . . . . . . . . . . . 55
5.2.1 Accelerated gradient descent method . . . . . . . . . . . . . . . 55
5.2.2 Inexact Jacobian-free semismooth inexact Newton methods . . . 56
5.2.3 A QN-SDAJ strategy finding the critical points for the nearest correlation matrix problems . . . . . . 59
5.3 Proposed methods addressing dual problem: A combination of QN-SDAJ and AGD . . . . . . 59
5.4 Numerical experiment . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
5.4.1 Solving the primal nearest correlation matrix problem by alternating projection method . . . . . . 62
5.4.2 Numerical experimental setup . . . . . . . . . . . . . . . . . . . 63
5.4.3 Comparison of the primal and dual approaches in terms of the norm of errors . . . . . . 64
5.4.4 AGD-SDAJ and the other dual-approach solvers . . . . . . . . . 64
5.4.5 Overall performance . . . . . . . . . . . . . . . . . . . . . . . . 65
6 Conclusions and future work . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Advisor: Hwang Feng Nan (黃楓南)    Date of review: 2024-7-23
