References
[1] A. N. Akansu and M. U. Torun. Toeplitz approximation to empirical correlation
matrix of asset returns: A signal processing perspective. IEEE J. Sel. Top. Signal
Process., 6:319–326, 2012.
[2] N. Andrei. An acceleration of gradient descent algorithm with backtracking for
unconstrained optimization. Numer. Algorithms, 42:63–73, 2006.
[3] N. Andrei. An unconstrained optimization test functions collection. Adv. Model.
Optim., 10:147–161, 2008.
[4] N. Andrei. A diagonal quasi-Newton updating method for unconstrained optimization. Numer. Algorithms, 81:575–590, 2019.
[5] N. Andrei. A new accelerated diagonal quasi-Newton updating method with scaled
forward finite differences directional derivative for unconstrained optimization.
Optim., 70:345–360, 2021.
[6] T. Barz, S. Körkel, and G. Wozny. Nonlinear ill-posed problem analysis in model-
based parameter estimation and experimental design. Comput. Chem. Eng., 77:24–
42, 2015.
[7] S. Bellavia and B. Morini. A globally convergent Newton-GMRES subspace method
for systems of nonlinear equations. SIAM J. Sci. Comput., 23:940–960, 2001.
[8] R. Borsdorf. A Newton Algorithm for the Nearest Correlation Matrix. Master’s
thesis, University of Manchester, 2007.
[9] R. Borsdorf, N. J. Higham, and M. Raydan. Computing a nearest correlation matrix
with factor structure. SIAM J. Matrix Anal. Appl., 31:2603–2622, 2010.
[10] R. Borsdorf and N. J. Higham. A preconditioned Newton algorithm for the nearest
correlation matrix. IMA J. Numer. Anal., 30:94–107, 2010.
[11] J. P. Boyd. A spectrally accurate quadrature for resolving the logarithmic endpoint
singularities of the Chandrasekhar H-function. J. Quant. Spectrosc. Radiat. Transf.,
94:467–475, 2005.
[12] S. Boyd and L. Vandenberghe. Introduction to Applied Linear Algebra: Vectors,
Matrices, and Least Squares. Cambridge University Press, Cambridge, 2018.
[13] W. L. Briggs, V. E. Henson, and S. F. McCormick. A Multigrid Tutorial. SIAM,
Philadelphia, 2000.
[14] P. R. Brune, M. G. Knepley, B. F. Smith, and X. M. Tu. Composing scalable
nonlinear algebraic solvers. SIAM Rev., 57:535–565, 2015.
[15] S. Buyrukoğlu and A. Akbaş. Machine learning based early prediction of type 2
diabetes: a new hybrid feature selection approach using correlation matrix with
heatmap and SFS. BAJECE, 10:110–117, 2022.
[16] R. H. Byrd and J. Nocedal. A tool for the analysis of quasi-Newton methods with
application to unconstrained minimization. SIAM J. Numer. Anal., 26:727–739,
1989.
[17] S. R. Cai and F. N. Hwang. A hybrid-line-and-curve search globalization technique
for inexact Newton methods. Appl. Numer. Math., 173:79–93, 2022.
[18] X. C. Cai and D. E. Keyes. Nonlinearly preconditioned inexact Newton algorithms.
SIAM J. Sci. Comput., 24:183–200, 2002.
[19] W. Y. Cheng and Z. X. Chen. Nonmonotone spectral method for large-scale sym-
metric nonlinear equations. Numer. Algorithms, 62:149–162, 2013.
[20] A. Cornelio. Regularized nonlinear least squares methods for hit position recon-
struction in small gamma cameras. Appl. Math. Comput., 217:5589–5595, 2011.
[21] J. E. Dennis, H. J. Martinez, and R. A. Tapia. Convergence theory for the structured
BFGS secant method with an application to nonlinear least squares. J. Optim.
Theory Appl., 61:161–178, 1989.
[22] J. E. Dennis and J. J. Moré. A characterization of superlinear convergence and its
application to quasi-Newton methods. Math. Comput., 28:549–560, 1974.
[23] J. E. Dennis Jr. Some computational techniques for the nonlinear least squares
problem. In Numerical solution of systems of nonlinear algebraic equations, pages
157–183. Academic Press, 1973.
[24] J. E. Dennis Jr and R. B. Schnabel. Numerical Methods for Unconstrained Opti-
mization and Nonlinear Equations. SIAM, Philadelphia, 1996.
[25] F. Deutsch. The method of alternating orthogonal projections. In Approximation
theory, spline functions and applications, pages 105–121. Springer, 1992.
[26] E. D. Dolan and J. J. Moré. Benchmarking optimization software with performance
profiles. Math. Program., 91:201–213, 2002.
[27] A. E. Duran-Pinedo, B. Paster, R. Teles, and J. Frias-Lopez. Correlation network
analysis applied to complex biofilm communities. PLOS ONE, 6:e28438, 2011.
[28] A. Dutta, E. H. Bergou, Y. M. Xiao, M. Canini, and P. Richtárik. Direct nonlinear
acceleration. EURO J. Comput. Optim., 10:100047, 2022.
[29] R. L. Dykstra. An algorithm for restricted least squares regression. J. Am. Stat.
Assoc., 78:837–842, 1983.
[30] P. Embrechts, A. McNeil, and D. Straumann. Correlation and dependence in risk
management: properties and pitfalls. Risk Manag.: VaR Beyond, 1:176–223, 2002.
[31] D. Fan, S. Li, X. Li, J. Yang, and X. Wan. Seafloor topography estimation from
gravity anomaly and vertical gravity gradient using nonlinear iterative least square
method. Remote Sens., 13:64, 2020.
[32] A. I. Fedoseyev, M. J. Friedman, and E. J. Kansa. Continuation for nonlinear elliptic
partial differential equations discretized by the multiquadric method. Int. J. Bifurcat.
Chaos, 10:481–492, 2000.
[33] R. Fletcher and C. Xu. Hybrid methods for nonlinear least squares. IMA J. Numer.
Anal., 7:371–389, 1987.
[34] D. I. Georgescu, N. J. Higham, and G. W. Peters. Explicit solutions to correlation
matrix completion problems, with an application to risk management and insurance.
R. Soc. Open Sci., 5:172348, 2018.
[35] G. Golub and V. Pereyra. Separable nonlinear least squares: the variable projection
method and its applications. Inverse Probl., 19:R1, 2003.
[36] M. A. Gomes-Ruggiero, J. M. Martínez, and A. C. Moretti. Comparing algorithms
for solving sparse nonlinear systems of equations. SIAM J. Sci. Statist. Comput.,
13:459–483, 1992.
[37] S. Gratton, A.S. Lawless, and N. K. Nichols. Approximate Gauss–Newton methods
for nonlinear least squares problems. SIAM J. Optim., 18:106–132, 2007.
[38] L. Grippo, F. Lampariello, and S. Lucidi. A nonmonotone line search technique for
Newton’s method. SIAM J. Numer. Anal., 23:707–716, 1986.
[39] G. Z. Gu, D. H. Li, L. Qi, and S. Z. Zhou. Descent directions of quasi-Newton
methods for symmetric nonlinear equations. SIAM J. Numer. Anal., 40:1763–1774,
2002.
[40] P. C. Hansen, V. Pereyra, and G. Scherer. Least Squares Data Fitting with Applica-
tions. Johns Hopkins University Press, Baltimore, 2013.
[41] H. O. Hartley. The modified Gauss-Newton method for the fitting of non-linear
regression functions by least squares. Technometrics, 3:269–280, 1961.
[42] S. Henn. A Levenberg–Marquardt scheme for nonlinear image registration. BIT
Numer. Math., 43:743–759, 2003.
[43] N. J. Higham and N. Strabić. Anderson acceleration of the alternating projections
method for computing the nearest correlation matrix. Numer. Algorithms, 72:1021–
1042, 2016.
[44] N. J. Higham. Computing the nearest correlation matrix—a problem from finance.
IMA J. Numer. Anal., 22:329–343, 2002.
[45] J. Huschens. On the use of product structure in secant methods for nonlinear least
squares problems. SIAM J. Optim., 4:108–129, 1994.
[46] D. Q. Huynh and F. N. Hwang. An accelerated structured quasi-Newton method
with a diagonal second-order Hessian approximation for nonlinear least squares
problems. J. Comput. Appl. Math., 442:115718, 2024.
[47] F. N. Hwang, Y. C. Su, and X. C. Cai. A parallel adaptive nonlinear elimination
preconditioned inexact Newton method for transonic full potential equation. Comput.
Fluids, 110:96–107, 2015.
[48] C. T. Kelley. Solution of the Chandrasekhar H-equation by Newton’s method. J.
Math. Phys., 21:1625–1628, 1980.
[49] C. T. Kelley. Iterative Methods for Optimization. SIAM, Philadelphia, 1999.
[50] D. Kincaid, D. R. Kincaid, and E. W. Cheney. Numerical Analysis: Mathematics of Scientific Computing, volume 2. American Mathematical Society, Providence, 2009.
[51] M. Kommenda, B. Burlacu, G. Kronberger, and M. Affenzeller. Parameter identifi-
cation for symbolic regression using nonlinear least squares. Genet. Program Evol.
M., 21:471–501, 2020.
[52] S. Kumar and N. Deo. Correlation and network analysis of global financial indices.
Phys. Rev. E, 86:026101, 2012.
[53] D. Lee and H. S. Seung. Algorithms for non-negative matrix factorization. Adv.
Neur. Inf. Process. Syst., 13, 2000.
[54] Y. C. Lee, G. Doolen, H. H. Chen, G. Z. Sun, T. Maxwell, and H. Y. Lee. Machine
learning using a higher order correlation network. Physica D, 22:276–306, 1986.
[55] W. J. Leong, M. A. Hassan, and M. Y. Waziri. A matrix-free quasi-Newton method
for solving large-scale nonlinear systems. Comput. Math. Appl., 62:2354–2363,
2011.
[56] K. Levenberg. A method for the solution of certain non-linear problems in least
squares. Q. Appl. Math., 2:164–168, 1944.
[57] D. H. Li and M. Fukushima. A modified BFGS method and its global convergence
in nonconvex minimization. J. Comput. Appl. Math., 129:15–35, 2001.
[58] D. H. Li and M. Fukushima. A globally and superlinearly convergent Gauss–
Newton-based BFGS method for symmetric nonlinear equations. SIAM J. Numer.
Anal., 37:152–172, 1999.
[59] D. H. Li and X. L. Wang. A modified Fletcher–Reeves-type derivative-free method for symmetric nonlinear equations. Numer. Algebra Control Optim., 1:71, 2011.
[60] Q. N. Li, H. D. Qi, and N. H. Xiu. Block relaxation and majorization methods for the
nearest correlation matrix with factor structure. Comput. Optim. Appl., 50:327–349,
2011.
[61] X. R. Li, X. L. Wang, and X. B. Duan. A limited memory BFGS method for solving
large-scale symmetric nonlinear equations. In Abstr. Appl. Anal., volume 2014.
Hindawi, 2014.
[62] J. K. Liu and Y. M. Feng. A norm descent derivative-free algorithm for solving
large-scale nonlinear symmetric equations. J. Comput. Appl. Math., 344:89–99,
2018.
[63] L. Lukšan. Hybrid methods for large sparse nonlinear least squares. J. Optim.
Theory Appl., 89:575–595, 1996.
[64] L. Lukšan, C. Matonoha, and J. Vlček. Problems for nonlinear least squares and nonlinear equations. Technical report, Institute of Computer Science, Academy of Sciences of the Czech Republic, 2018.
[65] L. Lukšan, C. Matonoha, and J. Vlček. Hybrid methods for nonlinear least squares problems. Ústav informatiky, Pod vodárenskou věží, 2:07, 2019.
[66] D. W. Marquardt. An algorithm for least-squares estimation of nonlinear parameters.
J. Soc. Indust. Appl. Math., 11:431–441, 1963.
[67] H. Mohammad, M. Y. Waziri, and S. A. Santos. A brief survey of methods for solving nonlinear least-squares problems. Numer. Algebra Control Optim., 9:1, 2019.
[68] J. J. Moré. The Levenberg-Marquardt algorithm: implementation and theory. In
Numerical Analysis, pages 105–116. Springer, 1978.
[69] J. J. Moré, B. S. Garbow, and K. E. Hillstrom. Testing unconstrained optimization
software. ACM Trans. Math. Software, 7:17–41, 1981.
[70] B. Morini, M. Porcelli, and P. L. Toint. Approximate norm descent methods for
constrained nonlinear systems. Math. Comput., 87:1327–1351, 2018.
[71] J. L. Nazareth. An adaptive method for minimizing a sum of squares of nonlinear
functions. Preprint, 1983.
[72] Y. Nesterov. A method of solving a convex programming problem with convergence
rate O(1/k²). Doklady Akademii Nauk SSSR, 269:543, 1983.
[73] J. Nocedal and S. J. Wright. Numerical Optimization. Springer, New York, 1999.
[74] H. Ogasawara. A note on the equivalence of a class of factorized Broyden families
for nonlinear least squares problems. Linear Algebra Appl., 297:183–191, 1999.
[75] A. Papini, M. Porcelli, and C. Sgattoni. On the global convergence of a new spectral
residual algorithm for nonlinear systems of equations. Boll. Unione Mat. Ital.,
14:367–378, 2021.
[76] H. D. Qi and D. F. Sun. A quadratically convergent Newton method for computing
the nearest correlation matrix. SIAM J. Matrix Anal. Appl., 28:360–385, 2006.
[77] A. N. Riseth. Objective acceleration for unconstrained optimization. Numer. Linear
Algebra Appl., 26:e2216, 2019.
[78] R. T. Rockafellar and R. J. B. Wets. Variational Analysis. Springer, Berlin, 2009.
[79] R. Scitovski. A special nonlinear least-squares problem. J. Comput. Appl. Math.,
53:323–331, 1994.
[80] S. B. Sheng and Z. H. Zou. A new secant method for nonlinear least squares
problems. Numer. Math. J. Chin. Univ., 2:125–137, 1993.
[81] Z. J. Shi. Convergence of line search methods for unconstrained optimization. Appl.
Math. Comput., 157:393–405, 2004.
[82] A. Sofi, M. Mamat, S. Z. Mohid, M. A. H. Ibrahim, and N. Khalid. Performance
profile comparison using Matlab. In Proceedings of International Conference on
Information Technology & Society, 2015.
[83] H. De Sterck. Steepest descent preconditioning for nonlinear GMRES optimization.
Numer. Linear Algebra Appl., 20:453–471, 2013.
[84] C. C. Took, S. C. Douglas, and D. P. Mandic. On approximate diagonalization of
correlation matrices in widely linear signal processing. IEEE Trans. Signal Process.,
60:1469–1473, 2011.
[85] M. K. Transtrum and J. P. Sethna. Improvements to the Levenberg-Marquardt
algorithm for nonlinear least-squares minimization. arXiv preprint arXiv:1201.5885,
2012.
[86] M. Y. Waziri, W. J. Leong, M. A. Hassan, and M. Monsi. A new Newton’s method
with diagonal Jacobian approximation for systems of nonlinear equations. J. Math.
Stat., 6:246–252, 2010.
[87] H. Yabe and N. Yamaki. Convergence of a factorized Broyden-like family for
nonlinear least squares problems. SIAM J. Optim., 5:770–791, 1995.
[88] H. J. Yang and F. N. Hwang. An adaptive nonlinear elimination preconditioned
inexact Newton algorithm for highly local nonlinear multicomponent PDE systems.
Appl. Numer. Math., 133:100–115, 2018.
[89] J. F. Yin and Y. M. Huang. Modified multiplicative update algorithms for computing
the nearest correlation matrix. Int. J. Appl. Math. Informat., 30:201–210, 2012.
[90] J. F. Yin and Y. Zhang. Alternative gradient algorithms for computing the nearest
correlation matrix. Appl. Math. Comput., 219:7591–7599, 2013.
[91] G. Yuan and S. Yao. A BFGS algorithm for solving symmetric nonlinear equations.
Optim., 62:85–99, 2013.
[92] G. L. Yuan and X. R. Li. A rank-one fitting method for solving symmetric nonlinear
equations. J. Appl. Funct. Anal., 5:389–407, 2010.
[93] G. L. Yuan and X. W. Lu. A new backtracking inexact BFGS method for symmetric
nonlinear equations. Comput. Math. Appl., 55:116–129, 2008.
[94] G. L. Yuan, X. W. Lu, and Z. X. Wei. BFGS trust-region method for symmetric
nonlinear equations. J. Comput. Appl. Math., 230:44–58, 2009.
[95] G. L. Yuan, S. D. Meng, and Z. X. Wei. A trust-region-based BFGS method with line
search technique for symmetric nonlinear equations. Adv. Oper. Res., 2009:909753,
2009.
[96] Y. X. Yuan. Recent advances in numerical methods for nonlinear equations and
nonlinear least squares. Numer. Algebra Control Optim., 1:15–34, 2011.
[97] N. Zhan and J. R. Kitchin. Uncertainty quantification in machine learning and
nonlinear least squares regression models. AIChE J., 68:e17516, 2022.
[98] J. Z. Zhang, L. H. Chen, and N. Y. Deng. A family of scaled factorized Broyden-like
methods for nonlinear least squares problems. SIAM J. Optim., 10:1163–1179,
2000.
[99] Q. B. Zhao, L. Q. Zhang, and A. Cichocki. Multilinear and nonlinear generalizations
of partial least squares: an overview of recent advances. Wiley Interdiscip. Rev. Data
Min. Knowl. Discov., 4:104–115, 2014.
[100] W. Zhou. On the convergence of the modified Levenberg–Marquardt method with
a nonmonotone second order Armijo type line search. J. Comput. Appl. Math.,
239:152–161, 2013.
[101] W. Zhou and X. Chen. Global convergence of a new hybrid Gauss–Newton struc-
tured BFGS method for nonlinear least squares problems. SIAM J. Optim., 20:2422–
2441, 2010.
[102] W. J. Zhou. A modified BFGS type quasi-Newton method with line search for
symmetric nonlinear equations problems. J. Comput. Appl. Math., 367:112454,
2020.
[103] W. J. Zhou. A globally convergent BFGS method for symmetric nonlinear equations.
J. Ind. Manag. Optim., 18:1295, 2022.
[104] W. J. Zhou and D. H. Li. A globally convergent BFGS method for nonlinear
monotone equations without any merit functions. Math. Comput., 77:2231–2240,
2008.
[105] W. J. Zhou and D. M. Shen. An inexact PRP conjugate gradient method for
symmetric nonlinear equations. Numer. Funct. Anal. Optim., 35:370–388, 2014.
[106] W. J. Zhou and D. M. Shen. Convergence properties of an iterative method for
solving symmetric nonlinear equations. J. Optim. Theory Appl., 164:277–289, 2015.
[107] X. J. Zhu. A feasible filter method for the nearest low-rank correlation matrix
problem. Numer. Algorithms, 69:763–784, 2015.