This thesis aims to develop an efficient solution algorithm for the unconstrained optimization problem and its special cases, including nonlinear least-squares problems and systems of nonlinear equations, which are of substantial importance in numerical optimization and have numerous practical applications. Additionally, by exploring the use of these methods in the nearest correlation matrix problem through numerical experiments, this study establishes a foundation for further research and for practical implementations in real-world applications.
Various methods are available to handle these computational challenges. We focus on a family of quasi-Newton methods, a class of iterative techniques for solving these problems, and demonstrate their advantages over traditional optimization methods such as steepest descent and Newton's method. Quasi-Newton methods use only gradient evaluations to approximate the Hessian matrix, which encodes second-order information about the objective function. This approximation significantly reduces computational cost and complexity, making quasi-Newton methods particularly appealing for large-scale problems where computing the exact Hessian is impractical or impossible.
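To make this idea concrete, the following is a minimal sketch of one quasi-Newton iteration built on the classical BFGS update of the inverse Hessian approximation; it is meant only as a representative member of the family, not as the specific scheme developed in this thesis. The fixed step size `alpha`, the curvature tolerance, and the placeholder gradient function `grad_f` are illustrative assumptions.

```python
import numpy as np

def bfgs_step(x, H, grad_f, alpha=1.0):
    """One quasi-Newton iteration with the classical BFGS update.

    H approximates the inverse Hessian using only gradient differences;
    no second derivatives of the objective are ever evaluated.
    """
    g = grad_f(x)
    p = -H @ g                 # quasi-Newton search direction
    x_new = x + alpha * p      # fixed step here; in practice a line search is used
    s = x_new - x              # displacement between iterates
    y = grad_f(x_new) - g      # corresponding change in the gradient
    sy = s @ y
    if sy > 1e-12:             # curvature condition; skip the update otherwise
        rho = 1.0 / sy
        I = np.eye(len(x))
        V = I - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse-Hessian update
    return x_new, H
```

Note that each iteration uses only two gradient evaluations and a matrix update; second derivatives never appear.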
Using secant-like diagonal matrix approximations, quasi-Newton methods provide efficient solutions for various optimization problems, demonstrating their effectiveness across diverse scenarios. These methods also exhibit global convergence properties under suitable conditions.
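As an illustration of a secant-like diagonal approximation, the sketch below updates a diagonal matrix D = diag(d) so that it satisfies the weak secant condition s^T D s = s^T y while remaining as close as possible to the previous diagonal in the Frobenius norm. This is one standard way to build such approximations; the positivity safeguard and the tolerance are assumptions added for the example, not part of the analysis in this thesis.

```python
import numpy as np

def diagonal_secant_update(d, s, y, eps=1e-12):
    """Update a diagonal Hessian approximation D = diag(d).

    The new diagonal is the closest one (in the Frobenius norm) to the
    current D that satisfies the weak secant condition s^T D s = s^T y.
    """
    s2 = s * s
    denom = np.dot(s2, s2)            # sum of s_i**4
    if denom < eps:
        return d                      # step too small; keep the current diagonal
    lam = (np.dot(s, y) - np.dot(s2, d)) / denom
    d_new = d + lam * s2
    return np.maximum(d_new, eps)     # safeguard to keep the diagonal positive
```

Because the approximation is diagonal, storing it and forming the resulting search direction require only O(n) work and memory per iteration, which is what makes this class attractive for large-scale problems.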