Practical Methods of Optimization
Info
This question has been closed. To edit or post an answer, reopen the question.
Could anyone share their MATLAB codes or best practices for these specific methods? I am particularly interested in how you handle the "narrow valley" convergence issue in the Coordinate Search method.
Codes for methods of optimization
One-Dimensional Methods
- Fibonacci Search Method
- Golden Section Search
- Dichotomous Search
- Newton’s Method
- Secant Method
- Quadratic Interpolation Method
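As a starting point for the one-dimensional bracketing methods listed above, here is a minimal golden section search sketch. The function name, the tolerance argument, and the assumption that `f` is unimodal on `[a, b]` are mine, not from the thread:

```matlab
% Golden section search for a unimodal f on [a, b].
% A minimal sketch; not production code.
function xmin = goldensection(f, a, b, tol)
    phi = (sqrt(5) - 1) / 2;          % inverse golden ratio, ~0.618
    c = b - phi * (b - a);
    d = a + phi * (b - a);
    while (b - a) > tol
        if f(c) < f(d)
            b = d;                    % minimum lies in [a, d]
        else
            a = c;                    % minimum lies in [c, b]
        end
        c = b - phi * (b - a);        % re-place interior points
        d = a + phi * (b - a);
    end
    xmin = (a + b) / 2;
end
```

Usage would look like `goldensection(@(x) (x - 2).^2, 0, 5, 1e-6)`, which should bracket the minimizer near x = 2. Re-evaluating `f(c)` and `f(d)` every pass wastes one function call per iteration; caching the retained value is the usual refinement.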
Multi-Dimensional Methods
- Univariate Method
- Hooke-Jeeves Pattern Search
- Nelder-Mead Simplex Method
- Rosenbrock Method
- Powell’s Conjugate Direction Method
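On the "narrow valley" issue raised in the question: plain coordinate (univariate) search stalls in narrow valleys because it can only move parallel to the axes. Hooke-Jeeves adds a pattern move that extrapolates along the direction of recent progress, i.e. roughly along the valley floor. A minimal sketch (the function names, the step-halving schedule, and the initial step `h` are my assumptions):

```matlab
% Hooke-Jeeves pattern search: exploratory moves along the axes,
% plus a pattern move that accelerates along the valley.
function x = hookejeeves(f, x0, h, tol)
    x = x0;
    while h > tol
        xe = explore(f, x, h);          % exploratory move around x
        while f(xe) < f(x)
            xp = xe + (xe - x);         % pattern move: extrapolate
            x = xe;
            xe = explore(f, xp, h);     % explore around the pattern point
        end
        h = h / 2;                      % no improvement: shrink the step
    end
end

function xb = explore(f, x, h)
    % Try +/- h along each coordinate, keeping any improvement.
    xb = x;
    for i = 1:numel(x)
        for s = [h, -h]
            xt = xb;
            xt(i) = xt(i) + s;
            if f(xt) < f(xb)
                xb = xt;
                break
            end
        end
    end
end
```

Trying this on the Rosenbrock function, e.g. `hookejeeves(@(x) (1-x(1))^2 + 100*(x(2)-x(1)^2)^2, [-1.2, 1], 0.5, 1e-6)`, shows the pattern moves making progress where pure coordinate search crawls.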
Gradient-Based Methods
One-Dimensional Gradient:
- Steepest Descent Line Search
- Newton-Raphson Method
Multi-Dimensional Gradient:
- Steepest Descent Method
- Fletcher-Reeves Conjugate Gradient Method
- Polak-Ribière Conjugate Gradient Method
- Newton’s Method in Optimization
- Davidon-Fletcher-Powell Method
- Broyden-Fletcher-Goldfarb-Shanno Method
- Sequential Quadratic Programming
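For the gradient-based group, here is a minimal steepest descent sketch with a backtracking (Armijo) line search. It assumes you supply the gradient `g` yourself; the Armijo constant `1e-4`, the halving step rule, and the stopping tests are conventional choices of mine:

```matlab
% Steepest descent with backtracking (Armijo) line search.
% f returns the objective; g returns its gradient as a column vector.
function x = steepestdescent(f, g, x0, tol, maxit)
    x = x0;
    for k = 1:maxit
        d = -g(x);                          % steepest descent direction
        if norm(d) < tol
            break                           % gradient small: done
        end
        t = 1;
        % Backtrack until the Armijo sufficient-decrease condition holds:
        % f(x + t*d) <= f(x) - 1e-4 * t * (d' * d)
        while t > 1e-12 && f(x + t*d) > f(x) - 1e-4 * t * (d.' * d)
            t = t / 2;
        end
        x = x + t * d;
    end
end
```

As Star Strider notes below the lists, a method like this converges only to a nearby local minimum, and on ill-conditioned valleys it zigzags; the conjugate gradient and quasi-Newton methods in the list exist precisely to fix that.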
10 Comments
Mark
2 May 2026 at 12:01
Star Strider
2 May 2026 at 12:26
Gradient descent methods are extremely sensitive to the initial parameter estimate and can get trapped in local minima. The more robust global methods search the entire parameter space for the best candidates; the gradient-based methods can then 'fine-tune' those initial results.
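One cheap way to get the global-then-local behavior described above is a crude multistart: run a local solver from several random starting points and keep the best result. A sketch using MATLAB's built-in `fminsearch` (Nelder-Mead); the number of starts, the search box, and the Rosenbrock test function are my choices:

```matlab
% Crude multistart: local Nelder-Mead from random starts, keep the best.
f = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;   % Rosenbrock test function
best = inf;
xbest = [];
rng(0);                                          % reproducible starting points
for k = 1:20
    x0 = 4 * rand(1, 2) - 2;                     % random start in [-2, 2]^2
    [x, fx] = fminsearch(f, x0);
    if fx < best
        best = fx;
        xbest = x;                               % keep the best local result
    end
end
```

For serious use, the Global Optimization Toolbox solvers are the more systematic route, but this pattern needs nothing beyond base MATLAB.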
Mark
3 May 2026 at 10:00
A large number of freely available codes on optimization are listed here:
Or you could visit File Exchange:
Mark
3 May 2026 at 13:37
Answers (0)