Alternatives to gradient-based optimization algorithms like lsqnonlin that can handle more than O(10) parameters?

Views: 18 (last 30 days)
I am fitting parameters of a PDE to experimental data using the function lsqnonlin from the Optimization Toolbox.
I observed that I can successfully optimize 10 to 20 parameters, but not more. My objective, however, is to optimize 50 to 100 parameters.
Based on my experience with lsqnonlin (and fmincon and friends), it seems that this class of optimization algorithms handles a small number of parameters (10 to 20) well, but is no longer appropriate when there are many more.
Fast convergence and computation time are not important to me.
I come from an engineering background and do not have deep knowledge of other classes of optimization algorithms. That said, I would appreciate it if someone could give me some keywords or recommendations for alternative optimization algorithms that are designed to handle a larger number of parameters.
Thank you!
18 Comments
Matt J
Matt J on 26 Feb 2023
Edited: Matt J on 26 Feb 2023
You could do that, or even have c(i) = (p(i-1)-2p(i)+p(i+1))^200. However, with larger exponents, the gradient of the cost function becomes small over a wider neighborhood of c=0. Smaller gradients mean slower convergence, and also increase the chance that the OptimalityTolerance triggers prematurely. Conversely, with smaller exponents (but still greater than 1), the gradient becomes less smooth. So, there is a trade-off to be reckoned with.
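For intuition, a small illustrative sketch of this trade-off: plotting the penalty |d|^k and its gradient for a few exponents makes the flattening near zero visible (d stands in for the second difference p(i-1)-2*p(i)+p(i+1); the values of k are arbitrary).
% Purely illustrative: the penalty |d|^k has gradient magnitude k*|d|^(k-1),
% which is nearly flat over a wide neighborhood of d = 0 when k is large.
d = linspace(-1.5, 1.5, 1001);
figure;
for k = [2 20 200]
    subplot(2,1,1); plot(d, abs(d).^k);       hold on;   % penalty value
    subplot(2,1,2); plot(d, k*abs(d).^(k-1)); hold on;   % gradient magnitude
end
subplot(2,1,1); title('penalty |d|^k');                 legend('k = 2','k = 20','k = 200');
subplot(2,1,2); title('gradient magnitude k|d|^{k-1}'); legend('k = 2','k = 20','k = 200');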
SA-W
SA-W on 6 Mar 2023
Edited: SA-W on 6 Mar 2023
I implemented
r = [weight1*r; weight2*constraint_violation];
f = ||r||^2
with lsqnonlin and found that
weight2 = 1e4   % approximately
is necessary for the inequality constraints to be considered at all at intermediate iterations: with weight2 below 1e4 the constraint violation is not sufficiently taken into account, and above 1e4 it dominates the residual r.
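In other words, the setup looks roughly like the following minimal, self-contained sketch; pdeResiduals and constraintViolation are toy stand-ins for my actual PDE misfit and constraint functions.
% Sketch of the weighted, penalty-augmented residual for lsqnonlin.
A = -eye(5);  b = zeros(5,1);                  % toy constraints: p >= 0
pData = linspace(1, 2, 5).';                   % toy "measurements"
pdeResiduals        = @(p) p - pData;          % placeholder for the real PDE misfit
constraintViolation = @(p) max(0, A*p - b);    % zero wherever A*p <= b holds (only piecewise smooth)
weight1 = 1;
weight2 = 1e4;
augmentedResidual = @(p) [weight1*pdeResiduals(p); weight2*constraintViolation(p)];
% lsqnonlin minimizes the sum of squares of the stacked vector, i.e.
% f = ||weight1*r||^2 + ||weight2*constraint_violation||^2
p0   = zeros(5,1);
opts = optimoptions('lsqnonlin', 'Display', 'off');
pFit = lsqnonlin(augmentedResidual, p0, [], [], opts);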
If I implement the above inequality constraints in fmincon and multiply A*x<=b by 1e4 (this multiplication is necessary for fmincon to pay attention to the constraints), fmincon gives a better solution than lsqnonlin. By better, I mean that the solution is almost exactly equal to the exact solution. The lsqnonlin solution fulfills the constraints in A, but it is not as accurate.
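For reference, the fmincon setup I am comparing against is roughly the following sketch (same toy stand-ins as above; scale is the 1e4 factor just mentioned).
% Sketch of the fmincon comparison with scaled linear inequality constraints.
A = -eye(5);  b = zeros(5,1);  pData = linspace(1, 2, 5).';  p0 = zeros(5,1);
objective = @(p) sum((p - pData).^2);          % scalar sum-of-squares misfit
scale = 1e4;                                   % factor needed so the constraints are respected
opts  = optimoptions('fmincon', 'Algorithm', 'interior-point', 'Display', 'off');
pFit  = fmincon(objective, p0, scale*A, scale*b, [], [], [], [], [], opts);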
Anyway, fmincon requires two to three times as many iterations as lsqnonlin to find the solution. That said, I would like to stick with lsqnonlin because evaluating the objective function is quite expensive in my case.
I read the documentation on the interior-point algorithm. The way fmincon incorporates the constraints is based on a merit function, which is of course different from simply expanding the residual vector as I did.
Admittedly, I do not have enough background in optimization. Can you give me another recommendation (closer to the way fmincon incorporates the constraints) for implementing my constraints in lsqnonlin?
Thank you!


Accepted Answer

Matt J
Matt J on 24 Feb 2023
Edited: Matt J on 24 Feb 2023
"it seems that this class of optimization algorithms handles a small number of parameters (10 to 20) well, but is no longer appropriate when there are many more."
I am currently using lsqnonlin in optimization problems with 11796480 variables. It converges fine, though in my case the line search operations really do make it much slower than I would like. In any case, 20 variables is definitely not the upper limit.
"I am fitting parameters of a PDE to experimental data using the function lsqnonlin from the Optimization Toolbox."
Make sure you are following the guidelines here,
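Independent of the link, the lsqnonlin settings that often matter once the parameter count grows are supplying the Jacobian ('SpecifyObjectiveGradient') or at least its sparsity pattern ('JacobPattern'), so the solver does not have to finite-difference a dense Jacobian. A rough sketch with a toy stand-in objective:
% Toy stand-in problem: replace myResiduals with the real PDE misfit.
n  = 100;
target = linspace(0, 1, n).';
myResiduals = @(p) p - target;          % toy residual; its Jacobian is the identity
p0 = zeros(n, 1);
opts = optimoptions('lsqnonlin', ...
    'Algorithm',           'trust-region-reflective', ...
    'JacobPattern',        speye(n), ...
    'FunctionTolerance',   1e-10, ...
    'OptimalityTolerance', 1e-10, ...
    'Display',             'iter');
pFit = lsqnonlin(myResiduals, p0, [], [], opts);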
9 Comments
SA-W
SA-W on 1 Mar 2023
If someone comes across our discussion here: the algorithms in lsqnonlin can also recover from NaN.


More Answers (1)

Matt J
Matt J on 26 Feb 2023
Edited: Matt J on 26 Feb 2023
"The parameters that I am optimizing (the function values) have to represent a convex function in 1d."
Another strategy that I would suggest is to first fit a constrained curve to your function samples using this FEX download,
This tool will allow you both to constrain the fit to be convex (using the 'ConcaveUp' option) and to compute the derivatives of the fit. Once you have the curve fit and its derivatives, you can substitute them into your ODE to obtain conventional equations in the original set of unknown parameters. You can then solve these equations using lsqnonlin, or even a linear solver, depending on the form of your equations.
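As a rough, generic illustration of the same idea (this is not the FEX tool, just lsqlin with second-difference constraints): fit nodal values on a uniform grid to noisy samples while requiring all second differences to be nonnegative, which enforces convexity.
% Convex 1-D fit via lsqlin with nonnegative second differences (toy data).
x = linspace(0, 1, 30).';
ySamples = (x - 0.4).^2 + 0.02*randn(size(x));   % toy noisy convex data
n  = numel(x);
C  = eye(n);   d = ySamples;                     % least-squares target: y ~ samples
D2 = diff(eye(n), 2);                            % (D2*y)(i) = y(i) - 2*y(i+1) + y(i+2)
yFit = lsqlin(C, d, -D2, zeros(n-2, 1));         % -D2*y <= 0  <=>  convexity
% Derivative estimates of the fit, for substitution into the differential equation
h   = x(2) - x(1);
dy  = gradient(yFit, h);
d2y = gradient(dy, h);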
7 Comments
Matt J
Matt J on 6 Mar 2023
I don't know enough about finite element analysis to know why you cannot fit a spline to a finite element field.
SA-W
SA-W on 6 Mar 2023
Could you help me further with the issue raised in my new comment (in the comments under the question)?

