Issue with large memory required for non-linear optimizer

Views: 1 (last 30 days)
Yannis Stamatiou on 18 April 2023
Commented: Yannis Stamatiou on 20 April 2023
Dear MATLAB community,
I tried to run the following optimization problem for a 2-dimensional optimization variable of size 150x150. For some reason, somewhere in the optimization process (I guess) MATLAB creates a matrix of size (150^2)x(150^2). I have been trying to resolve this for several days now (with the different options shown in the comments), but I cannot understand why MATLAB builds such a huge matrix during the solution process. Is there perhaps some other nonlinear optimizer in MATLAB that does not require such huge matrices? Any help on this issue would be much appreciated.
With best wishes,
Yannis
a = 4;
b = 2.1;
c = 4;
x = optimvar('x',150,150);                % 150x150 = 22,500 optimization variables
prob = optimproblem;
prob.Objective = parameterfun(x,a,b,c);   % parameterfun is defined elsewhere (not shown here)
% Option sets I have already tried:
%opts = optimoptions('fmincon','Algorithm','interior-point','SpecifyObjectiveGradient',true,'HessianFcn','objective');
%opts = optimoptions('quadprog','Algorithm','trust-region-reflective','Display','off');
opts = optimoptions('fminunc','Algorithm','trust-region');
opts.HessianApproximation = 'lbfgs';
opts.SpecifyObjectiveGradient = false;
x0.x = 0.5 * ones([150,150]);             % initial point for the optimization variable x
%[sol,qfval,qexitflag,qoutput] = solve(prob,x0,'options',opts);
[sol,fval] = solve(prob,x0)               % note: opts is not passed in this call
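As a rough back-of-the-envelope sketch (an assumption about where the size comes from, not confirmed by the solver output): with 150*150 = 22,500 scalar unknowns, any step that forms a dense Hessian or quadratic-form matrix over all of them needs a 22,500-by-22,500 array.
nvars = 150*150;                        % 22,500 scalar unknowns
bytes = nvars^2 * 8;                    % dense 22,500-by-22,500 matrix of doubles
fprintf('Dense Hessian: about %.1f GB\n', bytes/2^30)   % roughly 3.8 GB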
  3 Comments
Yannis Stamatiou on 20 April 2023
Moved: John D'Errico on 20 April 2023
Hi,
thank you all for your replies!
With best wishes,
Yannis
Torsten on 20 April 2023
Moved: John D'Errico on 20 April 2023
And what did you decide to do?


Accepted Answer

Alan Weiss on 19 April 2023
You have 150^2 optimization variables. I do not see your parameterfun function, but if it is not a supported function for automatic differentiation, then fminunc cannot use the 'trust-region' algorithm because that algorithm requires a gradient function. The LBFGS Hessian approximation is not supported in the 'quasi-newton' algorithm. Sorry.
Alan Weiss
MATLAB mathematical toolbox documentation
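As a minimal sketch of the gradient point (a toy objective, not the original parameterfun, which was not posted): if the objective function returns its gradient and SpecifyObjectiveGradient is true, fminunc can run the 'trust-region' algorithm.
x0 = 0.5*ones(150,150);
opts = optimoptions('fminunc', ...
    'Algorithm','trust-region', ...
    'SpecifyObjectiveGradient',true);
[xsol,fval] = fminunc(@myObjective, x0, opts);

function [f,g] = myObjective(x)
% Hypothetical smooth objective with an analytic gradient,
% standing in for the poster's parameterfun (not posted).
f = sum(x(:).^2);            % objective value
if nargout > 1
    g = 2*x;                 % gradient, same shape as x
end
end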
  6 Comments
Bruno Luong on 20 April 2023
@Yannis Stamatiou " I cannot figure out exactly why"
The L-BFGS formula approximates the inverse of the Hessian by a low-rank update and does not require storing the full Hessian or its inverse.
That is why the memory requirement is reduced and the method is suitable for large-scale problems.
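For reference, a minimal sketch of that idea (the textbook L-BFGS two-loop recursion, not MATLAB's internal implementation): the approximate inverse Hessian is applied to the gradient using only the m most recent (s, y) vector pairs, so no n-by-n matrix is ever formed.
function d = lbfgs_direction(g, S, Y)
% Two-loop recursion: returns d = -H*g, where H is the L-BFGS
% approximation of the inverse Hessian built from the stored pairs.
% g : current gradient, n-by-1
% S : n-by-m matrix of recent steps,          S(:,i) = x_{i+1} - x_i
% Y : n-by-m matrix of gradient differences,  Y(:,i) = g_{i+1} - g_i
% Columns are ordered oldest (1) to newest (m).
m   = size(S,2);
rho = 1 ./ sum(Y.*S, 1);              % rho_i = 1/(y_i'*s_i)
alp = zeros(1,m);
q   = g;
for i = m:-1:1                        % first loop: newest to oldest
    alp(i) = rho(i) * (S(:,i)' * q);
    q      = q - alp(i) * Y(:,i);
end
gam = (S(:,m)'*Y(:,m)) / (Y(:,m)'*Y(:,m));   % scaling of the initial Hessian
r   = gam * q;
for i = 1:m                           % second loop: oldest to newest
    bet = rho(i) * (Y(:,i)' * r);
    r   = r + (alp(i) - bet) * S(:,i);
end
d = -r;                               % quasi-Newton search direction
end
The storage is 2*m*n numbers (m is typically between 5 and 20) instead of the n^2 needed for a dense Hessian.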
Yannis Stamatiou on 20 April 2023
Hi,
I now understand why it worked. In fact, it is not easy to see what the best approach is for large optimization problems; I was a bit lost in the documentation with the many optimization options, algorithms, and settings. However, all the replies to my post, and yours in particular explaining why the method worked, were very helpful. Thanks!
Yannis


More Answers (0)
