Single iteration with lsqnonlin (or fsolve), only compute new X0

Sargondjani on 29 Jun 2023
Commented: Sargondjani on 30 Jun 2023
I want lsqnonlin (or fsolve) to carry out only one iteration, i.e. compute the new X, and then stop, with no further function evaluations.
Ideally I don't even want it to compute the new value of the objective function, and I definitely do not want extra function evaluations for the Jacobian or the first-order optimality conditions at the new guess for X.
(My question is similar to an earlier question of mine:
... but now function evaluations are even more expensive, and I want to use lsqnonlin, so I also don't know how to update X myself (which is easy for a Newton-Raphson step if you know the Jacobian). The suggestions made there therefore don't help me in this case.)
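For concreteness, the obvious approach would be to cap the iteration count, something like this sketch (myResidual and x0 are placeholders for my actual residual function and starting point):

% Sketch: cap lsqnonlin at a single iteration.
opts = optimoptions('lsqnonlin', ...
    'SpecifyObjectiveGradient', true, 'MaxIterations', 1);
x1 = lsqnonlin(@myResidual, x0, [], [], opts);
% As far as I can tell, this still evaluates the residual at the new
% point to test convergence, which is exactly what I want to avoid.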
5 Comments
Torsten on 29 Jun 2023
But you lose all information about the Jacobian at the iteration point, and recomputing it in the next call to lsqnonlin will take much more effort than continuing the iterations.
Sargondjani on 29 Jun 2023
@Torsten I use projection methods, and I want to update the grid before making the next iteration, so all information gathered at the new X with the old grid could be useless (especially if the step in X is relatively large). I first want to update the grid, and only then do any further evaluations.
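The outer loop I have in mind is roughly this (a sketch; updateGrid, buildResidual, and singleStep are placeholders for my projection-method code and for the single-iteration mechanism I am asking about):

x = x0;
for k = 1:maxOuter
    grid  = updateGrid(grid, x);    % refit the grid around the current X
    resid = buildResidual(grid);    % residual function on the updated grid
    x     = singleStep(resid, x);   % one lsqnonlin step, no further evaluations
end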
May I conclude it is not possible? Or at least not with a simple command?


Accepted Answer

Matt J on 30 Jun 2023
Edited: Matt J on 30 Jun 2023
This seems to be a feasible workaround. The important thing to realize is that even though the iterative display reports Func-count = 2, the objective function does no significant work after the first call, because the externally scoped stopflag has been raised by that point.
doOptimization()
                                        Norm of      First-order
 Iteration  Func-count     Resnorm        step         optimality
     0           1           4225                       1.04e+03
     1           2              0        4.0625              0

Local minimum found.

Optimization completed because the size of the gradient is less than the value of the optimality tolerance.
x = 6.0625
res = 0
function doOptimization
    clc
    [stopflag, r0, J0] = deal(0);
    opts = optimoptions('lsqnonlin', 'Display', 'iter', ...
        'SpecifyObjectiveGradient', true, 'MaxIterations', 1);
    [x, res] = lsqnonlin(@resid, 2, [], [], opts)
    function [r, J] = resid(x)
        if ~stopflag
            r = (x-10)^2 + 1;  J = 2*(x-10);  % normal evaluation of the residual function
            r0 = zeros(size(r));              % important that these be zero (but unclear why)
            J0 = zeros(size(J));
            stopflag = 1;                     % raised: later calls skip the real work
        else                                  % do no work
            r = r0;  J = J0;
        end
    end
end
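Since the question also mentions fsolve, the same stopflag idea should carry over there; here is an untested sketch along the same lines:

function doFsolveStep
    [stopflag, F0, J0] = deal(0);
    opts = optimoptions('fsolve', 'Display', 'iter', ...
        'SpecifyObjectiveGradient', true, 'MaxIterations', 1);
    [x, fval] = fsolve(@resid, 2, opts)
    function [F, J] = resid(x)
        if ~stopflag
            F = (x-10)^2 + 1;  J = 2*(x-10);  % normal evaluation
            F0 = zeros(size(F));              % cached fake values for later calls
            J0 = zeros(size(J));
            stopflag = 1;
        else                                  % do no work
            F = F0;  J = J0;
        end
    end
end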
4 Comments
Sargondjani on 30 Jun 2023
Yeah, that is strange indeed!
Sargondjani on 30 Jun 2023
Anyway, it works great! Super happy!


More Answers (0)

