Minimization of a function with unknown gradient but known sparsity pattern of its Hessian
Dear colleagues,
Is there an fmincon option to minimize a function without knowledge of its gradient while providing a sparsity pattern of the Hessian?
My function comes from a FEM formulation of an energy in nonlinear mechanics of solids and is too difficult to differentiate analytically.
However, the sparsity pattern of the Hessian is easily available through the FEM connectivity of the variables.
Is there a way to exploit it efficiently? If I run with 'Algorithm','quasi-newton', it does not seem to accept the 'HessPattern' option. An alternative would be to obtain an approximate gradient (can you suggest one?) and use 'Algorithm','trust-region' instead. Does anyone have experience with this?
Best wishes,
Jan
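P.S. For context, this is roughly how the sparsity pattern can be assembled from the element connectivity (a minimal sketch with a made-up connectivity matrix and one unknown per node; two unknowns can produce a Hessian nonzero only if they share an element):
elements = [1 2 3; 3 4 5; 5 6 7];     % illustrative element connectivity (made up)
nDof = max(elements(:));              % total number of unknowns
rows = []; cols = [];
for e = 1:size(elements,1)
    dofs = elements(e,:);             % unknowns coupled by this element
    [I,J] = ndgrid(dofs,dofs);        % all pairwise couplings within the element
    rows = [rows; I(:)];
    cols = [cols; J(:)];
end
HessPattern = spones(sparse(rows,cols,1,nDof,nDof));   % 0/1 sparsity pattern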
0 Comments
Accepted Answer
Alan Weiss
29 Oct 2019
Sorry, I am afraid that the available options don't work efficiently for your case. The HessPattern option is available only for the 'trust-region-reflective' algorithm, but for that algorithm you need to supply a derivative.
I am not sure what to suggest that you have not already tried. For the default 'interior-point' algorithm you can try setting the HessianApproximation option to 'lbfgs' or {'lbfgs',positive integer}, but that does not directly use the sparsity pattern that you know. Or, and this may seem crazy, you could code a finite-difference gradient in your objective function, bypassing MATLAB's internal one, and then use the 'trust-region-reflective' algorithm with the HessPattern option. I am not sure that the 'trust-region-reflective' algorithm would satisfy you anyway, as it accepts only bound constraints or only linear equality constraints.
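For the first suggestion, a minimal untested sketch (the toy objective, x0, lb, and ub below are placeholders standing in for your problem):
myEnergy = @(x) sum((x-1).^2) + 0.1*sum(x.^4);   % toy stand-in for the FEM energy
x0 = zeros(5,1);                                  % placeholder starting point
lb = -2*ones(5,1); ub = 2*ones(5,1);              % placeholder bounds
opts = optimoptions('fmincon','Algorithm','interior-point', ...
    'HessianApproximation',{'lbfgs',10}, ...      % limited-memory BFGS, memory length 10
    'Display','iter');
[x,fval] = fmincon(myEnergy,x0,[],[],[],[],lb,ub,[],opts);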
Sorry.
Alan Weiss
MATLAB mathematical toolbox documentation
5 Comments
Catalytic
29 Oct 2019
Edited: Catalytic on 29 Oct 2019
One possibility might be to use a one-iteration call to fmincon itself to return the gradient. This uses MATLAB's internal finite differencing and so might be faster than third-party implementations:
function [f,numerical_grad] = myObjective(x)
f = ...
if nargout > 1
    % Let fmincon compute a finite-difference gradient: with
    % SpecifyObjectiveGradient set to false, the inner call evaluates
    % myObjective with one output only, so there is no infinite recursion.
    options = optimoptions('fmincon','MaxIterations',1, ...
        'SpecifyObjectiveGradient',false,'Display','off');
    [~,~,~,~,~,numerical_grad] = fmincon(@myObjective,x,[],[],[],[],...
        [],[],[],options);
end
end
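With that in place, something along these lines should let fmincon use the sparsity pattern (untested sketch; x0, lb, ub, and HessPattern are assumed to be defined already, e.g. from the FEM connectivity):
opts = optimoptions('fmincon','Algorithm','trust-region-reflective', ...
    'SpecifyObjectiveGradient',true, ...   % myObjective returns [f, grad]
    'HessPattern',HessPattern, ...         % known sparsity pattern of the Hessian
    'Display','iter');
[x,fval] = fmincon(@myObjective,x0,[],[],[],[],lb,ub,[],opts);   % bounds only, per the algorithm's restriction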