Is there any gradient descent method available?
Views: 191 (last 30 days)
We are working on the optimization of nonconvex energies in solid mechanics (see the attached picture) resulting from a finite element discretization, with a moderate number of variables (up to several thousand).
At the moment I am using the function fminunc. It typically converges very slowly, or not at all for larger numbers of variables, and takes many iterations:
                                                       First-order
 Iteration  Func-count       f(x)       Step-size       optimality
     0           1781            0.4                      0.00625
     1           5343        0.39987       0.168734          0.03
     2           7124       0.399645              1         0.0235
     3           8905       0.398947              1         0.0616
     4          10686       0.398602              1         0.0458
     5          12467       0.398094              1         0.0459
     6          14248       0.397673              1          0.059
     7          16029       0.397249              1         0.0415
     8          17810       0.396635              1         0.0404
     9          19591       0.396317              1         0.0847
    10          21372       0.395881              1         0.0523

(some iterations omitted)

                                                       First-order
 Iteration  Func-count       f(x)       Step-size       optimality
    60         110422       0.310397              1         0.128
    61         112203       0.307966              1         0.121
    62         113984       0.305174              1         0.235
    63         117546       0.303857       0.241185         0.128
    64         121108       0.302965            0.1         0.118
    65         124670       0.301738            0.1         0.0921
    66         229749       0.301464      0.0188263         0.181

Local minimum possible. fminunc stopped because it cannot decrease the objective function along the current search direction.
We have no analytical gradient (a subgradient computation is possible, but theoretically demanding to derive because of some nondifferentiable terms) and consequently no Hessian either; both are evaluated numerically. We applied the setup
options = optimoptions('fminunc','Algorithm','quasi-newton','Display','iter');
My colleague and I think the quasi-Newton method is a bit of an overkill for our problem, and we would like to run only a few iterations of gradient descent (possibly with damping) to get a picture of what happens around our initial energy; a minimal sketch of what we have in mind follows.
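For concreteness, the kind of loop we have in mind is sketched below. The name graddescent is just illustrative, @energy stands in for our actual energy function, and the forward-difference gradient is the crudest possible choice:

function x = graddescent(fun, x0, maxit, alpha)
% Damped gradient descent with a forward-difference gradient.
% fun   - objective function handle
% x0    - initial point (column vector)
% maxit - number of iterations to run
% alpha - damping (step-size) parameter
h = 1e-6;                              % finite-difference step
x = x0;
for k = 1:maxit
    f0 = fun(x);
    g = zeros(size(x));
    for i = 1:numel(x)                 % one extra evaluation per variable
        xi = x;
        xi(i) = xi(i) + h;
        g(i) = (fun(xi) - f0)/h;
    end
    x = x - alpha*g;                   % damped descent step
    fprintf('iter %3d   f = %.6g   ||g|| = %.3g\n', k, f0, norm(g));
end
end

With this saved as graddescent.m, we would call, e.g.,

x = graddescent(@energy, x0, 10, 0.1);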
Is there any implementation of the method that can evaluate the numerical gradient, and perhaps compare it with our theoretically derived gradient later? It would be perfect if fminunc had this option, but I could not find it.
Thank you, Jan Valdman
0 Comments
Accepted Answer
Alan Weiss
15 Oct 2018
Indeed, fminunc has a mainly undocumented gradient descent feature that you can see demonstrated in this example. Usually, gradient descent does not work very well, but I suppose that you already know that.
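If I remember right, one way to reach this feature is the HessUpdate option of the quasi-newton algorithm, which accepts 'steepdesc' to replace the quasi-Newton search direction with steepest descent; @energy and x0 below are placeholders for your objective and initial point:

options = optimoptions('fminunc', ...
    'Algorithm','quasi-newton', ...
    'HessUpdate','steepdesc', ...       % steepest descent direction (generally not recommended)
    'MaxIterations',10, ...             % just a few iterations, as you intend
    'Display','iter');
[x,fval] = fminunc(@energy, x0, options);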
To check whether the internally calculated gradients in fminunc match a gradient function at the initial point, you can use the CheckGradients option. If you want a numerical approximation to your gradients, you can use John D'Errico's File Exchange contribution Adaptive Robust Numerical Differentiation, though on second thought this might not be exactly suited to your problem.
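For instance, assuming your objective is written to return your theoretically derived gradient as a second output (the name energyWithGrad below is a placeholder), the check could be set up as:

options = optimoptions('fminunc', ...
    'Algorithm','quasi-newton', ...
    'SpecifyObjectiveGradient',true, ... % use the gradient returned by the objective
    'CheckGradients',true, ...           % compare it with finite differences at the initial point
    'FiniteDifferenceType','central', ...
    'Display','iter');
[x,fval] = fminunc(@energyWithGrad, x0, options);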
Good luck,
Alan Weiss
MATLAB mathematical toolbox documentation
2 Comments
Alan Weiss
24 Nov 2021
Indeed, if you can provide a gradient and Hessian in general (not just at the initial point), then you can expect your optimization to proceed more quickly and in fewer iterations. For an example, see https://www.mathworks.com/help/optim/ug/symbolic-math-toolbox-calculates-gradients-and-hessians.html#SymbolicMathToolboxCalculatesGradientsAndHessiansExample-10
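For illustration, here is a minimal self-contained sketch of that pattern, using the Rosenbrock function as a stand-in for an actual energy. The objective returns the value, gradient, and Hessian, and the trust-region algorithm consumes all three:

function [f,g,H] = rosenWithDerivs(x)
% Rosenbrock function with analytical gradient and Hessian.
f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
if nargout > 1
    g = [-400*(x(2) - x(1)^2)*x(1) - 2*(1 - x(1));
          200*(x(2) - x(1)^2)];
end
if nargout > 2
    H = [1200*x(1)^2 - 400*x(2) + 2, -400*x(1);
         -400*x(1),                   200];
end
end

With this saved as rosenWithDerivs.m:

options = optimoptions('fminunc', ...
    'Algorithm','trust-region', ...
    'SpecifyObjectiveGradient',true, ...
    'HessianFcn','objective');          % take the Hessian from the objective's third output
[x,fval] = fminunc(@rosenWithDerivs, [-1; 2], options);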
But if you can provide the Hessian only at the initial point, then I do not know how you can incorporate the information into the solver. Sorry.
Alan Weiss
MATLAB mathematical toolbox documentation
More Answers (0)