Finite Difference Method to find gradient?
I need to find the gradient of an n-dimensional function (to eventually optimize it), and I have to do it with the finite difference method. I can't use the built-in MATLAB functions, but I have no idea how to code finite differences for n dimensions. The function should be entered in terms of x(1), x(2), and so on (so that a loop can calculate the gradient), and the dimension should be determined from the size of the starting-point vector. Can someone help me out with this, please? I really don't know how to code it.
1 Comment
Star Strider
22 Nov 2015
Edited: Star Strider on 22 Nov 2015
This Stack Exchange post should get you started: calculate Jacobian matrix without closed form or analytical form.
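Building on that pointer, here is a minimal sketch of what a central-difference gradient could look like in MATLAB, assuming the objective is supplied as a function handle f written in terms of x(1), x(2), ..., and that the dimension is taken from the length of the starting-point vector, as the question describes. The function name fd_gradient and the default step size 1e-6 are illustrative choices, not anything given in the question.

function g = fd_gradient(f, x, h)
% FD_GRADIENT  Numerical gradient of a scalar function by central differences.
%   f : function handle taking a column vector and returning a scalar
%   x : column vector; its length determines the dimension n
%   h : scalar step size (illustrative default of 1e-6 if not supplied)
if nargin < 3
    h = 1e-6;
end
n = numel(x);                 % dimension comes from the size of the starting point
g = zeros(n, 1);
for k = 1:n
    e    = zeros(n, 1);
    e(k) = h;                                 % perturb only the k-th coordinate
    g(k) = (f(x + e) - f(x - e)) / (2*h);     % central difference in coordinate k
end
end

Saved as fd_gradient.m, it could be called like this (the Rosenbrock function and starting point are just an example):

f  = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
x0 = [-1.2; 1];
g0 = fd_gradient(f, x0)

The step size is a trade-off between truncation error (too large) and floating-point rounding error (too small); scaling the step per coordinate, e.g. by max(1, abs(x(k))), is a common refinement.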
Answers (0)