Question on optimization problem and fminsearch, fminunc, lsqnonlin
8 views (last 30 days)
Hey all, I am trying to do an optimization problem where I import real-life data and try to find the best combination of 6 unknown variables that describe the data. The function being run in the optimization call is a series of if/then statements and equations, and the output evaluation is based on the distance between the real data and the simulated data. There are as many equations as variables, plus the if/then statements. When I use fminsearch the program works, but only just okay; it is not ideal at finding the minimum. When I try fminunc or lsqnonlin, the output basically repeats the initial guess, which is not really close to the actual solution. Why are these functions so dependent on the initial guess? Which of these functions should I be using? Any ideas on what I could do to solve this problem in my optimization?
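For reference, a simplified sketch of my setup (simulateModel and dataY are placeholders for my actual if/then model function and the imported measurements):

% p is the 6-element parameter vector; p0 is the initial guess (placeholder values)
p0 = [1 1 1 1 1 1];

% fminsearch/fminunc take a scalar objective, e.g. the sum of squared distances
sse = @(p) sum((simulateModel(p) - dataY).^2);
pBest = fminsearch(sse, p0);

% lsqnonlin instead takes the vector of residuals, not the scalar sum
res = @(p) simulateModel(p) - dataY;
pBest2 = lsqnonlin(res, p0);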
2 Comments
Sargondjani
6 Jun 2012
And as Sean notes: fminunc assumes your problem is differentiable; if it is not, then take his advice.
But if your problem is differentiable and fminunc returns exactly the initial guess, then something is wrong. You should check the exit message; it could be that the maximum number of function evaluations was reached, or something like that.
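For example, something along these lines (sketch; fun and x0 stand for your objective and initial guess) will show the exit flag and the full exit message:

opts = optimset('Display','iter', 'MaxFunEvals',5000, 'MaxIter',2000);
[x, fval, exitflag, output] = fminunc(fun, x0, opts);
exitflag           % 0 means the iteration/function-evaluation limit was hit
output.message     % full exit message explaining why the solver stopped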
Answers (2)
Sean de Wolski
5 Jun 2012
The initial guess is important because the above-mentioned optimizers are trying to find a local minimum, i.e. the one closest to the initial guess that can be reached by following derivatives. From your description, it sounds like there is a good chance that your function is not differentiable, and thus a genetic algorithm, global search, or pattern search is required to find the global minimum. These functions are in the Global Optimization Toolbox:
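For instance, along these lines (sketch; objFun and x0 are placeholders for your objective and initial guess, and the Global Optimization Toolbox is required):

nvars = 6;                                   % number of unknowns

% patternsearch: derivative-free, polls points around the current iterate
xPS = patternsearch(objFun, x0);

% ga: population-based genetic algorithm, no initial guess needed
xGA = ga(objFun, nvars);

% GlobalSearch: launches fmincon from many scattered start points
problem = createOptimProblem('fmincon', 'objective', objFun, 'x0', x0);
gs = GlobalSearch;
xGS = run(gs, problem);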
0 Comments
Geoff
5 Jun 2012
Depending on how localised your minima are, you can sometimes get around this with a simplex-based solver like fminsearch. I start with a large simplex, run the solver, and let it converge. Then I reduce the size of the simplex, "shake up" the result (offsetting by the simplex), and let the solution converge again. I repeat this several times. I don't know whether fminsearch does this already; the caveat is that I was using a Nelder-Mead implementation in C++, not MATLAB. I think you may be able to use optimset to configure fminsearch with a bit more of a manual feel.
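In MATLAB terms, the restart idea might look something like this (rough sketch, not necessarily what my C++ version did; fun and x0 are placeholders):

opts = optimset('TolX',1e-8, 'TolFun',1e-8, 'MaxFunEvals',1e4);
x = x0;
for k = 1:5
    x = fminsearch(fun, x, opts);   % converge, then restart from the result
end

Each restart rebuilds the initial simplex around the latest point, which is a cheap way to shake the search out of a shallow spot.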
If you're not time-constrained, you may want to generate a large number of random initial guesses sampled across your solution space, solve each one, and choose the best. But given that you have 6 unknowns, it doesn't take much partitioning before the problem blows up, and if some variables are unconstrained this can become quite impractical.
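A crude multistart version of that (sketch; lb and ub are assumed 1-by-6 vectors bounding the region you want to sample, fun is the objective):

nStarts = 50;
bestF = Inf;
for k = 1:nStarts
    x0 = lb + rand(1,6).*(ub - lb);     % random start inside the box
    [x, f] = fminsearch(fun, x0);
    if f < bestF
        bestF = f;
        bestX = x;
    end
end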
0 Comments