Question on optimization problem and fminsearch, fminunc, lsqnonlin

Wes, 5 June 2012
Hey all, I am trying to do an optimization problem where I import real-life data and try to find the best combination of 6 unknown variables that describe the data. The function being run in the optimization call is a series of if/then statements and equations, and the output evaluation is based on the distance between the real data and the simulated data. There are as many equations as variables, plus the if/then statements. When I use fminsearch the program works, but it is not ideal at finding the minimum. When I try fminunc or lsqnonlin, the output basically repeats the initial guess, which is not really close to the actual solution. Why are these functions so dependent on the initial guess? Which of these functions should I be using? Any ideas on what I could do to solve this problem in my optimization?
2 Comments
Sargondjani, 6 June 2012
And as Sean notes: fminunc assumes your problem is differentiable. If it is not, then take his advice.
But if your problem is differentiable and fminunc returns exactly the initial guess, then something is wrong. You should check the exit message; it could be that the maximum number of function evaluations was reached, or something like that.
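A quick way to check this is to request the solver's exit flag and output structure. This is only a sketch: `fun` and `x0` below are placeholders, not the poster's actual objective.

```matlab
% Placeholder objective and starting point, just to illustrate
% inspecting the solver's exit status.
fun = @(x) sum((x - [1 2 3 4 5 6]).^2);   % placeholder smooth objective
x0  = zeros(1, 6);

opts = optimset('MaxFunEvals', 5000, 'MaxIter', 2000, 'Display', 'final');
[x, fval, exitflag, output] = fminunc(fun, x0, opts);

% exitflag > 0 means the solver converged; 0 means the iteration or
% function-evaluation limit was hit; negative values signal failure.
disp(exitflag)
disp(output.message)
```

If `exitflag` is 0, raising `MaxFunEvals`/`MaxIter` via optimset is the first thing to try.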
Wes, 6 June 2012
It does not return the exact initial guess. It returns the first two parameters the same as the initial guess, but then changes the last 2-3 parameters to fit. Not sure if that makes any sense or not, but I will try one of those other methods and see how that works.


2 Answers

Sean de Wolski, 5 June 2012
The initial guess is important because the above-mentioned optimizers are trying to find a local minimum, i.e. the one closest to the initial guess that can be reached using derivatives. From your description, it sounds like there is a good chance that your function is not differentiable, and thus a genetic algorithm, global search, or pattern search is required to find the global minimum. These functions are in the Global Optimization Toolbox.
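As a sketch of what switching to those derivative-free solvers looks like: `objfun` below is a placeholder for the poster's if/then-based simulation error, and the 6-element `x0` stands in for their initial guess.

```matlab
% Placeholder non-smooth objective in 6 variables.
objfun = @(x) sum(abs(sin(x)) + 0.1*abs(x));
x0 = ones(1, 6);

% patternsearch: a derivative-free direct search that tolerates
% non-differentiable objectives (requires Global Optimization Toolbox).
[xps, fps] = patternsearch(objfun, x0);

% ga: a genetic algorithm; it only needs the number of variables
% and does not depend on an initial point at all.
[xga, fga] = ga(objfun, 6);
```

Both solvers accept the same `@(x) ...` objective handle as fminsearch, so swapping them in usually requires no changes to the simulation code itself.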

Geoff, 5 June 2012
Depending on how localised your minima are, you can sometimes get around this with a simplex-based solver like fminsearch. I start with a large simplex, run the solver, and let it converge. Then I reduce the size of the simplex, "shake up" the result (offsetting by the simplex), and let the solution converge again. I repeat this several times. But then, I don't know if fminsearch does this already. The caveat is that I was using a Nelder-Mead implementation in C++, not MATLAB... I think you may be able to use optimset to configure fminsearch with a bit more of a manual feel.
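The converge/perturb/reconverge loop described above can be sketched with fminsearch like this. The objective, the starting point, and the perturbation schedule are all illustrative assumptions, not part of the original answer.

```matlab
% Sketch: repeatedly converge with fminsearch, then perturb the result
% by a shrinking random offset (the "shake up") and converge again.
objfun = @(x) sum((x - [1 2 3 4 5 6]).^2);  % placeholder objective
x    = zeros(1, 6);                          % placeholder start
step = 1;                                    % initial perturbation scale

opts = optimset('MaxFunEvals', 2000);
for k = 1:5
    x = fminsearch(objfun, x, opts);        % converge from current point
    x = x + step*(rand(size(x)) - 0.5);     % shake up the result
    step = step/2;                          % shrink the perturbation
end
x = fminsearch(objfun, x, opts);            % final polish, no perturbation
```

The shrinking `step` plays the role of the shrinking simplex: early rounds can hop out of shallow basins, later rounds only refine.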
If you're not time-constrained, you may want to generate a large number of random initial guesses sampled across your solution space, solve each one, and choose the best. But given that you have 6 unknowns, it doesn't take much partitioning before the problem blows up. And if some variables are unconstrained, this can become quite impractical.
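The multistart idea above can be sketched as follows. The bounds `lb`/`ub`, the restart count, and the objective are assumptions for illustration; in practice the bounds would come from knowledge of the 6 physical parameters.

```matlab
% Sketch: random multistart over an assumed box [lb, ub],
% keeping the best local solution found by fminsearch.
objfun = @(x) sum((x - [1 2 3 4 5 6]).^2);  % placeholder objective
lb = -10*ones(1, 6);                         % assumed lower bounds
ub =  10*ones(1, 6);                         % assumed upper bounds

best.fval = Inf;
for k = 1:50                                 % 50 random restarts
    x0 = lb + rand(1, 6).*(ub - lb);        % uniform start in the box
    [x, fval] = fminsearch(objfun, x0);
    if fval < best.fval                     % keep the best local minimum
        best.x = x;
        best.fval = fval;
    end
end
disp(best.x)
```

Note the curse of dimensionality Geoff mentions: covering 6 dimensions with even 10 samples per axis already means a million starts, so random sampling with a modest budget is usually the practical compromise.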
