## Matlab Solver - fmincon, minimization problem with constraint

Asked by jisoo jung, 26 Oct 2019. Latest activity: answered by jisoo jung, 26 Oct 2019.
Hello, I'm currently working on a multi-objective problem that minimizes the function J below. The solver is fmincon in MATLAB, with the sqp algorithm.
J = w1*J1 + w2*J2 + w3*J3
Each function J1, J2, J3 is already normalized.
For example, assume the situation J1 = 0.001, J2 = 0.9, J3 = 0.1. My questions are:
Q1. Even though the weight w1 is greatly increased, the solver shows very little tendency to minimize J1. What could be the reason for this? (Verification of each function J1, J2, J3 is complete.)
Q2. Is it a good idea to normalize the constraints as well? I thought the gradients of C and Ceq could also affect the solver.
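For readers without MATLAB, the setup described above can be sketched in Python with SciPy's SLSQP method, which is closely analogous to fmincon's sqp algorithm. The sub-objectives J1, J2, J3, the weights, and the constraint below are hypothetical stand-ins, not the poster's actual functions:

```python
# Python analogue of the MATLAB fmincon/sqp setup using scipy's SLSQP.
# J1, J2, J3 are hypothetical stand-ins for the normalized sub-objectives.
import numpy as np
from scipy.optimize import minimize

def J1(x): return (x[0] - 1.0) ** 2
def J2(x): return (x[1] + 2.0) ** 2
def J3(x): return (x[0] * x[1]) ** 2

w1, w2, w3 = 10.0, 1.0, 1.0  # heavily weighting J1

def J(x):
    # weighted-sum composite objective, as in the question
    return w1 * J1(x) + w2 * J2(x) + w3 * J3(x)

x0 = np.array([0.0, 0.0])
# scipy's "ineq" convention is c(x) >= 0 (note: fmincon uses c(x) <= 0)
cons = [{"type": "ineq", "fun": lambda x: 4.0 - x[0] ** 2 - x[1] ** 2}]
res = minimize(J, x0, method="SLSQP", constraints=cons)
print(res.x, res.fun)
```

Note the sign flip in the inequality convention between the two solvers when porting constraints.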


## 2 Answers

Answer by John D'Errico, 26 Oct 2019

" Q1. Although the weight for w1 is greatly increased,
there is a very small tendency to minimize the function for J1.
What could be the reason for this? "
Suppose that J1 is essentially a constant function in the vicinity of the start point, or very nearly so. If the optimizer sees essentially no gain in the composite objective coming from J1, then it makes sense for it to move in a direction that minimizes the other two sub-objectives.
That COULD be the reason. Look carefully at each of your sub-objectives: at the start point, what are the norms of their gradients? If the gradient of J1 is zero, then how much change in J1 will you get for any movement away from x0?
Essentially, you want to look at the start point for each sub-objective. Then try optimizing each of them independently. (Or, if this is a low-dimensional problem, just plot them all.) If they were independent of each other, where would the optimizer want to go for each sub-objective?
Even though the objectives are apparently "normalized", what really matters is whether they are normalized so that their gradients all have similar norms. After all, you could add some HUGE constant to any one of them and the gradient would not change.
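The gradient-norm check suggested above is easy to do numerically. A minimal sketch in Python with finite differences (J1 and J2 here are hypothetical examples chosen to show the effect, not the poster's functions):

```python
# Finite-difference check of each sub-objective's gradient norm at the
# start point. Note that adding a huge constant to J2 leaves its gradient
# (and hence its influence on the search direction) essentially unchanged.
import numpy as np

def fd_grad(f, x, h=1e-6):
    # central-difference gradient of scalar function f at point x
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def J1(x): return 1e-3 * np.sum(x ** 2)    # nearly flat near x0
def J2(x): return np.sum((x - 3.0) ** 2)   # strong pull toward x = 3
def J2_shifted(x): return J2(x) + 1e6      # huge constant offset

x0 = np.array([0.5, 0.5])
for name, f in [("J1", J1), ("J2", J2), ("J2 + 1e6", J2_shifted)]:
    print(f"{name}: ||grad|| = {np.linalg.norm(fd_grad(f, x0)):.4g}")
```

Here the gradient of J1 is thousands of times smaller than that of J2, so even a large w1 barely affects the search direction, while the constant offset changes nothing.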
"Q2. And is it a good idea to normalize constraints as well?"
It can't hurt. If the constraints are wildly different in magnitude, then expect numerical problems.
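One simple way to normalize constraints is to divide each one by a characteristic magnitude so that all rows returned by the constraint function are O(1). The raw constraints below are hypothetical examples, not from the question:

```python
# Sketch of constraint scaling: divide each constraint by a characteristic
# magnitude so their values (and gradients) are of comparable size.
import numpy as np

def c_raw(x):
    # two inequality constraints with wildly different magnitudes
    return np.array([
        1e6 * (x[0] ** 2 + x[1] ** 2) - 2e6,  # O(1e6)
        x[0] - 0.5,                           # O(1)
    ])

scale = np.array([1e6, 1.0])  # characteristic magnitude of each row

def c_scaled(x):
    return c_raw(x) / scale   # both rows now O(1)

x = np.array([1.2, 0.3])
print(c_raw(x), c_scaled(x))
```

Dividing by a positive constant does not change the feasible set, but it makes the constraint Jacobian much better conditioned for the solver.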


Answer by jisoo jung, 26 Oct 2019

If this assumption ("the optimizer sees essentially no gain in the composite objective coming from J1") is true, how can I resolve the problem?
