Optimization: Optimize multiple input variables to minimize the output

Hello,
I am trying to optimize multiple input variables to minimize the output using fminsearch.
Clearly, I am doing it wrong :( Below is my initial attempt.
Ultimately, I want to bound all the variables (x, y, z, p, q, r) to the range 0.1 to 100 in steps of 0.1.
Any help will be greatly appreciated. Thanks a ton!
%Objective: Minimize the function output with respect to multiple input variables
% Wanted to minimize the function Pow(X) = ((x*p) + (y*q) + (z*r))*l*w by
% optimizing the variables x, y, z, p, q and r.
%l and w are constants
%Create the objective function with its extra parameters (l,w) as extra arguments.
f = @(X,l,w) (X(1)*X(4) + X(2)*X(5) + X(3)*X(6))*l*w;
%Declare the extra parameter values
l = 2;
w = 1;
%Create an anonymous function of X alone that includes the workspace values of the parameters.
fun = @(X)f(X,l,w)
X_guess = [1 1.5 1 2 1.25 1];
Xmin = fminsearch(fun,X_guess)
x1 = Xmin(1);
y1 = Xmin(2);
z1 = Xmin(3);
p1 = Xmin(4);
q1 = Xmin(5);
r1 = Xmin(6);
4 Comments
Anand Ra on 3 Oct 2021 (edited)
Apologies.
Below is the code with the output (P) and the optimized variables (x1, y1, z1, p1, q1, r1). Clearly they are off the charts.
If I simply execute the function with my initial guesses, the output is P = 9.75 (I added those calculations below as well).
I am guessing that I might have the syntax wrong, but I am unable to determine what's wrong.
Also, I am guessing I should bound it (if so, not sure how?) so it doesn't go off the charts, especially since I need the optimized variables to remain positive numbers.
%Objective: Minimize the function output with respect to multiple input variables
% Wanted to minimize the function Pow(X) = ((x*p) + (y*q) + (z*r))*l*w by
% optimizing the variables x, y, z, p, q and r.
%l and w are constants
%Create the objective function with its extra parameters (l,w) as extra arguments.
f = @(X,l,w) (X(1)*X(4) + X(2)*X(5) + X(3)*X(6))*l*w;
%Declare the extra parameter values
l = 2;
w = 1;
%Create an anonymous function of X alone that includes the workspace values of the parameters.
fun = @(X)f(X,l,w)
fun = function_handle with value:
@(X)f(X,l,w)
X_guess = [1 1.5 1 2 1.25 1];
Xmin = fminsearch(fun,X_guess)
Exiting: Maximum number of function evaluations has been exceeded - increase MaxFunEvals option. Current function value: -302178148141371722286612819344701945253341106128754416021789277177203582000056690981679928524741379537028251648.000000
Xmin = 1×6
1.0e+55 * -0.1661 0.8198 0.0940 -1.8024 -2.2644 0.4914
x1 = Xmin(1);
y1 = Xmin(2);
z1 = Xmin(3);
p1 = Xmin(4);
q1 = Xmin(5);
r1 = Xmin(6);
P= ((x1*p1)+(y1*q1)+(z1*r1))*l*w
P = -3.0218e+110
Anand Ra on 3 Oct 2021
If I simply execute the function with my initial guesses:
l =2;
w=1;
X = [1 1.5 1 2 1.25 1];
P = (X(1)*X(4) + X(2)*X(5) + X(3)*X(6))*l*w
P = 9.7500


Accepted Answer

Walter Roberson on 3 Oct 2021
"Ultimately, I want to bound all the variables (x, y, z, p, q, r) to the range 0.1 to 100 in steps of 0.1."
fminsearch() cannot bound variables. fmincon() can bound variables though.
However, you have discrete variables. fminsearch() and fmincon() cannot handle discrete variables.
You have a few options:
  1. Use ga() with each of those variables marked as having an integer constraint from 1 to 1000 (not 100), and divide each variable by 10 inside the objective function (see the sketch after this list); or
  2. Use ndgrid() to construct all of the possible combinations of inputs, evaluate the function at all of them, and take the minimum of all of the evaluations; or
  3. Recognize that multiplying positive values by positive values and summing them is always going to have its minimum when the values are as small as possible, so just take the lower bounds of everything and do not bother optimizing.
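A minimal sketch of option 1, assuming the Global Optimization Toolbox is available (the scaled-variable names below are my own illustration, not part of the original answer). Each unknown is treated as an integer n between 1 and 1000 and divided by 10 inside the objective, which restricts the search to the grid 0.1:0.1:100:
l = 2; w = 1;
% objective written in the scaled integer variables n = 10*[x y z p q r]
fun = @(n) ((n(1)/10)*(n(4)/10) + (n(2)/10)*(n(5)/10) + (n(3)/10)*(n(6)/10))*l*w;
nvars  = 6;
lb     = ones(1,nvars);         % integer 1    -> 0.1 after scaling
ub     = 1000*ones(1,nvars);    % integer 1000 -> 100 after scaling
intcon = 1:nvars;               % all six variables are integer-constrained
nmin = ga(fun,nvars,[],[],[],[],lb,ub,[],intcon);
Xmin = nmin/10                  % optimized x, y, z, p, q, r on the 0.1 grid
Because the objective is a sum of positive products scaled by positive constants, ga should simply drive every variable to its lower bound of 0.1, which is exactly what option 3 predicts.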
13 Comments
Walter Roberson on 8 Oct 2021
%defining optimization variables and an optimization problem object.
a = optimvar('a','LowerBound',0.1,"UpperBound",20);
b = optimvar('b','LowerBound',0.1,"UpperBound",20);
c = optimvar('c','LowerBound',0.1,"UpperBound",20);
d = optimvar('d','LowerBound',0.1,"UpperBound",20);
e = optimvar('e','LowerBound',0.1,"UpperBound",20);
prob = optimproblem;
k= 2;
w=1;
v=1.5;
%constraints
% cons1 = e >= (a+d);
% cons2 = d >=a ;
% cons3 = b <=c ;
% cons4 = (e-d) >= (d-a) ;
% cons5 = (c-b) <= b;
cons1 = e - a- d >= 0.1;
cons2 = d - a >= 0.1 ;
cons3 = c-b >= 0.1;
cons4 = b - a >= 0.1 ;
cons5 = (e-c) >=0.1;
prob.Constraints.cons1 = cons1;
prob.Constraints.cons2 = cons2;
prob.Constraints.cons3 = cons3;
prob.Constraints.cons4 = cons4;
prob.Constraints.cons5 = cons5;
x0.a = 4;
x0.b = 6;
x0.c = 8;
x0.d = 7;
x0.e = 12;
%new variables
AB = sqrt(a.^2 + b.^2);
BC = sqrt( c.^2 + ((e-d)/2).^2 );
CS = sqrt( c.^2 + ((e-d)/2).^2 );
VAB = sqrt(((((a.*v).^2/(((b.^2).*4))) + (v^2)/2 )));
% VBS = sqrt(((a*v)^2/((4*b*b)) + (v^2)/2 ));
VCS = ((2*c)./(e-d)).*sqrt(AB.^2);
VBC= CS.^2 + BC.^2;
%objective function as an expression in the optimization variables.
P = (AB.*VAB + BC.*VBC + CS.*VCS).*k*w;
%the objective function in prob.
prob.Objective = P;
sol = solve(prob, x0)
Solving problem using fmincon.
Feasible point with lower objective function value found.
Local minimum found that satisfies the constraints.
Optimization completed because the objective function is non-decreasing in feasible directions, to within the value of the optimality tolerance, and constraints are satisfied to within the value of the constraint tolerance.
sol = struct with fields:
    a: 0.1000
    b: 0.2000
    c: 0.3000
    d: 9.2995
    e: 9.6959

