Multiple curve fitting for parameter identification: multi-objective optimization

I have 4 plots of x (eps) versus y (sigmaexp) that I would like to fit to the equation given by sigmanum2. The fitting criterion is to minimize the sum of squared residuals, defined in the function fitness. I have done this successfully for 1 plot (i.e. 1 objective function) using fminsearch. Since I have 4 plots, I have 4 objective functions (the 4 values in y, i.e. [F1; F2; F3; F4]) that must be minimized simultaneously. How can I do this, and which function should be used for multi-objective optimization? I have searched the relevant forums but have not been able to solve it. Here is my code.
s = table2array(ExpDataMatlab); %excel data I have imported. Attached for your reference.
eps = s(:,1); % represents x
sigmaexp = s(:,2); % represent y
epsdot1 = s(:,3);
A = 332.8668; %constant
B = 128.6184; %constant
n = 0.4234; %constant
%x = [2000, 10.2234];
%y = fitness(x,A,B,n,eps,epsdot1,sigmaexp); %I used this to check if my function defined at the
%bottom works properly or not
ObjFcn = @(x)fitness(x,A,B,n,eps,epsdot1,sigmaexp);
x0 = [12000 4];
options = optimset('Display','iter','PlotFcns',@optimplotfval,'TolX',1e-10,'TolFun',1e-10); % note: fminsearch always uses the Nelder-Mead simplex method, so an 'Algorithm' setting is ignored
[sol, fval, exitflag, output] = fminsearch(ObjFcn,x0,options);
C = sol(1); P = sol(2);
sigmanum4 = (A+B*(eps.^n)).*(1+(epsdot1./C).^(1/P)); %deduced value after optimization
plot(s(1:99,1),s(1:99,2),'+m');hold on;
plot(s(1:99,1),sigmanum4(1:99,1),'m');
hold off;
function y = fitness(x,A,B,n,eps,epsdot1,sigmaexp)
sigmanum2 = (A+B*(eps.^n)).*(1+(epsdot1./x(1)).^(1/x(2)));
F1 = sum((sigmaexp(1:99,1)-sigmanum2(1:99,1)).^2);%values for 982 /s % represent residual sum of squares
%F2 = sum((sigmaexp(100:206,1)-sigmanum2(100:206,1)).^2);%values for 2000 /s
%F3 = sum((sigmaexp(207:296,1)-sigmanum2(207:296,1)).^2);%values for 2900 /s
%F4 = sum((sigmaexp(297:368,1)-sigmanum2(297:368,1)).^2);%values for 4000 /s
%y = [F1; F2; F3; F4]; % for multiobjective minimization
y = F1; %for single-objective minimization
%y = (0.01042*F1+0.00962*F2+0.01111*F3+0.01389*F4); %weighted residual sum of squares (Milani 2008)
end
Thank you.
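For reference, the commented-out lines in fitness already contain the two standard options: a weighted-sum scalarization (the Milani 2008 line), which keeps a single scalar objective so fminsearch still applies, or a vector objective [F1; F2; F3; F4], which needs a genuine multi-objective solver such as gamultiobj from the Global Optimization Toolbox. Below is a minimal sketch of the scalarized route, assuming the segment boundaries and weights quoted in the comments; the helper name fitnessAll is illustrative and not part of the original post.
% Sketch only: one (C, P) pair fitted to all four curves via a weighted sum of the
% four residual sums of squares. fitnessAll is a hypothetical helper, not in the post.
wObjFcn = @(x) fitnessAll(x,A,B,n,eps,epsdot1,sigmaexp) * [0.01042; 0.00962; 0.01111; 0.01389];
[solW, fvalW] = fminsearch(wObjFcn, [12000 4], optimset('Display','iter'));
% For a Pareto-front search instead, the same row-vector objective can go to gamultiobj:
% [solP, fvalP] = gamultiobj(@(x) fitnessAll(x,A,B,n,eps,epsdot1,sigmaexp), 2);
function F = fitnessAll(x,A,B,n,eps,epsdot1,sigmaexp)
% Returns the four residual sums of squares as the row vector [F1 F2 F3 F4],
% using the strain-rate segment boundaries given in the comments above.
sigmanum2 = (A + B*(eps.^n)).*(1 + (epsdot1./x(1)).^(1/x(2)));
seg = {1:99, 100:206, 207:296, 297:368};
F = cellfun(@(k) sum((sigmaexp(k) - sigmanum2(k)).^2), seg);
end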

Accepted Answer

Star Strider on 3 April 2021
I am not certain what you want to do.
To fit different parameter sets to different parts of the data, this works:
T1 = readtable('ExpDataMatlab.xlsx', 'VariableNamingRule','preserve');
eps = T1{:,1}; % represents x
sigmaexp = T1{:,2}; % represent y
epsdot1 = T1{:,3};
A = 332.8668; %constant
B = 128.6184; %constant
n = 0.4234; %constant
[Ueps,idx] = unique(epsdot1); % Find ‘Breaks’ In Data (assumes each strain rate occupies one contiguous block of rows)
didx = diff([idx; numel(eps)+1]); % Find Lengths Of Data Segments
idxmtx = [idx idx+didx-1]; % Matrix Of Index Segments
objfcn = @(x,idx) (A+B*(eps(idx).^n)).*(1+(epsdot1(idx)./x(1)).^(1/x(2))); % Objective Function
fitness = @(x,idx) norm(sigmaexp(idx) - objfcn(x,idx)); % Fitness Function
x0 = [12000 4];
options = optimset('Display','iter','PlotFcns',@optimplotfval,'TolX',1e-10,'TolFun',1e-10); % ('Algorithm' option dropped: fminsearch does not use it)
for k = 1:size(idxmtx,1) % Data Segment Parameter Estimation Loop
idxrng = idxmtx(k,1):idxmtx(k,2);
[sol(k,:), fval(k), exitflag(k), output{k}] = fminsearch(@(x)fitness(x,idxrng),x0,options);
end
figure
for k = 1:size(idxmtx,1) % Plotting Loop
subplot(4,1,k)
idxrng = idxmtx(k,1):idxmtx(k,2);
plot(eps(idxrng), [sigmaexp(idxrng) epsdot1(idxrng)])
hold on
plot(eps(idxrng), objfcn(sol(k,:),idxrng), '--g')
hold off
grid
ylim([0 5000])
end
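As a small usage sketch (not in the original answer), the per-segment results can be tabulated against their strain rates, since Ueps from the unique call holds the strain rate of each segment:
Results = table(Ueps, sol(:,1), sol(:,2), fval(:), ...
    'VariableNames', {'StrainRate','C','P','ResidualNorm'}); % fitted (C, P) pair and residual norm per segment
disp(Results)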
It is straightforward to fit the entire data set to one set of parameters, if that is what you want to do.
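A minimal sketch of that alternative, reusing the fitness and objfcn handles defined above: pass the full index range instead of one segment, so a single (C, P) pair is fit to all of the data.
idxall = 1:numel(eps); % all rows, ignoring the strain-rate segmentation
[solAll, fvalAll] = fminsearch(@(x) fitness(x,idxall), x0, options);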
5 Comments
Venkatesh Madhukar Deshpande on 5 April 2021
Thank you for all the help. FYI, the code I posted in my 2nd comment also gives me the same answer.
