Numerical Technique to approach Global Minimum of a Function

Views: 8 (last 30 days)
PASUNURU SAI VINEETH on 4 Dec 2022
Answered: Kartik on 21 Mar 2023
I have a function with 15 input parameters that outputs the mean square error of a curve fit. My aim is to find the combination of 15 parameter values whose output is close to zero (I'm hoping for 10^(-4)). I have tried the gradient descent method, the Levenberg-Marquardt algorithm (lsqnonlin), and even the solve command. They appear to depend heavily on initial guesses and settle into a local minimum. I'm hoping someone could guide me towards a suitable technique for finding the global minimum, and its implementation. Please let me know if you need more details. Thanks in advance.
2 Comments
Matt J on 4 Dec 2022
Edited: Matt J on 4 Dec 2022
They appear to depend heavily on initial guesses and settle for a local minimum.
All methods depend heavily on initial guesses, in general. The question you need to ask is how, for your specific model, to generate a good initial guess. Answering that requires us to see the model.
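One common way to reduce the dependence on any single initial guess is to restart the local solver from many random start points and keep the best result. A minimal MultiStart sketch (Global Optimization Toolbox); note the assumptions: ErrF15 must return the residual vector (lsqnonlin minimizes a sum of squared residuals, not a scalar MSE), and p0, lb, ub are a user-supplied 1x15 initial guess and bound vectors:

```matlab
% Sketch: multi-start Levenberg-Marquardt via lsqnonlin.
% Assumption: ErrF15 returns a residual vector, not the scalar MSE.
% p0, lb, ub (1x15 each) are hypothetical placeholders for your problem.
problem = createOptimProblem('lsqnonlin', ...
    'objective', @ErrF15, 'x0', p0, 'lb', lb, 'ub', ub);
ms = MultiStart('Display', 'iter');
[bestPars, bestResnorm] = run(ms, problem, 50);  % 50 random start points
```

Tightening lb and ub around physically plausible values usually matters more than the number of start points.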
PASUNURU SAI VINEETH on 4 Dec 2022
Edited: PASUNURU SAI VINEETH on 4 Dec 2022
@Matt J I have attached a sample trajectory (Trajectory.fig) that I have been trying to fit. The idea is to start from the bounce point and numerically generate forward and backward trajectories from my physics-based model. Initial guesses for the velocities are taken as forward positional derivatives, and the spin guesses are randomized, as I couldn't think of a better way. In this case I know all 15 components beforehand, so a zero-error (perfect) fit should be possible. But Fit.fig is the closest I was able to get by varying the step size (h) and weight factor (gamma).
P.S. I had to include the lowest point (x(k),y(k),z(k)) among the parameters because it might not be the actual bounce point if there is noise in the input Excel data.
h = 0.001;      % finite-difference step
gamma = 0.001;  % gradient-descent step size (weight factor)
matrix = randn(1,6);                        % random initial spin guesses
OriginalData = readmatrix('TrialNew.xlsx');
x = OriginalData(1:end-2,1);
y = OriginalData(1:end-2,2);
z = OriginalData(1:end-2,3);
[~,k] = min(y);                             % index of the lowest point
%curr_pars = [-2 2 2 0 16 0 0 0 0 2 2 -1 4 8 7];
% [BackwardVelocityComponents BackwardSpinComponents IntersectionPoint ForwardVelocityComponents ForwardSpinComponents]
curr_pars = [100*(x(k-1)-x(k)) 100*(y(k-1)-y(k)) 100*(z(k-1)-z(k)) ...
             matrix(1) matrix(2) matrix(3) ...
             x(k) y(k) z(k) ...
             100*(x(k+1)-x(k)) 100*(y(k+1)-y(k)) 100*(z(k+1)-z(k)) ...
             matrix(4) matrix(5) matrix(6)];
initial_error = ErrF15(curr_pars)
gradErrorFunction = zeros(1,numel(curr_pars));
err = 100;
ErrorTrend = [];
while err > 0.01
    % central-difference estimate of the full gradient
    for i_v = 1:numel(curr_pars)
        cp = curr_pars;  cp(i_v) = cp(i_v) + h;
        cm = curr_pars;  cm(i_v) = cm(i_v) - h;
        gradErrorFunction(i_v) = (ErrF15(cp) - ErrF15(cm))/(2*h);
    end
    % update all parameters once per complete gradient evaluation
    % (updating inside the loop above would mix old and new components)
    curr_pars = curr_pars - gamma*gradErrorFunction;
    err = ErrF15(curr_pars)
    ErrorTrend = [ErrorTrend err]; %#ok<AGROW>
end
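If you stay with hand-rolled gradient descent, guarding against divergence and capping the iteration count makes the loop far more robust. A small sketch of such a safeguard (the names maxIter and backtracking factor 0.5 are illustrative choices, not part of the original code):

```matlab
% Sketch: gradient descent with backtracking and an iteration cap.
% Assumes ErrF15 returns the scalar MSE and curr_pars, gamma, h exist.
maxIter = 5000;                    % hypothetical safety cap
err = ErrF15(curr_pars);
for iter = 1:maxIter
    g = zeros(size(curr_pars));
    for i_v = 1:numel(curr_pars)   % central-difference gradient
        cp = curr_pars; cp(i_v) = cp(i_v) + h;
        cm = curr_pars; cm(i_v) = cm(i_v) - h;
        g(i_v) = (ErrF15(cp) - ErrF15(cm))/(2*h);
    end
    trial = curr_pars - gamma*g;
    if ErrF15(trial) < err         % accept only improving steps
        curr_pars = trial;
        err = ErrF15(curr_pars);
    else
        gamma = gamma/2;           % backtrack: halve the step size
    end
    if err < 0.01, break; end
end
```

This only finds the local minimum nearest the start point; it does not address the global-minimum problem itself, but it prevents a bad gamma from making the error blow up.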


Answers (1)

Kartik on 21 Mar 2023
Hi,
It sounds like you're dealing with a highly nonlinear optimization problem in many variables, which can be challenging for standard local optimization methods. To search for a global minimum, consider a stochastic optimization algorithm such as a genetic algorithm (ga) or particle swarm optimization (particleswarm). These methods explore a large solution space efficiently and can often escape local minima, although they do not guarantee the global optimum.
You can refer to the following MathWorks documentation for information about PSO in MATLAB:
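As a starting point, a minimal particleswarm sketch for a 15-parameter objective (Global Optimization Toolbox); the bounds here are hypothetical placeholders, and ErrF15 is assumed to return the scalar mean square error:

```matlab
% Sketch: particle swarm search over the 15 fit parameters.
% Assumption: ErrF15 accepts a 1x15 vector and returns the scalar MSE.
nvars = 15;
lb = -20*ones(1,nvars);   % hypothetical lower bounds -- set from physics
ub =  20*ones(1,nvars);   % hypothetical upper bounds -- set from physics
opts = optimoptions('particleswarm', 'SwarmSize', 200, 'Display', 'iter');
[bestPars, bestErr] = particleswarm(@ErrF15, nvars, lb, ub, opts);
```

The quality of the result depends strongly on the bounds: the tighter lb and ub bracket the physically plausible region, the better the swarm uses its budget. You can also pass bestPars to lsqnonlin afterwards to polish the solution locally.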
