
Curve Fitting Techniques

4 views (last 30 days)
Clement Wong on 26 Jul 2011
Answered: Caleb Downard on 8 May 2020
Hello everyone,
I have a project I'm working on which requires that I search a 3-parameter space for a best-fit curve. Unfortunately, the curve cannot be described by an explicit function. To generate the best fit, the process I have been using involves varying the 3 parameters, generating a test curve from the parameters, subtracting my experimental data, and then computing the RMS of the residual to search for the lowest RMS value.
I'm wondering if there is any better way to do this, since my current method is a "brute force" method where I search large sections of parameter space. This ends up taking hours to finish solving (reaching a stable minimum for the RMS). For example, I know there is a built-in least-squares fit in MATLAB, but it requires that you provide a function with a Jacobian. Is there any similar process for non-explicit functions?
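To make it concrete, the loop I'm running now is roughly the following (model_from_params stands in for my 12-equation solver, and the linspace ranges are just placeholders for the parameter ranges I scan):
% Brute-force scan: grid over the 3 parameters and keep the set with the
% lowest RMS error against the measured data (xdata, ydata).
bestRMS = Inf;
for a = linspace(a_min, a_max, 50)
    for w = linspace(w_min, w_max, 50)
        for c = linspace(c_min, c_max, 50)
            ytest  = model_from_params(a, w, c, xdata);   % test curve
            rmsErr = sqrt(mean((ytest - ydata).^2));      % RMS of residual
            if rmsErr < bestRMS
                bestRMS    = rmsErr;
                bestParams = [a w c];
            end
        end
    end
end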

Accepted Answer

Bjorn Gustavsson on 26 Jul 2011
fminsearch (and functions that build on it, such as John d'Errico's fminsearchbnd and others on the File Exchange) does not need explicit derivatives. As long as you can calculate the curve from your parameters, you should be able to run a least-squares-type minimization with those tools.
HTH
2 Comments
Clement Wong on 26 Jul 2011
I'm not sure that will work either. I'm not looking for the minimum of my function; I'm looking for a best fit. A crude example of what I'm trying to do is fitting A*sin(w*t+c*x), where A, w, and c could all change. I have empirical data, and I'm trying to find the best values for the 3 parameters to fit my function.
In a little more detail, my function can only be arrived at by solving a system of 12 equations. However, the final form cannot be explicitly stated, so, unlike the above example, I would not have a "sin(x)" function. When solving the system, I need to plug in test values for each of the parameters, solve, and then compare the resulting values to each value in my empirical data set. Then I find the RMS of the comparison.
Bjorn Gustavsson on 27 Jul 2011
Sure you're looking for a minimum of your _final_ function in the fitting of f(p,x,t) = p(1)*sin(p(2)*t+p(3)*x) to your empirical data (Y). What you do then is (in very mixed notation!):
min_p sum((f(p,x,t)-Y).^2)
That is a minimization. In MATLAB this is easily done:
p_optimal = fminsearch(@(p) sum((f_of_12eqs(p,x,t)-Y).^2),p0)
If you can automate the solving of your 12 equations, and those solutions are continuous and piecewise smooth in the parameters p, then it is possible to make that into a function that will give you Ymodel as a function of p, given x and t - and you're good to go.
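As a self-contained sketch with the toy sin model (replace the anonymous model with a wrapper around your 12-equation solver; the data here are just synthetic):
% Toy example: fit p(1)*sin(p(2)*t + p(3)*x) to noisy data with fminsearch.
t = linspace(0, 10, 200);                      % sample points
x = linspace(0, 1, 200);
model = @(p, x, t) p(1)*sin(p(2)*t + p(3)*x);  % stand-in for the 12-eq solver
Y = model([2 1.5 0.7], x, t) + 0.05*randn(size(t));   % synthetic "measured" data
cost = @(p) sum((model(p, x, t) - Y).^2);      % sum-of-squares misfit
p0 = [1 1 1];                                  % initial guess
p_optimal = fminsearch(cost, p0)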


More Answers (1)

Caleb Downard on 8 May 2020
Super late on this, but the regress function could work. It performs regression analysis on data sets, so it's a good way to find fitting constants. I'm not sure how it would work on trig functions because I'm not a math guy. I use it for analyzing data sets with unknown fitting constants. I just used it to fit a function of the form y = m1*x1+m2*x2+c, where x1 and x2 were two different arrays of data.
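For that linear-in-the-coefficients case, the call looks roughly like this (x1, x2, and y are the data vectors; regress ships with the Statistics and Machine Learning Toolbox):
% Fit y = m1*x1 + m2*x2 + c; the column of ones supplies the intercept c.
X = [x1(:), x2(:), ones(numel(x1), 1)];
b = regress(y(:), X);        % b(1) = m1, b(2) = m2, b(3) = c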
