Neural network for curve fitting (estimating function parameters)
Hi, I have a question regarding Neural network fitting.
I have curves where I know the functional form but not its parameters (say, for example, y = a*x^2 + b*x + c), and the data contain some noise.
I need to estimate the parameters a, b, c. Currently I do this by minimizing the squared error with fminsearch, but it sometimes has trouble fitting the curve properly (my real data are quite noisy and the curve has 6 parameters). So I wanted to try a different approach, and neural fitting looks quite promising. However, I have never done anything with neural networks, so I wanted to ask whether they can be used for estimating the parameters of a curve (I have a lot of simulated data where I know the parameters, which could be used for training). Also, in the Neural Net Fitting app, should I use the values of y as input and the values of a, b, c as target?
Thank you
0 Comments
Answers (1)
Bjorn Gustavsson
6 Feb 2023
In any type of parameter-fitting problem you face the curse of dimensionality in one form or another, because the number of dimensions in your search-space increases with the number of parameters. Evaluating your error-function for all combinations of three values per parameter goes from 3 evaluations for 1 parameter, to 9 for 2, and 729 for 6 parameters. In addition, the error-function typically has a more complex shape to search through, often with different regions of attraction to different local minima. The search therefore becomes increasingly difficult the more parameters you fit. One additional stumbling-stone is a redundant (or nearly redundant) parameter, for example a model-function of the form:
y = a*x^2 + (b1 + b2)*x + c
(only for illustration purposes), where the coefficient of the linear term is split into two redundant parameters b1 and b2. This often makes the optimization waste time trying to find optimal values for these two parameters in vain.
Trying to solve these types of issues by turning to a neural network looks to me like trading the black-box machines designed for exactly these problems for an even more general black-box; that seems the wrong way to go.
My suggestion is instead that you try a more considered approach to the fitting:
1. Make sure your parameters are truly non-redundant (this is a pitfall that stumped me a couple of times before I realized I couldn't be that lazy).
2. Try to constrain the search-space: positivity-constraints, known ranges for any of the parameters; anything that reduces the parameter-space should help. If you don't have the Optimization Toolbox you can still use a constrained wrapper around fminsearch, such as fminsearchbnd from the File Exchange.
3. Try lsqnonlin instead of fminsearch; it is often more efficient. The only change needed is to convert your error-function from returning the sum of squared residuals to a residual-function returning the individual residuals.
4. Definitely try multi-start optimization, to avoid getting trapped in a local minimum.
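Points 3 and 4 might look like the sketch below, using the quadratic from the question as the model. This is only an illustration: x, y, and an initial guess p0 are assumed to already exist in your workspace, and the bound on a is an example, not a recommendation.

```matlab
% Example model from the question: y = a*x^2 + b*x + c, with p = [a b c].
model = @(p, x) p(1)*x.^2 + p(2)*x + p(3);

% For fminsearch the error-function returns a scalar:
%   err = @(p) sum((model(p, x) - y).^2);
% For lsqnonlin it must instead return the vector of individual residuals:
res = @(p) model(p, x) - y;

% Point 2: pass any known bounds (here a > 0, purely as an illustration).
lb = [0, -Inf, -Inf];
ub = [Inf, Inf, Inf];

% Point 4: simple multi-start -- refit from perturbed initial guesses and
% keep the solution with the smallest residual norm (sum of squares).
bestnorm = inf;
for k = 1:20
    p0k = p0.*(1 + 0.25*randn(size(p0)));     % perturbed starting point
    [pk, resnorm] = lsqnonlin(res, p0k, lb, ub);
    if resnorm < bestnorm
        bestnorm = resnorm;
        pfit = pk;
    end
end
```

If you have the Global Optimization Toolbox, its MultiStart solver automates this loop, but a plain loop like the above already goes a long way.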
HTH
2 Comments
Bjorn Gustavsson
7 Feb 2023
On point 1: sometimes I've had problems where one or a few parameters were "nearly redundant", or only marginally possible to estimate. For example, when fitting a 2-sided Gaussian with different widths, the two widths and x0 can make the difference in widths close to redundant in some cases. For such cases I've had use of stepwise sub-space fitting: first fit a normal Gaussian, then use those parameters as the initial guess for the 2-sided Gaussian fit, which can then be more sensibly constrained. Maybe this situation applies to your case.
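A minimal sketch of that stepwise idea, assuming x and y hold your data (the anonymous functions and starting guesses below are illustrative, not part of the original answer):

```matlab
% Stage 1: ordinary symmetric Gaussian, p = [A, x0, sigma].
g1 = @(p, x) p(1)*exp(-(x - p(2)).^2/(2*p(3)^2));
p1 = lsqnonlin(@(p) g1(p, x) - y, [max(y), mean(x), std(x)]);

% Stage 2: 2-sided Gaussian, p = [A, x0, sigmaL, sigmaR], seeded with the
% symmetric result so both widths start from the same fitted value.
g2 = @(p, x) p(1)*exp(-(x - p(2)).^2 ./ ...
             (2*(p(3)*(x < p(2)) + p(4)*(x >= p(2))).^2));
p2 = lsqnonlin(@(p) g2(p, x) - y, [p1, p1(3)]);
```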