Error in evaluating a polynomial model as a function of its variables
Hi. I used the polyfitn function on my data, which has 8 independent variables and one dependent variable. I would like to use the fitted model to fill in missing values. So I ran polyfitn on the data I have, and then ran polyvaln on the resulting model, but I am getting this error:
"Error using polyvaln (line 39) Size of indepvar array and this model are inconsistent."
The polynomial model looks like this:
p =
ModelTerms: [495x8 double]
Coefficients: [1x495 double]
ParameterVar: [1x495 double]
ParameterStd: [1x495 double]
DoF: 879
p: [1x495 double]
R2: 0.7664
AdjustedR2: 0.6351
RMSE: 8.0472
VarNames: {'X1' 'X2' 'X3' 'X4' 'X5' 'X6' 'X7' 'X8'}
Any help would be appreciated.
Answers (2)
the cyclist
24 Feb 2016
Edited: the cyclist, 24 Feb 2016
Have you carefully read the documentation of these functions, and are you certain you are calling them correctly? John D'Errico's submissions are typically flawless.
Almost certainly, you have some kind of dimension mismatch. It looks like you should be calling polyvaln like this:
ypred = polyvaln(p,X)
where X is an N-by-8 numeric array (if I understand the syntax correctly).
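As an illustration (a hypothetical sketch with made-up array sizes, not the poster's actual data): the error in the question is exactly what polyvaln throws when the number of columns in the indepvar array does not match the number of variables the model was fit with.

```matlab
% Hypothetical sketch: polyvaln needs one column per model variable.
% This model was fit with 8 variables, so X must be N-by-8.
Xnew  = rand(10,8);          % 10 new points, 8 variables -- matches the model
ypred = polyvaln(p, Xnew);   % OK: returns a 10-by-1 vector of predictions

Xbad  = rand(10,3);          % only 3 columns -- does not match
% polyvaln(p, Xbad)          % errors: "Size of indepvar array and this
%                            % model are inconsistent."
```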
May I suggest you post your code and a small sample that exhibits the problem? Otherwise, we'll just be guessing at the solution.
Comments: 4
John D'Errico
24 Feb 2016
Edited: John D'Errico, 24 Feb 2016
Now that I have your data, I see that this is a classically nasty problem. I'm not surprised: 8 independent variables make for a difficult fitting problem.
The problem is that MANY of the coefficients in the model generated by polyfitn are worthless for prediction. So let's take a look at whether polyfitn thinks those terms are useful in the model, or whether they are just dead wood.
mdl = polyfitn(x,y,4);   % full 4th-degree model in 8 variables
hist(mdl.p,100)          % histogram of the coefficient p-values
A simple scheme, but terms with a low p-value are the ones statistically unlikely to have a zero coefficient; large values of p here flag terms that MAY be unnecessary in the model. So in fact, many of those terms are not useful as predictors. The problem is that if they stay in the model, they still do SOMETHING to your predictive ability away from the actual data points.
Essentially, I think you are over-fitting the data. What happens is those useless terms now do squirrelly things to the predictor between the data points.
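One way to act on those p-values (a sketch, assuming mdl is the polyfitn result above; the 0.05 cutoff is an arbitrary choice, not something from the original answer) is to refit with only the terms that look significant:

```matlab
% Keep only the terms whose p-value suggests a nonzero coefficient,
% then refit using just that subset of ModelTerms.
alpha = 0.05;                    % significance cutoff (a choice, not a rule)
keep  = mdl.p < alpha;           % logical index of the "useful" terms
mdl_small = polyfitn(x, y, mdl.ModelTerms(keep,:));
```

This is cruder than a proper stepwise search, since dropping one term changes the p-values of the rest, which is why a stepwise tool is used below.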
E = mdl.ModelTerms;
nt = size(E,1)
nt =
495
size(x)
ans =
1374 8
To estimate 495 coefficients from 1374 data points is pushing the limits of what can be done. A tool like a stepwise regression tool might help to resolve which of those terms are actually of any predictive utility. Luckily, stepwise is in the stats toolbox.
A = zeros(1374,494);   % one column per non-constant model term
for i = 1:(nt-1)
    % evaluate term i of the polynomial at every data point
    A(:,i) = prod(bsxfun(@power,x,E(i,:)),2);
end
stepwise(A,y,[])
This generated a set of 7 predictors that seem to be clearly significant, with a final R^2 of roughly 0.11. Not a terribly good fit. Pushing stepwise a bit harder, by including 101 terms plus a constant, I can get the R^2 up to 0.49.
stepwise(A,y,[],.1,.2)
stats
stats =
intercept: -6.9961
rmse: 12.333
rsq: 0.49196
adjrsq: 0.45162
fstat: 12.195
pval: 5.1172e-127
As a check to see if polyfitn agrees:
mdlterms = E([in1,495],:);
p102 = polyfitn(x,y,mdlterms)
p102 =
ModelTerms: [102x8 double]
Coefficients: [1x102 double]
ParameterVar: [1x102 double]
ParameterStd: [1x102 double]
DoF: 1272
p: [1x102 double]
R2: 0.49196
AdjustedR2: 0.45162
RMSE: 11.867
VarNames: {'' '' '' '' '' '' '' ''}
However, I cannot test this model to see how well it would do, since you did not include fg in the test2.mat file. Admittedly, I don't expect it to do terribly well.
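If held-out data were available (Xtest and ytest here are hypothetical names, since the test values were not in the file), the check would be a couple of lines with polyvaln:

```matlab
% Hypothetical out-of-sample check for the reduced model p102:
ypred     = polyvaln(p102, Xtest);            % Xtest must be N-by-8
rmse_test = sqrt(mean((ytest - ypred).^2))    % compare with the in-sample RMSE
```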