Residual values for a linear regression fit

8 views (last 30 days)
NA on 16 Oct 2020
Commented: Star Strider on 17 Oct 2020
I have these points
x = [1,1,2,2,3,4,4,6]';
y = [8,1,1,2,2,3,4,1]';
I want to remove the point from the above set that makes the residual largest.
This is the code I use:
d = zeros(length(x),1);
for i = 1:length(x)
    x_bk = x;
    y_bk = y;
    x(i) = [];
    y(i) = [];
    X = [ones(length(x),1) x];
    b = X\y;
    yhat = X*b;
    d(i) = abs(sum(y - yhat));
    x = x_bk;
    y = y_bk;
end
index = find(min(d)==d);
x(index) = [];
y(index) = [];
X = [ones(length(x),1) x];
b = X\y;
yhat_r = X*b;
plot(x,y,'o')
hold on
plot(x,yhat_r,'--')
I think the result should be the black line (attached file), but I get the red dashed line instead.

Accepted Answer

Star Strider on 16 Oct 2020
I would do something like this:
x = [1,1,2,2,3,4,4,6]';
y = [8,1,1,2,2,3,4,1]';
rsdn = zeros(1,numel(x));                  % residual norm of the fit with row k left out
bmtx = zeros(2,numel(x));                  % [slope; intercept] of the fit with row k left out
for k = 1:numel(x)
    xv = x;
    yv = y;
    xv(k) = [];                            % leave out observation k
    yv(k) = [];
    X = [xv(:), ones(size(xv(:)))];
    bmtx(:,k) = X \ yv(:);                 % least-squares coefficients
    rsdn(k) = norm(yv - X*bmtx(:,k));      % norm of the residuals
end
figure
plot((1:numel(x)), rsdn)
grid
[rsdnmin,lowest] = min(rsdn)
[rsdnmax,highest] = max(rsdn)
idxv = [lowest; highest];
figure
for k = 1:2
    subplot(2,1,k)
    xv = x;
    yv = y;
    xv(idxv(k)) = [];
    yv(idxv(k)) = [];
    plot(xv, yv, 'ob')
    yhat = [xv(:), ones(size(xv(:)))]*bmtx(:,idxv(k));   % fitted line without that row
    hold on
    plot(xv, yhat, '--r')
    hold off
    title(sprintf('Eliminating Set %d', idxv(k)))
end
Here, the norm of residuals (the usual metric) is smallest when eliminating row 1 (the (1,8) point) and largest when eliminating row 5.
Experiment to get the result you want.
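A side note on the criterion in the question's code: for a least-squares fit that includes an intercept column, the residuals sum to (numerically) zero, so abs(sum(y - yhat)) is essentially round-off noise and cannot rank the leave-one-out fits, whereas the residual norm can. A minimal sketch with the posted data:
x = [1,1,2,2,3,4,4,6]';
y = [8,1,1,2,2,3,4,1]';
X = [ones(size(x)), x];
b = X \ y;                  % least-squares fit with an intercept
r = y - X*b;                % residual vector
abs(sum(r))                 % ~0 (on the order of eps) for any intercept fit
norm(r)                     % the residual norm actually measures the misfit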
6 Comments
NA on 17 Oct 2020
I want to show that if I remove only one set of data, the regression line changes a lot. (But I do not know whether this is practically true or not.)
For this reason, I made this set:
a0 = 4.5882;
a1 = 0.2353;
x = (0:1:8)';
y = a0 + a1*x + randn(size(x));
But it does not show any difference (please see the attachment). I think the way I am producing the data set is not correct.
Star Strider on 17 Oct 2020
In that simulation, you are defining a particular slope and intercept and adding a normally-distributed random vector to it. The slopes and intercepts of the fitted lines will not change much.
You can see that most easily if you add this text call to each plot (in the loop):
text(1.1*min(xlim),0.9*max(ylim), sprintf('Y = %.3f\\cdotX%+.3f',bmtx(:,idxv(k))), 'HorizontalAlignment','left')
That will print the regression equation in the upper-left corner of each one. You can then compare them.
Note that the residual norms do not change much, either. In the original data set, they varied between 2.73 and 5.97. In this data set, they are within about ±0.5 of each other.
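For example (just a sketch, not from the original posts), injecting a single large outlier into that simulated data reproduces the behaviour of the first data set: the residual norms spread out, and the fit that leaves out the outlier differs visibly from the full fit. The outlier position and size below are arbitrary assumptions:
a0 = 4.5882;
a1 = 0.2353;
x = (0:1:8)';
y = a0 + a1*x + randn(size(x));
y(3) = y(3) + 10;                       % single large outlier at the 3rd point (assumed)
rsdn = zeros(1,numel(x));
bmtx = zeros(2,numel(x));
for k = 1:numel(x)
    xv = x;
    yv = y;
    xv(k) = [];                         % leave out observation k
    yv(k) = [];
    X = [xv(:), ones(size(xv(:)))];
    bmtx(:,k) = X \ yv(:);
    rsdn(k) = norm(yv - X*bmtx(:,k));
end
[~,idxout] = min(rsdn)                  % leaving out the outlier gives the smallest norm
figure
subplot(2,1,1)
xv = x;
yv = y;
xv(idxout) = [];
yv(idxout) = [];
plot(xv, yv, 'ob')
hold on
plot(xv, [xv(:), ones(size(xv(:)))]*bmtx(:,idxout), '--r')
hold off
text(1.1*min(xlim),0.9*max(ylim), sprintf('Y = %.3f\\cdotX%+.3f',bmtx(:,idxout)), 'HorizontalAlignment','left')
title(sprintf('Eliminating Set %d', idxout))
subplot(2,1,2)
bfull = [x(:), ones(size(x(:)))] \ y(:);    % fit to the full simulated set, for comparison
plot(x, y, 'ob')
hold on
plot(x, [x(:), ones(size(x(:)))]*bfull, '--r')
hold off
text(1.1*min(xlim),0.9*max(ylim), sprintf('Y = %.3f\\cdotX%+.3f',bfull), 'HorizontalAlignment','left')
title('Full Simulated Set')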


More Answers (0)
