NARXNET: Validation stop
Hi,
I am using the code below to fit the data in the appendix. Since the data set is huge (1x984468), I am starting with a smaller subset (1:20000). The problem is that training always ends with "Validation stop." and never reaches the minimum gradient, even though I am not exceeding Hmax.
I also tried the code on the nndata set valve_dataset; the results are shown below.
Can anyone see the problem?
Thanks a lot!
plt = 0;
tic
% x = u1(1:20000);
% t = y1(1:20000);
% X = con2seq(x);
% T = con2seq(t);
[X,T] = valve_dataset;
x = cell2mat(X);
t = cell2mat(T);
[ I N ] = size(X);
[ O N ] = size(T);
MSE00 = mean(var(t',1))
MSE00a = mean(var(t',0))
% Normalization (x and t are already cell2mat'd above)
zx = zscore(x, 1);
zt = zscore(t, 1);
Ntrn = N-2*round(0.15*N);
trnind = 1:Ntrn;
Ttrn = T(trnind);
Neq = prod(size(Ttrn));
%%Determine significant lags
%{
plt=plt+1,figure(plt)
subplot(211)
plot(t)
title('SIMPLENAR SERIES')
subplot(212)
plot(zt)
title('STANDARDIZED SERIES')
rng('default')
L = floor(0.95*(2*N-1))
for i = 1:100
    n = randn(1,N);   % draw a new noise realization each trial
    autocorrn = nncorr(n, n, N-1, 'biased');
    sortabsautocorrn = sort(abs(autocorrn));
    thresh95(i) = sortabsautocorrn(L);
end
sigthresh95 = mean(thresh95) % 0.2194
autocorrt = nncorr(zt,zt,N-1,'biased');
siglag95 = -1+ find(abs(autocorrt(N:2*N-1))>=sigthresh95);
plt = plt+1, figure(plt)
hold on
plot(0:N-1, -sigthresh95*ones(1,N),'b--')
plot(0:N-1, zeros(1,N),'k')
plot(0:N-1, sigthresh95*ones(1,N),'b--')
plot(0:N-1, autocorrt(N:2*N-1))
plot(siglag95,autocorrt(N+siglag95),'ro')
title('SIGNIFICANT SIMPLENAR AUTOCORRELATIONS')
%INPUT-TARGET CROSSCORRELATION
%
crosscorrxt = nncorr(zx,zt,N-1,'biased');
sigilag95 = -1+ find(abs(crosscorrxt(N:2*N-1))>=sigthresh95); %significant feedback lag
%
plt = plt+1, figure(plt)
hold on
plot(0:N-1, -sigthresh95*ones(1,N),'b--')
plot(0:N-1, zeros(1,N),'k')
plot(0:N-1, sigthresh95*ones(1,N),'b--')
plot(0:N-1, crosscorrxt(N:2*N-1))
plot(sigilag95,crosscorrxt(N+sigilag95),'ro')
title('SIGNIFICANT INPUT-TARGET CROSSCORRELATIONS')
%}
FD = 1:1; % subset of the significant feedback lags (siglag95)
ID = 1:2; % subset of the significant input lags (sigilag95) from the crosscorrelation
NFD = length(FD);
NID = length(ID);
MXFD = max(FD);
MXID = max(ID);
Ntrneq = prod(size(t));
Hub = -1+ceil( (Ntrneq-O) / ((NID*I)+(NFD*O)+1));
Hmax = floor(Hub/50);
Hmin = 0;
dh = 1;
Ntrials = 10;
j = 0;
rng(0)
for h = Hmin:dh:Hmax
    fprintf('_____________H %d/%d_____________\n', h, Hmax)
    j = j+1
    if h == 0
        net = narxnet(ID, FD, []);                  % no hidden layer (linear model)
        Nw = (NID*I + NFD*O + 1)*O                  % number of weights
    else
        net = narxnet(ID, FD, h);
        Nw = (NID*I + NFD*O + 1)*h + (h + 1)*O
    end
    Ndof = Ntrn - Nw                                % estimation degrees of freedom
    [Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
    ts = cell2mat(Ts);
    xs = cell2mat(Xs);
    MSE00s = mean(var(ts', 1))
    MSE00as = mean(var(ts'))
    MSEgoal = max(0, 0.01*Ndof*MSE00as/Neq)
    MinGrad = MSEgoal/100
    net.trainParam.goal = MSEgoal;
    net.trainParam.min_grad = MinGrad;
    net.divideFcn = 'divideblock';
    net.divideParam.trainRatio = 70/100;
    net.divideParam.testRatio = 15/100;
    net.divideParam.valRatio = 15/100;
    for i = 1:Ntrials
        net = configure(net, Xs, Ts);
        [net, tr, Ys] = train(net, Xs, Ts, Xi, Ai);
        ys = cell2mat(Ys);
        stopcrit{i,j} = tr.stop
        bestepoch(i,j) = tr.best_epoch
        MSE = mse(ts - ys)
        MSEa = Neq*MSE/Ndof
        R2(i,j) = 1 - MSE/MSE00s
        R2a(i,j) = 1 - MSEa/MSE00as
    end
end
stopcrit = stopcrit
bestepoch = bestepoch
R2 = R2
R2a = R2a
Totaltime = toc


stopcrit =
Columns 1 through 2
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
Columns 3 through 4
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Minimum gradient...'
'Validation stop.' 'Minimum gradient...'
'Validation stop.' 'Validation stop.'
'Minimum gradient...' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
Columns 5 through 6
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
Columns 7 through 8
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
Column 9
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
R2 =
Columns 1 through 6
0.9109 0.9116 0.9196 0.9156 0.9236 0.9242
0.9109 0.9116 0.9154 0.9213 0.9124 0.9229
0.9109 0.9116 0.9125 0.9185 0.9202 0.9285
0.9109 0.9116 0.9118 0.9230 0.9311 0.9253
0.9109 0.9117 0.9118 0.9201 0.9343 0.9240
0.9109 0.9115 0.9111 0.9125 0.9338 0.9224
0.9109 0.9116 0.9170 0.9177 0.9188 0.9353
0.9109 0.9116 0.9118 0.9137 0.9292 0.9320
0.9109 0.9115 0.9125 0.9129 0.9312 0.9313
0.9109 0.9116 0.9127 0.9286 0.9187 0.9199
Columns 7 through 9
0.9303 0.9336 0.9344
0.9282 0.9146 0.9393
0.9359 0.9305 0.9378
0.9182 0.9306 0.9355
0.9212 0.9384 0.9321
0.9334 0.9195 0.9339
0.9200 0.9374 0.9310
0.9239 0.9334 0.9175
0.9305 0.9201 0.9320
0.9274 0.9394 0.9311
R2a =
Columns 1 through 6
0.9107 0.9112 0.9190 0.9146 0.9223 0.9226
0.9107 0.9112 0.9147 0.9204 0.9110 0.9213
0.9107 0.9112 0.9118 0.9175 0.9189 0.9270
0.9107 0.9112 0.9110 0.9221 0.9300 0.9238
0.9107 0.9113 0.9111 0.9192 0.9332 0.9225
0.9107 0.9112 0.9104 0.9114 0.9327 0.9208
0.9107 0.9112 0.9163 0.9167 0.9175 0.9339
0.9107 0.9112 0.9111 0.9126 0.9281 0.9306
0.9107 0.9112 0.9118 0.9119 0.9300 0.9298
0.9107 0.9112 0.9120 0.9277 0.9174 0.9183
Columns 7 through 9
0.9286 0.9317 0.9322
0.9264 0.9121 0.9373
0.9343 0.9285 0.9357
0.9162 0.9286 0.9334
0.9192 0.9366 0.9298
0.9317 0.9172 0.9317
0.9180 0.9356 0.9287
0.9220 0.9315 0.9148
0.9288 0.9178 0.9297
0.9256 0.9377 0.9288
Comments: 2
Christian Hofstetter
25 Aug 2017
Greg Heath
13 Sep 2017
I clicked on Data.mat but cannot find it on my machine. Does anyone know where it is hiding?
Greg (a failure at computer nerdism)
Accepted Answer
More Answers (2)
Greg Heath
28 Aug 2017
Validation stopping prevents overtraining an overfit net: although the training error is still decreasing, the net's ability to perform satisfactorily on nontraining data (represented by the validation subset) is getting worse.
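As an aside, if you simply want training to tolerate more consecutive rises of the validation error before it halts, the max_fail training parameter controls that. A minimal sketch, assuming the default TRAINLM training function; the value 20 is arbitrary:

net = narxnet(1:2, 1:1, 10);    % delays and hidden nodes as in the question
net.trainParam.max_fail = 20;   % allow more validation-error increases (default is 6)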
Alternatives
1. Choose the best of multiple designs: minimize the number of hidden nodes subject to an upper bound on the training error. For open-loop time series I tend to use
MSEgoal = 0.005*mean(var(target',1))
2. Bayesian Regularization via TRAINBR.
3. If newer versions of MATLAB allow it, combine TRAINBR with validation stopping.
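Alternative 2 might look like the following sketch (hypothetical delay/hidden-node settings; 'dividetrain' hands TRAINBR all of the data, since Bayesian regularization supplies its own complexity control instead of a validation split):

% Sketch: Bayesian regularization instead of validation stopping.
[X, T] = valve_dataset;
net = narxnet(1:2, 1:1, 10);     % input delays, feedback delays, hidden nodes
net.trainFcn = 'trainbr';        % Bayesian regularization (TRAINBR)
net.divideFcn = 'dividetrain';   % use all data for training; no val/test split
[Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
net = train(net, Xs, Ts, Xi, Ai);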
Hope this helps.
Thank you for formally accepting my answer
Greg
Comments: 1
Christian Hofstetter
28 Aug 2017
Greg Heath
3 Sep 2017
Edited: Greg Heath, 13 Sep 2017
1. The ULTIMATE GOAL OF NN TRAINING is that the performance measures of BOTH
a. the training data, and
b. nontraining data that have the same summary statistics as the training data
are less than a specified upper bound.
2. THEREFORE, it does not matter whether the training-subset error is minimized or not.
3. HOWEVER, the most common way to achieve the goal in 1 is to
a. use a minimization algorithm to reduce the training-subset error, and
b. stop training when either of the following reaches a local minimum:
i. the training-subset error
ii. the nontraining validation-subset error
Hope this helps.
Thank you for formally accepting my answer
Greg