validation error in neural network

hossein on 8 Feb 2015
Edited: Greg Heath on 11 Feb 2015
Dear friends, I have tried to train an MLP using the newff and train functions, but after a few epochs the validation error stops the training procedure. Is there any solution or alternative to prevent the validation stop?
regards

Accepted Answer

Greg Heath on 11 Feb 2015
Edited: Greg Heath on 11 Feb 2015
% 0. One hidden layer is sufficient provided there are enough hidden nodes.
% 1. Use t for target and y for output
% 2. For regression use the current FITNET instead of the obsolete NEWFF.
% 3. As indicated below, Validation Stopping is useful for preventing highly biased training data performance from influencing the design of a net which is ultimately created for use on NONDESIGN data. This is especially important when the number of training equations Ntrneq = Ntrn*O is not sufficiently greater than the number of unknown weights Nw = (I+1)*H+(H+1)*O.
% 4. Data divisions (see the divide-parameter sketch after point 8)
DATA = DESIGN + NONDESIGN
DESIGN = TRAINING + VALIDATION
NONDESIGN = TEST + UNAVAILABLE
NONTRAINING = VALIDATION + NONDESIGN
% 5. Use the DESIGN TRAINING data to estimate weights. Using the same data to estimate performance can result in highly optimistic biased estimates.
% 6. Use the DESIGN VALIDATION data to obtain less biased performance estimates to
a. Prevent worse performance on NONDESIGN data.
b. Rank the performances of multiple designs
c. Choose the multiple designs used to estimate summary performance statistics
% 7. Assume the UNBIASED NONDESIGN TEST data performance is representative of the performance on UNAVAILABLE data.
% 8. Evaluate performance via the summary statistics of the UNBIASED performance on NONDESIGN (TEST + UNAVAILABLE) data chosen in 6c.
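For example, with FITNET the division in point 4 can be set explicitly; the sketch below just makes the toolbox defaults visible (the ratios shown are the defaults, added here for illustration only):
net = fitnet(10);                   % hypothetical net; H = 10 is only a placeholder
net.divideFcn = 'dividerand';       % random data division (the default)
net.divideParam.trainRatio = 0.70;  % DESIGN: TRAINING
net.divideParam.valRatio   = 0.15;  % DESIGN: VALIDATION (drives validation stopping)
net.divideParam.testRatio  = 0.15;  % NONDESIGN: TEST (unbiased performance estimate)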
close all, clear all, clc, plt=0
x = -5:.05:5;
t = (2*sin(x).*cos(3*x)+cos(10*x)).*sin(x);
[ I N ] = size(x) % [ 1 201]
[ O N ] = size(t) % [ 1 201]
Ntst = round(0.15*N) % 30
Nval = Ntst % 30
Ntrn = N-Nval-Ntst % 141
plt = plt+1, figure(plt)
plot( x, t, 'LineWidth', 2 )
% 17 local minima / 18 local maxima => 36 hidden nodes
H = 36
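% Added check of point 3 (illustration): training equations vs. unknown weights
Ntrneq = Ntrn*O                % 141 training equations
Nw = (I+1)*H+(H+1)*O           % 109 unknown weights; Ntrneq > Nw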
net = newff( x, t, H );
MSE00 = var(t',1) % 1.0041 Reference MSE
rng(4151941) % For duplicating the design
[ net tr y e ] = train (net, x, t );
hold on
plot( x, y, 'r', 'LineWidth', 2 )
NMSE = mse(e)/MSE00 % 2.5414e-3
R2 = 1-NMSE % 0.99746 Rsquared (See Wikipedia)
tr = tr % No semicolon
stopcrit = tr.stop % Validation stop
R2trn = 1-tr.best_perf/MSE00 % 0.99889
R2val = 1-tr.best_vperf/MSE00 % 0.99542
R2tst = 1-tr.best_tperf/MSE00 % 0.99277
% Using MSEtrn00 instead of MSE00 shouldn't make much difference (Use tr.trainInd to check if you don't believe me)
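% For example (added sketch, using the tr structure returned by train above):
MSEtrn00 = var( t(tr.trainInd)', 1 )   % reference MSE of the training subset only
R2trn2 = 1 - tr.best_perf/MSEtrn00     % compare with R2trn above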
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Greg Heath on 9 Feb 2015
The validation stopping occurs because the net is performing badly on nontraining design data. You don't want to overcome it; you want to start designing a better net which works well on BOTH design training data (train) AND design nontraining data (validation).
I typically use a double for-loop to train at least Ntrials = 10 different nets for each of ~10 candidate values of H = Hmin:dH:Hmax (<= Hub), the number of hidden nodes. The nets differ by the initial state of the random number generator, which determines both the initial weights AND the data division.
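A minimal sketch of that double loop (the bounds Hmin, dH, Hmax below are placeholders, not recommended values; fitnet is used for regression):
Hmin = 5; dH = 5; Hmax = 50;      % candidate hidden node counts (placeholders)
Ntrials = 10;                     % nets per candidate H
rng(0)                            % reproducible initial weights and data divisions
j = 0;
for H = Hmin:dH:Hmax
    j = j+1;
    for i = 1:Ntrials
        net = fitnet(H);
        [ net tr y e ] = train( net, x, t );
        R2val(i,j) = 1 - tr.best_vperf/var(t',1);  % rank designs by validation R^2
        R2tst(i,j) = 1 - tr.best_tperf/var(t',1);  % summarize on unbiased test data
    end
end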
I have posted many examples in the NEWSGROUP and ANSWERS. Search on subsets of
greg, Hmax or Hub, Ntrials, fitnet or patternnet
Hope this helps,
Thank you for formally accepting my answer
Greg
1 Comment
hossein on 10 Feb 2015
Edited: hossein on 10 Feb 2015
Dear Greg
Thank you for your answer. As far as I understand, you mean I should try different networks by varying the number of hidden layers and nodes, but that did not work correctly.
Here I prepared a simple code for this problem, and each time I changed the matrix XX1 to vary the number of nodes and the number of hidden layers, but each time the validation error stops the training:
x = -5:.05:5;                                   % input samples
y = (2*sin(x).*cos(3*x)+cos(10*x)).*sin(x);     % target function
net1 = newff(x, y, [XX1]);                      % XX1 = vector of hidden layer sizes
net1 = train(net1, x, y);                       % training stops early on validation
o1 = sim(net1, x);                              % network output
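(For example, values of the following kind were tried for XX1; the specific numbers here are illustrative only:)
XX1 = 20;        % one hidden layer with 20 nodes
XX1 = [10 10];   % two hidden layers with 10 nodes each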
