Problem with validation check number for MATLAB neural network

Views: 4 (last 30 days)
Ady
Ady on 3 Apr 2016
Edited: Greg Heath on 5 Apr 2016
Hi all. My neural network is for printed text recognition. I used this dataset: https://archive.ics.uci.edu/ml/datasets/Letter+Recognition My question is: when the data is divided 70% for training, 15% for validation and 15% for testing, why are all the plots the same, and why is ''validation check'' = 0 every time? Training continues until the maximum number of epochs. This is part of my code:
targets = full(ind2vec(letters)); %matrix 26x16000 targets
inputs = train_set.';% matrix 16x16000 inputs
net= patternnet(40,'traingd');
net.trainparam.epochs = 1300;
net.performFcn = 'mse';
net.performParam.ratio = 0.5;
net.trainParam.goal = 1e-2;
net.trainParam.show = 1;
net.trainParam.lr = 0.1;
net.trainParam.max_fail = 5;
% Choose Input and Output Pre/Post-Processing Functions
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Setup Division of Data for Training, Validation, Testing
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
net = train(net,inputs,targets)
%test
outputs = net(inputs);
errors = gsubtract(targets,outputs);
%sim
sim_attribs = attribs(end-3999:end, :);
check = sim_attribs.';
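One way to see why the validation check never increments is to inspect the training record returned by train (a sketch, assuming the net and data above; tr.vperf holds the validation performance at each epoch):

```matlab
% Capture the training record alongside the trained network
[net,tr] = train(net,inputs,targets);

% If tr.vperf decreases monotonically, max_fail is never reached and
% training runs to the epoch limit, exactly as described above.
plot(tr.epoch, tr.vperf)
xlabel('epoch'), ylabel('validation performance')
```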
My second question is: why doesn't the ''confusion'' plot work (it is blurred and nothing is visible)?

Accepted Answer

Greg Heath
Greg Heath on 5 Apr 2016
Edited: Greg Heath on 5 Apr 2016
% Problem with validation check number for MATLAB neural network
% Asked by Ady on 3 Apr 2016 at 17:06
% Hi all. My neural network is for printed text recognition. I used this dataset:
https://archive.ics.uci.edu/ml/datasets/Letter+Recognition
% My question is: when the data is divided 70% for training, 15% for validation and 15% for testing, why are all the plots the same and ''validation check'' = 0? Training continues until the maximum epochs.
NOMENCLATURE: N I-dimensional "I"nput vectors corresponding to N O-dimensional
"O"utput target vectors where, for classification, O = C = number of "C"lasses or
"C"ategories and target vectors are columns of the unit matrix eye(C)
[ I N ] = size(input) % [ 16 16000 ]
[ O N ] = size(target) % [ 26 16000 ]
Ntst = round(0.15*N), Nval=Ntst % 2400
Ntrn = N-2*round(0.15*N) % 11,200
Ntrneq = Ntrn*O % 291,200 training equations
1. You have run multiple trials with different initial random weights and random data divisions. Approximately how many trials?
2. Typically, >~ 30 random training samples per input dimension is adequate for characterizing a distribution.
3. Therefore, your Ntrn/I ~ 11200/16 = 700 random samples per input dimension is more than adequate by a factor of ~23.
4. Consequently, you should be surprised if your training data DID NOT adequately characterize your nontraining data!
5. With this much data you could even consider Ntrn = Nval= Ntst with multiple classifiers trained on different random subsets of data.
6. Personally, I would have also considered
a. Reducing the input dimension (help/doc PLSREGRESS )
b. Training with no more than ~50 random input samples per resulting dimension
c. Using as few hidden nodes as possible. With H hidden nodes the number of unknown
weights is Nw = (I+1)*H+(H+1)*O. For Nw < Ntrneq, H <= Hub where the upper bound
is given by
Hub = (Ntrneq-O)/(I+O+1)
Obviously, H << Hub is preferable.
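Plugging the sizes above into these formulas (a quick command-line check):

```matlab
I = 16; O = 26; H = 40;           % inputs, outputs (classes), hidden nodes
Ntrn   = 11200;                   % training samples after the 70/15/15 split
Ntrneq = Ntrn*O                   % 291200 training equations
Nw  = (I+1)*H + (H+1)*O           % 1746 unknown weights for H = 40
Hub = (Ntrneq-O)/(I+O+1)          % ~6772, so H = 40 << Hub
```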
% This is part from my code:
% targets = full(ind2vec(letters)); %matrix 26x16000 targets
% inputs = train_set.';% matrix 16x16000 inputs
% net= patternnet(40,'traingd');
For H = 40,
Hub = (Ntrneq-O)/(I+O+1) = (291200-26)/43 ~ 6772 % ~ 169*H
==> H = 40 << Hub, so you don't even need a VAL set
% net.trainparam.epochs = 1300;
Isn't the default of 1000 sufficient?
% net.performFcn = 'mse';
??? CROSSENTROPY IS THE DEFAULT FOR CLASSIFICATION
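The classification defaults can be checked directly at the command line (in recent Neural Network Toolbox versions):

```matlab
net = patternnet(40);   % default trainFcn is 'trainscg', not 'traingd'
net.performFcn          % 'crossentropy' is the default for patternnet
% Leaving performFcn alone avoids the 'mse' override above.
```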
% net.performParam.ratio = 0.5;
With large Ntrn, regularization is unnecessary
% net.trainParam.goal = 1e-2;
% net.trainParam.show = 1;
% net.trainParam.lr = 0.1;
% net.trainParam.max_fail = 5;
Can delete next 9 statements that just assign defaults
% % Choose Input and Output Pre/Post-Processing Functions
% net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
% net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% % Setup Division of Data for Training, Validation, Testing
% net.divideFcn = 'dividerand'; % Divide data randomly
% net.divideMode = 'sample'; % Divide up every sample
% net.divideParam.trainRatio = 70/100;
% net.divideParam.valRatio = 15/100;
% net.divideParam.testRatio = 15/100;
% net = train(net,inputs,targets)
% %test
% outputs = net(inputs);
% errors = gsubtract(targets,outputs);
[ net tr outputs errors] = train(net,inputs,targets);
tr = tr % Reveals training details
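A few fields of the training record worth checking (a sketch; these are standard fields of the structure returned by train):

```matlab
tr.best_epoch   % epoch with the lowest validation error
tr.stop         % reason training stopped, e.g. 'Validation stop.'
tr.vperf        % validation performance at each epoch
```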
% %sim
% sim_attribs = attribs(end-3999:end, :);
% check = sim_attribs.';
I don't understand the above.
% My second quest is: why graphic ''confusion'' don't work (she is blurred and nothing is visible) ?
If you were designed to deal with 2 classes and someone dumped 26 on you, wouldn't you feel grateful that the only thing wrong with you is blurring?
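With 26 classes, plotconfusion is hard to read; the raw counts are easier to inspect numerically (a sketch using the trained net; confusionmat requires the Statistics and Machine Learning Toolbox):

```matlab
trueClass = vec2ind(targets);              % 1x16000 class indices
predClass = vec2ind(net(inputs));          % predicted class indices
C = confusionmat(trueClass, predClass);    % 26x26 count matrix
accuracy = sum(diag(C))/sum(C(:))          % overall fraction correct
```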
I'm curious how you decided on some of the values of the nondefault parameters
... I tend to use as many defaults as possible and minimize H.
Hope this helps.
*Thank you for formally accepting my answer*
Greg

More Answers (0)
