NN accuracy on test set low
I have implemented a neural network in MATLAB R2013a for character recognition, using the trainbr function for training. 80% of the samples were used for training and the rest for testing. When I plot the confusion matrix, I get 100% accuracy on the training set, but on the test set the accuracy is very low (around 60%). What could possibly be wrong?
Accepted Answer
More Answers (3)
Greg Heath
13 Mar 2014
2 votes
Insufficient info:
How many characters?
How many examples for each character?
What are the dimensions of the input and target matrices?
Are the summary statistics of the training and test subsets sufficiently similar?
How many input, hidden and output nodes?
What values of hidden nodes did you try?
How many random weight initializations for each value?
Although trainbr should mitigate the effect of using more hidden nodes than are needed, you still need many trials to establish sufficient confidence intervals.
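As an editorial illustration of the confidence-interval point, here is a minimal sketch (in Python, with invented accuracy numbers not taken from this thread) of summarizing test accuracy across multiple random-initialization trials:

```python
import statistics

# Hypothetical test accuracies from 10 nets trained with different
# random weight initializations (illustrative numbers only).
acc = [0.58, 0.61, 0.60, 0.63, 0.59, 0.62, 0.60, 0.61, 0.58, 0.64]

mean = statistics.mean(acc)
sem = statistics.stdev(acc) / len(acc) ** 0.5  # standard error of the mean

# Rough 95% interval assuming approximate normality (z ~= 1.96).
lo, hi = mean - 1.96 * sem, mean + 1.96 * sem
print(f"accuracy = {mean:.3f} +/- {hi - mean:.3f}")
```

With too few trials the interval is wide, which is why Greg asks how many initializations were tried.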
Hope this helps.
Thank you for formally accepting my answer
Greg
6 Comments
Anitha
13 Mar 2014
Greg Heath
14 Mar 2014
Sorry, that does not make sense to me. Consider the following:
13 characters of the alphabet A-to-M
234 examples, 18 for each character
All characters are columnized 8x5 images
size(input) = [ 40 234]
size(target) = [ 13 234] % columns of eye(13)
Where did I go wrong?
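The dimension bookkeeping above can be checked with a short sketch (Python/NumPy assumed; all names here are illustrative, not MATLAB code from the thread):

```python
import numpy as np

# Greg's example layout: 13 classes (A..M), 18 examples each,
# inputs are columnized 8x5 images.
n_classes, per_class = 13, 18
N = n_classes * per_class            # 234 examples

X = np.zeros((8 * 5, N))             # input matrix, one image per column
labels = np.repeat(np.arange(n_classes), per_class)
T = np.eye(n_classes)[:, labels]     # targets: columns of eye(13)

print(X.shape, T.shape)              # expect (40, 234) and (13, 234)
```

Each target column is a one-hot vector, matching "columns of eye(13)" in the comment above.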
Anitha
15 Mar 2014
Greg Heath
15 Mar 2014
[ I N ] = [ 18 234]
[ O N ] = [ 13 234]
The default trn/val/tst split for trainbr is 0.8/0.0/0.2. The resulting number of training equations is
Ntrn = N - round(0.2*N) % 187
Ntrneq = Ntrn*O % 2431
With H=30 hidden nodes, the number of unknown weights is
Nw = (I+1)*H+(H+1)*O % 973
The ratio is
r = Ntrneq/Nw % ~2.5
which should be OK for trainbr.
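Greg's bookkeeping can be reproduced with a few lines of arithmetic (sketched here in Python; I, O, N, and H are the values from the thread):

```python
# Reproducing the equation/weight counts for the poster's net.
I, O, N = 18, 13, 234             # inputs, outputs, total examples
H = 30                            # hidden nodes

Ntrn = N - round(0.2 * N)         # trainbr default 0.8/0.0/0.2 split -> 187 training examples
Ntrneq = Ntrn * O                 # training equations: 187 * 13 = 2431
Nw = (I + 1) * H + (H + 1) * O    # unknown weights incl. biases: 570 + 403 = 973
r = Ntrneq / Nw                   # ~2.5 equations per unknown weight
print(Ntrn, Ntrneq, Nw, round(r, 2))
```

A ratio comfortably above 1 means the training set overdetermines the weights, which is why Greg considers H=30 acceptable here.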
I suggest making multiple designs (20?) in a loop with different mixes of training examples, testing examples and initial weights. For examples of multiple designs in a loop, search using
greg Ntrials
Post your code if you still have problems.
Greg Heath
19 Mar 2014
Two mistakes:
1. No configure statement in the loop
2. Used net instead of bestnet in the last train statement
Greg Heath
16 Mar 2014
1. Not necessary to specify default process functions.
2. How did you know my birthdate is 4151941?
3. You are reusing the same net for each trial without using CONFIGURE.
Therefore, the initial weights of each trial are the final weights of the last trial.
I suspect that if the design results are not monotonically better, it is because TRAIN is using a new trn/tst division.
4. Use configure after the RNG initialization.
5. An alternate approach is to CONTINUALLY save one or all of
a. the best current RNG state
b. the best current net
c. the best current Wb = getwb(net)
6. I think you should do all three at the same time and compare results.
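The loop Greg describes (fresh initialization per trial, plus saving whatever is needed to reproduce the best design) is language-agnostic. A minimal sketch in Python, where `train_once` is a hypothetical stand-in for the real training call:

```python
import random

# Sketch of the multi-trial pattern: reinitialize weights each trial
# (the role MATLAB's configure plays), record the seed, keep the best.
def train_once(seed):
    rng = random.Random(seed)      # fresh init per trial, not last trial's weights
    return rng.uniform(0.5, 0.9)   # stand-in for a real test accuracy

best = {"seed": None, "acc": -1.0}
for seed in range(20):             # ~20 designs, as suggested earlier
    acc = train_once(seed)
    if acc > best["acc"]:
        # Save everything needed to reproduce the winner: the seed
        # (RNG state) and, in a real run, the trained net / getwb(net).
        best = {"seed": seed, "acc": acc}

print(best["seed"], round(best["acc"], 3))
```

Reinitializing inside the loop is the key point: without it, each trial silently continues from the previous trial's final weights.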
Hope this helps.
Thank you for formally accepting my answer
Greg
2 Comments
Anitha
17 Mar 2014
Greg Heath
19 Mar 2014
The second train statement contains net instead of bestnet
Anitha
19 Mar 2014
0 votes
1 Comment
Greg Heath
20 Mar 2014
See my second post in the MATLAB Central thread.