Testing Neural Network on new data
I am quite new to MATLAB and extremely new to neural networks. I have created a feed-forward neural network using newff(). The input matrix is 486x1200 and the output matrix is 6x1200. When I simulate the trained network with sim() on the training data, I get the correct output. But when I try to simulate on any other input with fewer rows, I get an error saying the input matrix must have 486 rows (the same as the training matrix).
Can anyone help me resolve this so the network generalizes to inputs of any size? If I pad the test matrix with extra zeros to make it 486 rows, it gives incorrect output, even though the test matrix is part of the training matrix.
Here is my code:
[Pr,Pc] = size(PP);    % Pr = 486 input variables, Pc = 1200 samples
TP = eye(6,1200);      % target matrix, 6 outputs x 1200 samples
[Tr,Tc] = size(TP);
L1 = 0.5*Pc;           % number of hidden nodes
net0to9 = newff(minmax(PP),[L1 Tr],{'logsig' 'logsig'},'traingdx');
net0to9.performFcn = 'mse';
net0to9.trainParam.goal = 0;
net0to9.trainParam.show = 20;
net0to9.trainParam.epochs = 1200;
[net0to9,Tr] = train(net0to9,PP,TP);
x = sim(net0to9,PP);   % simulating on the training inputs works fine
I'd be really thankful if anyone could help.
1 Comment
Greg Heath
4 Nov 2011
Once a net with node topology I-H-O is trained, it should only be used with input matrices for which
I = size(input,1)
An input of size
[I Ntst] = size(input)
will yield an output matrix of size
[O Ntst] = size(output)
Hope this helps.
Greg
P.S. It is assumed that nontraining data has
the same statistical characteristics as the training data.
Accepted Answer
Walter Roberson
3 Nov 2011
For newff() the samples run down the columns!
Note the documentation says:
P: R x Q1 matrix of Q1 sample R-element input vectors
So P(:,1) is the first sample, P(:,2) is the second sample, and so on.
Therefore, any data you test against must have Pr rows or else the samples will be incomplete.
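A minimal sketch of that point, assuming the net0to9 and PP from the question are in the workspace: the number of columns (samples) may vary, but every test matrix must keep all 486 rows.
% Sketch (assumes net0to9 and PP from the question are in the workspace)
Ptest = PP(:, 1:100);        % first 100 samples: still 486 rows, just fewer columns
Ytest = sim(net0to9, Ptest); % works, returns a 6 x 100 output, one column per sample
Pbad  = PP(1:100, :);        % 100 rows: this drops input variables, not samples
% sim(net0to9, Pbad)         % this is the call that errors about needing 486 rows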
If you intend that your samples are incomplete, then in order for a meaningful computation to take place, you would somehow have to indicate the mapping between the inputs you have available and the original input positions. I do not know if there is any mechanism for that at all, but if I were designing such a mechanism, the way I would probably expect the user to indicate missing inputs would be by putting NaN in that input location.
I do not know much about NN, but I have seen several people run into this same problem of not noticing that the samples run down the columns.
More Answers (1)
Greg Heath
4 Nov 2011
>Here is my code:
>[Pr,Pc] = size(PP);% Pr=486, Pc=1200
Pr is probably much larger than necessary. Try input variable subset selection
using PCA (regression) or PLS (classification).
Also, standardize the inputs to have zero mean and unit variance.
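One possible reading of that advice, as a sketch (the use of svd for PCA and the retained dimension kDim = 50 are illustrative assumptions, not part of Greg's answer):
% Sketch: standardize the 486 input variables (rows), then reduce them with PCA
mu = mean(PP,2);                        % per-variable mean
sg = std(PP,0,2); sg(sg == 0) = 1;      % per-variable std, guarding constant rows
Pz = (PP - repmat(mu,1,Pc)) ./ repmat(sg,1,Pc);
[U,S,V] = svd(Pz,'econ');               % principal directions of the standardized inputs
kDim = 50;                              % illustrative number of components to keep
Pred = U(:,1:kDim)' * Pz;               % kDim x 1200 reduced inputs for newff/train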
>TP=eye(6,1200);
This makes no sense because
eye(6,1200) = [ eye(6), zeros(6, 1194) ]
Please explain your output. Is the problem regression or classification?
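If this is meant to be a 6-class classification problem (an assumption; the post does not say), the targets are usually built from a class label vector rather than eye(), for example:
% Sketch: 1-of-6 target coding from a hypothetical 1 x 1200 label vector
% 'labels' with values 1..6 (labels does not appear in the original post)
TP = full(ind2vec(labels));   % 6 x 1200, one 1 per column marking that sample's class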
>[Tr,Tc] = size(TP);
>L1 = 0.5*Pc;
L1 = 600 is a ridiculous number of hidden nodes.
For an I-H-O net topology with [I N] = size(PP) and [O N] = size(TP), there are Neq = N*O = 1200*6 = 7200 training equations to estimate Nw = (I+1)*H+(H+1)*O = O+(I+O+1)*H weights. For accurate estimates, it is desired that Neq >> Nw or
H << (Neq-O)/(I+O+1) = 7194/493 = 14.6
Search the newsgroup using
heath Neq Nw
for details.
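A small worked version of that bookkeeping (taking H at the bound with floor() is only an illustration; Greg asks for H well below it):
% Sketch: the Neq/Nw sizing rule from the paragraph above
I = 486; O = 6; N = 1200;        % input dim, output dim, number of samples
Neq = N*O;                       % 7200 training equations
Hub = (Neq - O)/(I + O + 1);     % 14.6, upper bound on the hidden layer size H
H   = floor(Hub);                % 14, and H should really be well below Hub
Nw  = O + (I + O + 1)*H;         % 6908 weights estimated from 7200 equations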
>net0to9 = newff(minmax(PP),[L1 Tr],{'logsig' 'logsig'},'traingdx');
>net0to9.performFcn = 'mse';
>net0to9.trainParam.goal = 0;
>net0to9.trainParam.epochs = 1200;
>[net0to9,Tr] = train(net0to9,PP,TP);
>x = sim(net0to9,PP);
1. Why not use as many defaults as is practical? See help newff and doc newff.
2. Why not use a more practical goal? Search the newsgroup using
heath net.trainParam.goal
3. Why not use additional outputs in train to obtain outputs and errors?
Consequently
net0to9 = newff(minmax(PP),[H O],{'tansig' 'logsig'}); % H <= 14, O = 6
net0to9.trainParam.goal = mean(var(TP))/100;           % R^2 ~ 0.99
net0to9.trainParam.show = 20;
[net0to9 tr Y E] = train(net0to9,PP,TP);
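To use the extra train outputs mentioned in point 3, the fit could be checked roughly like this (a sketch; normalizing by the average target variance is my reading of the goal line above, not something spelled out in the answer):
% Sketch: evaluate the fit from the extra outputs Y and E of train above
MSE = mean(E(:).^2);               % mean squared error over all outputs and samples
R2  = 1 - MSE/mean(var(TP',1));    % roughly 0.99 if the goal above is reached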
>I'd be really thankful if anyone could help.
Hope this helps.
Greg