Error: horzcat CAT arguments dimensions are not consistent. with feedforwardnet and newff

My Code Is:

A = {MinF; MaxF; MeanF; ModeF; MedianF; SDF; EnergyF; KurtosisF; SkewnessF; EntropyF; VarianceF; ZCRF; MeanPowerF; SNRF; CoVF};
X = cell2mat(A);  
B = {MinS; MaxS; MeanS; ModeS; MedianS; SDS; EnergyS; KurtosisS; SkewnessS; EntropyS; VarianceS; ZCRS; MeanPowerS; SNRS; CoVS};
Y = cell2mat(B);
C = {MinZ; MaxZ; MeanZ; ModeZ; MedianZ; SDZ; EnergyZ; KurtosisZ; SkewnessZ; EntropyZ; VarianceZ; ZCRZ; MeanPowerZ; SNRZ; CoVZ};
Z = cell2mat(C);
P = [X Y Z];
% define targets
T = [repmat(a,1,length(X)) repmat(b,1,length(Y)) ...
     repmat(c,1,length(Z))];
net = feedforwardnet([P,T,3]);
% train net
net.divideParam.trainRatio = 1; % training set [%]
net.divideParam.valRatio = 0; % validation set [%]
net.divideParam.testRatio = 0; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,P,T);
% show network
view(net)

It produces this error message:

??? Error using ==> horzcat
CAT arguments dimensions are not consistent.

I replaced feedforwardnet with newff, but the same error message appeared.

I also tried solving this problem with:

net = feedforwardnet(3);
net = train(net,P,T);
view(net)

But it also leads to an error:

??? Undefined function or method 'feedforwardnet' for input arguments
of type 'double'.

How should I complete my training? Please help.

Thanks in advance.

1 Comment

  • Do you use the debugging features?
  • Set >> dbstop if error, run your function, and inspect the data that cause the error.
  • Click the Help link above the edit box in which you write your question.


Accepted Answer

feedforwardnet requires the Neural Network Toolbox from R2010b or later.

3 Comments

Your P looks like it will be a 15 x 3 array.
We cannot tell what size your T is, as we do not know what a, b, and c are in your construction of T. We can deduce that it is probably something-by-45 (15+15+15).
The value 3 is size 1 x 1.
You try to [] together the 15 x 3, the something x 45, and the 1 x 1. That is not going to be valid. If you were to replicate the scalar 3 value to be 15 x 1, then potentially the [] could work, if your a and b and c all happen to be 15 x 1: in that case you would be doing [15 x 3, 15 x 45, 15 x 1] which would be valid and would result in 15 x 49.
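As a quick sanity check (assuming X, Y, and Z are the matrices built in the question), horzcat only succeeds when every operand has the same number of rows:

```matlab
% Every operand of [X Y Z] must agree in the first dimension
size(X)          % expected 15 x nX
size(Y)
size(Z)
% This must hold before P = [X Y Z] can work:
isequal(size(X,1), size(Y,1), size(Z,1))
```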
However, the syntax for feedforwardnet is feedforwardnet(hiddenSizes,trainFcn) where hiddenSizes has to be a row vector, not a 2D array.
It appears to me that you are attempting to use the feedforwardnet() call to pass in the training and target information right at the beginning. That is not how a feedforwardnet is constructed.
Your second attempt
net = feedforwardnet(3);
net = train(net,P,T);
looks more plausible. You report that it complained of no feedforwardnet() for type double. That would suggest you are using R2010a or earlier, or else that you do not have Neural Network Toolbox installed, or else that you do not have it licensed.
Note: if you are using a Student license, then you probably forgot to tell the installer to install Neural Network Toolbox.
Thanks a lot for your assistance; many things are clear now, but can you please help me understand:
Since the 3 in the syntax is the hidden layer size, is it possible to define the hidden layer for a neural network in a 15x1 matrix format? As far as I understand, 3 represents the number of hidden neurons in the network, so what would a 15x1 matrix represent?
Also, I am working on R2010a, so I suppose I should continue with newff, since feedforwardnet is not available?
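For what it's worth, a minimal newff-based sketch for R2010a (assuming P is an inputs-by-samples matrix and T a matching targets-by-samples matrix; the 3 is the neuron count of the single hidden layer) might look like:

```matlab
% R2010a-era syntax: newff infers input/output ranges from P and T
net = newff(P, T, 3);             % one hidden layer with 3 neurons
net.divideParam.trainRatio = 1;   % use every sample for training
net.divideParam.valRatio   = 0;
net.divideParam.testRatio  = 0;
[net, tr] = train(net, P, T);
view(net)
```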


More Answers (1)

Greg Heath on 9 Apr 2017
Lowercase a, b, and c used in repmat are undefined.
Hope this helps.
Thank you for formally accepting my answer
Greg

2 Comments

Use the command
dir
to make sure that the dimensions of all variables are really what you think they should be.
Hope this helps.
Greg
dir? That would show a list of files.
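(For reference, the MATLAB command that lists workspace variables together with their dimensions is whos, not dir:)

```matlab
% whos reports the size and class of workspace variables
whos P T X Y Z
```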

