Invalid training data. X and Y must have the same number of observations.
Good morning,
I'm trying to train a neural network, but I get the error "Invalid training data. X and Y must have the same number of observations." and I cannot work out where the problem is.
XTrain is a 24x4x1x1321 double and loadYTrain is a 1321x1 double (I have also tried training with its transpose, 1x1321).
Can you please help me out? It's very important.
Thank you very much to all in advance.
Here is my code:
layers = [ ...
    sequenceInputLayer([24 4 1],'Name','input')
    sequenceFoldingLayer('Name','fold')
    convolution2dLayer(3,32,'Padding','same','Name','2d1')
    batchNormalizationLayer('Name','bn1')
    reluLayer('Name','relu1')
    maxPooling2dLayer(2,'Name','pool1')
    convolution2dLayer(3,64,'Padding','same','Name','2d2')
    reluLayer('Name','relu2')
    batchNormalizationLayer('Name','bn2')
    maxPooling2dLayer(2,'Name','pool2')
    convolution2dLayer(3,128,'Padding','same','Name','2d3')
    reluLayer('Name','relu3')
    batchNormalizationLayer('Name','bn3')
    maxPooling2dLayer(2,'Name','pool3')
    convolution2dLayer(3,256,'Padding','same','Name','2d4')
    reluLayer('Name','relu4')
    batchNormalizationLayer('Name','bn4')
    sequenceUnfoldingLayer('Name','unfold')
    flattenLayer('Name','flat')
    lstmLayer(4,'Name','lstm1','OutputMode','sequence')
    dropoutLayer(0.5,'Name','drop1')
    lstmLayer(8,'Name','lstm2','OutputMode','sequence')
    dropoutLayer(0.5,'Name','drop2')
    lstmLayer(16,'Name','lstm3','OutputMode','sequence')
    dropoutLayer(0.5,'Name','drop3')
    lstmLayer(32,'Name','lstm4','OutputMode','sequence')
    dropoutLayer(0.5,'Name','drop4')
    fullyConnectedLayer(1,'Name','full')
    reluLayer('Name','relu6')
    regressionLayer('Name','reg')];
maxEpochs = 30;
miniBatchSize = 128;
options = trainingOptions('sgdm', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',maxEpochs, ...
    'InitialLearnRate',1e-3, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropFactor',0.1, ...
    'LearnRateDropPeriod',20, ...
    'Shuffle','every-epoch', ...
    'Verbose',false);
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph,'fold/miniBatchSize','unfold/miniBatchSize');
net = trainNetwork(XTrain,loadYTrain,lgraph,options);
Accepted Answer
Srivardhan Gadila
19 Apr 2020
To train an LSTM regression network on 2-D image sequence data, the input should be an Nx1 cell array, where N is the number of observations. Each cell entry should be an HxWxCxS array, where H = height, W = width, C = channels, and S = sequence length. The responses Y can be an NxR matrix, where N = number of observations and R = number of responses (the output size of the network), for sequence-to-one problems, or an Nx1 cell array of RxS responses for sequence-to-sequence problems.
Refer to the Input Arguments section of trainNetwork: the sequences input argument explains the required format and shape of XTrain, and the Y input argument explains the required format and shape of YTrain.
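As a minimal sketch of the conversion (assuming the 4th dimension of your 24x4x1x1321 XTrain indexes the 1321 observations and each observation is a sequence of length 1), it could look like this:
numObs = size(XTrain,4);
XTrainCell = cell(numObs,1);
for i = 1:numObs
    % each cell entry must be H-by-W-by-C-by-S, here 24-by-4-by-1-by-1
    XTrainCell{i} = XTrain(:,:,:,i);
end
YTrain = loadYTrain(:);  % N-by-R responses, here 1321-by-1, for sequence-to-one regression
net = trainNetwork(XTrainCell,YTrain,lgraph,options);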
The following is a similar question: CNN and LSTM error with input size
Srivardhan Gadila
19 Apr 2020
Refer to Plots and Display and add the name-value pair 'Plots','training-progress' to the trainingOptions call.
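For example, adding that pair to the trainingOptions call from the question (the other name-value pairs omitted here for brevity) would look roughly like this:
options = trainingOptions('sgdm', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',maxEpochs, ...
    'InitialLearnRate',1e-3, ...
    'Plots','training-progress', ...  % opens the training-progress plot during training
    'Verbose',false);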