Error using trainnet (line 46)
50 views (last 30 days)
Dear sir,
My XTrain is a 48941x1 cell array and my TTrain is a 48941x1 categorical array, as shown below.

Why do I get this error?
Error using trainnet (line 46)
Number of observations in predictors (48941) and targets (1) must match. Check that
the data and network are consistent.
layers = [
sequenceInputLayer([30 30 1],'Name','input') % For 2-D image sequence input, InputSize is a vector of three elements [h w c]: image height, width, and number of channels.
convolution2dLayer(5,8,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv1') % Layer names must be unique within a network.
batchNormalizationLayer('Name','bn1') % A batch normalization layer normalizes a mini-batch across all observations for each channel independently, then scales by a learnable factor γ and shifts by a learnable offset β. Placing it between convolutions and nonlinearities speeds up training and reduces sensitivity to initialization.
reluLayer('Name','relu1') % A ReLU layer performs the identity operation on positive inputs and outputs zero for negative inputs.
convolution2dLayer(5,8,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv2')
batchNormalizationLayer('Name','bn2')
reluLayer('Name','relu2')
convolution2dLayer(5,8,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv3')
batchNormalizationLayer('Name','bn3')
reluLayer('Name','relu3')
convolution2dLayer(5,16,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv4')
batchNormalizationLayer('Name','bn4')
reluLayer('Name','relu4')
convolution2dLayer(5,16,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv5')
batchNormalizationLayer('Name','bn5')
reluLayer('Name','relu5')
convolution2dLayer(5,32,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv6')
batchNormalizationLayer('Name','bn6')
reluLayer('Name','relu6')
globalAveragePooling2dLayer('Name','gap1')
fullyConnectedLayer(7,'Name','fc')
softmaxLayer('Name','softmax')];
options = trainingOptions("adam", ...
MaxEpochs=4, ...
InitialLearnRate=0.002,...
MiniBatchSize=128,...
GradientThreshold=1, ...
LearnRateSchedule="piecewise", ...
LearnRateDropPeriod=20, ...
LearnRateDropFactor=0.8, ...
L2Regularization=1e-3,...
Shuffle="every-epoch", ...
Plots="training-progress", ...
ObjectiveMetricName="loss", ...
OutputNetwork="best-validation", ...
ValidationPatience=5, ... % Stop training if the validation loss has not improved for five consecutive validations.
ValidationFrequency=50, ...
Verbose=false, ...
Metrics="accuracy", ...
ValidationData={XValidation,TValidation});
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
3 comments
Walter Roberson
15 Oct 2025, 21:28
In order to test, we would need the corresponding XValidation and TValidation.
Accepted Answer
Matt J
16 Oct 2025, 1:16
Edited: Matt J, 16 Oct 2025, 2:15
It appears that if your XTrain is in cell array form, you need to put your TTrain data in cell form as well:
load('attachedData.mat'); clear ans; whos %Inventory
TTrain=num2cell(TTrain);
options.Plots='none'; %Online environment doesn't support plots
options.Verbose=true;
options.ValidationData={XTrain,TTrain}; %Fake validation data
testPrediction=minibatchpredict(dlnetwork(layers), XTrain(1:3)) %test
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
2 comments
Matt J
16 Oct 2025, 2:15
Edited: Matt J, 16 Oct 2025, 13:32
You are using a sequenceInputLayer, but your training inputs appear to just be 30x30 images. An imageInputLayer might be more appropriate...
load('attachedData.mat');
XTrain=cat(4,XTrain{:});
layers(1)=imageInputLayer([30,30,1],Name="input");
options.Plots='none'; %Online environment doesn't support plots
options.Verbose=true;
options.ValidationData={XTrain,TTrain}; %Fake validation data
testPrediction=minibatchpredict(dlnetwork(layers), XTrain(:,:,:,1:3)) %test
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
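Once training succeeds, the network can be evaluated on held-out data. A minimal sketch, assuming hypothetical XTest (30x30x1xN numeric array, like the concatenated XTrain above) and TTest (Nx1 categorical) variables that are not in the thread:

```matlab
% Hypothetical evaluation step; XTest and TTest are assumed to exist
% in the same format as the converted training data above.
scores = minibatchpredict(net, XTest);            % N-by-7 matrix of class scores
YPred  = scores2label(scores, categories(TTest)); % convert scores to categorical labels
accuracy = mean(YPred(:) == TTest(:))             % fraction classified correctly
```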
More Answers (0)