Training failed: Layer 'classoutput' input size mismatch

Views: 7 (last 30 days)
KAI-YANG WANG on 11 Aug 2021
Commented: KAI-YANG WANG on 5 Sep 2021
I am training a convolutional neural network using Deep Network Designer. When I start training, it fails with: "Training failed: Layer 'classoutput' input size mismatch. Size of input to this layer is different from the expected input size. Inputs to this layer: from layer 'softmax' (output size 32x32x2)". My input data consists of 100,000 samples of size 32x32x2. Could anyone give me a hint to solve this problem? Thank you!
Here is my network structure:
lgraph = layerGraph();  % start from an empty layer graph
% Main branch: input, three conv/avgpool/batchnorm/leakyrelu blocks,
% then a transposed convolution followed by another conv block
tempLayers = [
    imageInputLayer([32 32 2],"Name","imageinput")
    convolution2dLayer([9 9],256,"Name","conv_1_1","Padding","same")
    averagePooling2dLayer([9 9],"Name","avgpool2d_1","Padding","same")
    batchNormalizationLayer("Name","batchnorm_1_1")
    leakyReluLayer(0.01,"Name","leakyrelu_1_1")
    convolution2dLayer([5 5],256,"Name","conv_2_1","Padding","same")
    averagePooling2dLayer([5 5],"Name","avgpool2d_2","Padding","same")
    batchNormalizationLayer("Name","batchnorm_2_1")
    leakyReluLayer(0.01,"Name","leakyrelu_2_1")
    convolution2dLayer([5 5],256,"Name","conv_3_1","Padding","same")
    averagePooling2dLayer([5 5],"Name","avgpool2d_3","Padding","same")
    batchNormalizationLayer("Name","batchnorm_3_1")
    leakyReluLayer(0.01,"Name","leakyrelu_3_1")
    transposedConv2dLayer([5 5],2,"Name","transposed-conv_1","Cropping","same")
    convolution2dLayer([5 5],256,"Name","conv_1_2","Padding","same")
    batchNormalizationLayer("Name","batchnorm_1_2")
    leakyReluLayer(0.01,"Name","leakyrelu_1_2")];
lgraph = addLayers(lgraph,tempLayers);
% Residual branch: further conv/batchnorm/leakyrelu blocks (connected to leakyrelu_1_2 below)
tempLayers = [
    convolution2dLayer([5 5],256,"Name","conv_2_2","Padding","same")
    batchNormalizationLayer("Name","batchnorm_2_2")
    leakyReluLayer(0.01,"Name","leakyrelu_2_2")
    convolution2dLayer([5 5],256,"Name","conv_3_2","Padding","same")
    batchNormalizationLayer("Name","batchnorm_3_2")
    leakyReluLayer(0.01,"Name","leakyrelu_3_2")
    convolution2dLayer([5 5],256,"Name","conv_6","Padding","same")
    batchNormalizationLayer("Name","batchnorm_5")
    leakyReluLayer(0.01,"Name","leakyrelu_5")
    convolution2dLayer([5 5],256,"Name","conv_7","Padding","same")
    batchNormalizationLayer("Name","batchnorm_6")
    leakyReluLayer(0.01,"Name","leakyrelu_6")];
lgraph = addLayers(lgraph,tempLayers);
% Output branch: add the two paths, then transposed convolutions,
% a final convolution, softmax, and the classification output
tempLayers = [
    additionLayer(2,"Name","addition")
    transposedConv2dLayer([5 5],2,"Name","transposed-conv_2","Cropping","same")
    convolution2dLayer([5 5],256,"Name","conv_4","Padding","same")
    batchNormalizationLayer("Name","batchnorm_4")
    leakyReluLayer(0.01,"Name","leakyrelu_4")
    transposedConv2dLayer([5 5],4,"Name","transposed-conv_3","Cropping","same")
    convolution2dLayer([9 9],2,"Name","conv_5","Padding","same")
    softmaxLayer("Name","softmax")
    classificationLayer("Name","output")];
lgraph = addLayers(lgraph,tempLayers);
clear tempLayers;
% leakyrelu_1_2 feeds both the residual branch and the addition layer
lgraph = connectLayers(lgraph,"leakyrelu_1_2","conv_2_2");
lgraph = connectLayers(lgraph,"leakyrelu_1_2","addition/in1");
lgraph = connectLayers(lgraph,"leakyrelu_6","addition/in2");

Accepted Answer

Prateek Rai on 16 Aug 2021
To my understanding, you are trying to train a convolutional neural network using Deep Network Designer, but the input to the 'classoutput' layer does not match its expected size.
The 'classoutput' layer is a classification layer, and the input to a classification layer must have spatial dimensions of size 1 (a 1x1xC activation, where C is the number of classes). In your network the 'softmax' layer outputs a 32x32x2 activation, so the spatial dimensions are still 32x32. You have to modify your neural network so that the spatial dimensions are reduced to 1x1 before the classification layer.
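One possible way to do this, assuming you want a single class label per 32x32x2 input rather than a per-pixel output, is to collapse the spatial dimensions before the softmax with a global average pooling layer and a fully connected layer. Here is a minimal sketch only; the layer names "gap", "fc", and "classoutput", and the choice of 2 classes, are illustrative assumptions, not part of your original network:
% Sketch: collapse the 32x32 spatial dimensions to 1x1 before classifying
tailLayers = [
    globalAveragePooling2dLayer("Name","gap")   % 32x32x2 -> 1x1x2
    fullyConnectedLayer(2,"Name","fc")          % map to 2 class scores
    softmaxLayer("Name","softmax")
    classificationLayer("Name","classoutput")];
% Remove the old tail (the existing softmax and output layers) and reconnect after conv_5
lgraph = removeLayers(lgraph,["softmax","output"]);
lgraph = addLayers(lgraph,tailLayers);
lgraph = connectLayers(lgraph,"conv_5","gap");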
You can refer to the classificationLayer MathWorks documentation page to find more on the classification layer. You can also refer to the analyzeNetwork MathWorks documentation page to analyze the deep learning network architecture.
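For example, running the analyzer on the layer graph before training shows the output size of every layer and flags layers whose input does not match what they expect:
analyzeNetwork(lgraph)   % inspect per-layer activation sizes and reported errors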

More Answers (0)
