Dimension of Weights in a Fully Connected Layer?

1 view (last 30 days)
Jon Mitchell on 1 Apr 2019
I am trying to use class activation maps with my series network for classifying images. My layers:
layers = [
    imageInputLayer([224 224 3])
    convolution2dLayer([8 8],16,'Padding','same','Name','Conv1')
    batchNormalizationLayer
    reluLayer('Name','Relu1')
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer([4 4],25,'Padding','same','Name','Conv2')
    batchNormalizationLayer
    reluLayer('Name','Relu2')
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer([2 2],36,'Padding','same','Name','Conv3')
    batchNormalizationLayer
    reluLayer('Name','Relu3')
    averagePooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(5,'Name','Full')
    softmaxLayer('Name','Soft')
    classificationLayer];
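As a quick sanity check on the per-layer output sizes (including the average pooling output that feeds the fully connected layer), the layer array can be passed to analyzeNetwork:

% Displays activation sizes and learnable parameter sizes for each layer
analyzeNetwork(layers)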
When I read the activations for an example image from the averagePooling2dLayer, I get an activation matrix of 28x28x36, which I interpret as 36 activation maps of size 28x28. However, when I look at the weights in the next fully connected layer, I get a 5 x 28,224 matrix, and 28,224 = 28x28x36. I was expecting one weight per classification for each of the 36 activation maps from the previous layer, but instead there seems to be one weight per classification for each of the 28x28x36 activation 'pixels'. Am I missing something here? I want to see the mapping from the 36 pooling-layer activations to the final 5 classifications, which I would expect to be a 36x5 weight matrix.
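For reference, here is roughly how I am reading these sizes, assuming net is my trained SeriesNetwork and img is one 224x224x3 example image (placeholder names):

% 'net' is the trained network and 'img' a 224x224x3 example image (placeholders)
act = activations(net, img, 13);   % layer 13 is the averagePooling2dLayer
size(act)                          % 28 x 28 x 36

fcW = net.Layers(14).Weights;      % layer 14 is fullyConnectedLayer('Full')
size(fcW)                          % 5 x 28224, and 28*28*36 = 28224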
I can reshape the weights into a [5 28 28 36] matrix. Do I need to take the mean over the two 28x28 spatial dimensions, as in
squeeze(mean(weights,[2 3]))
to get the 36x5 weights?
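In other words, something like the following sketch, where fcW is the 5 x 28224 weight matrix from above:

% Reshape the 5 x 28224 weights back to class x height x width x channel
W = reshape(fcW, [5 28 28 36]);

% Average over the two 28x28 spatial dimensions (vector-dimension mean needs R2018b or later)
Wchan = squeeze(mean(W, [2 3]));   % 5 x 36
Wchan = Wchan.';                   % transpose if a 36 x 5 layout is preferred

(As I understand it, the standard class activation mapping setup assumes a global average pooling layer directly before the fully connected layer, so that there is already exactly one weight per channel per class; with a non-global pooling layer like mine, averaging the weights over the spatial positions would only be an approximation.)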

Answers (0)


Release

R2018b
