How to add batch normalization layers between the Conv. layer and ReLU layer in GoogLeNet? Suggestions to improve accuracy.

How do I add a batch normalization layer to GoogLeNet in MATLAB? The layers are like this:
(screenshot: layers.png)
I want to add a batchNormalizationLayer between the conv. layer and the ReLU layer. This is for an image classification task, so do I need to add a batch normalization layer only once, or after each conv. layer? For replacing a single layer, this piece of code works. But if it needs to be added after each conv. layer, how do I do that?
larray = [batchNormalizationLayer('Name','BN1')
leakyReluLayer('Name','leakyRelu_1','Scale',0.1)];
lgraph = replaceLayer(lgraph,'conv1-relu_7x7',larray);
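To insert a batch normalization layer after every conv. layer rather than replacing one, a sketch like the following may work (untested; it assumes the standard `layerGraph`/`disconnectLayers`/`connectLayers` workflow and re-routes each convolution layer's outgoing connections through a new batch normalization layer):

```matlab
% Sketch: insert a batchNormalizationLayer after every convolution layer
% in the GoogLeNet layer graph.
net = googlenet;
lgraph = layerGraph(net);

% Collect the names of all convolution layers.
convNames = {};
for i = 1:numel(lgraph.Layers)
    if isa(lgraph.Layers(i),'nnet.cnn.layer.Convolution2DLayer')
        convNames{end+1} = lgraph.Layers(i).Name; %#ok<SAGROW>
    end
end

for i = 1:numel(convNames)
    bnName = ['BN_' convNames{i}];
    lgraph = addLayers(lgraph, batchNormalizationLayer('Name',bnName));
    % Find every layer currently fed by this conv layer, then re-route:
    % conv -> BN -> destination. Destinations (including port suffixes
    % such as 'add_1/in1') come from the Connections table.
    idx = strcmp(lgraph.Connections.Source, convNames{i});
    dests = lgraph.Connections.Destination(idx);
    for j = 1:numel(dests)
        lgraph = disconnectLayers(lgraph, convNames{i}, dests{j});
        lgraph = connectLayers(lgraph, bnName, dests{j});
    end
    lgraph = connectLayers(lgraph, convNames{i}, bnName);
end
plot(lgraph)  % visually verify the modified graph
```

Note that inserting new layers invalidates the pretrained batch statistics, so the network should be retrained (or at least fine-tuned) afterwards.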
Accuracy is 65%, and I need to improve it. I used the training options below. Apart from adding batch normalization layers, will changing any parameter values increase the accuracy? Expert suggestions welcome.
miniBatchSize = 10;
valFrequency = floor(numel(augimdsTrain.Files)/miniBatchSize);
options = trainingOptions('sgdm', ...
'MiniBatchSize',miniBatchSize, ...
'MaxEpochs',7, ...
'InitialLearnRate',3e-4, ...
'Shuffle','every-epoch', ...
'ValidationData',augimdsValidation, ...
'ValidationFrequency',valFrequency, ...
'Verbose',false, ...
'Plots','training-progress');
(screenshot: googlenet.png)

Answers (1)

Sourav Bairagya on 14 Feb 2020
To add new layers to a layerGraph object, first add the new layer using the 'addLayers' function. Then use the 'connectLayers' function to connect it within the layerGraph object. You may refer to the documentation for these functions.
To improve accuracy you can try different optimizers, or change the mini-batch size, number of epochs, and learning rate in 'trainingOptions'.
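As a concrete starting point, a sketch of alternative training options along these lines may help (the values are illustrative, not tuned for this dataset; 'adam' and the piecewise learning-rate schedule are standard `trainingOptions` settings):

```matlab
% Sketch: Adam optimizer, larger mini-batch, more epochs, and a
% piecewise learning-rate schedule. Values are illustrative only.
miniBatchSize = 32;   % larger batches give smoother gradient estimates
valFrequency = floor(numel(augimdsTrain.Files)/miniBatchSize);
options = trainingOptions('adam', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',20, ...                  % train longer than 7 epochs
    'InitialLearnRate',1e-4, ...
    'LearnRateSchedule','piecewise', ... % decay the learning rate
    'LearnRateDropFactor',0.1, ...
    'LearnRateDropPeriod',8, ...
    'Shuffle','every-epoch', ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',valFrequency, ...
    'Verbose',false, ...
    'Plots','training-progress');
```

Whether these settings actually improve on 65% depends on the dataset; it is worth changing one option at a time and comparing validation curves.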
1 Comment
Karthik K on 17 Feb 2020
I tried this way.
b1=batchNormalizationLayer('Name','BN1');
b2=batchNormalizationLayer;
lgraph = layerGraph;
lgraph = addLayers(lgraph,BN1);
lgraph = addLayers(lgraph,BN2);
lgraph = connectLayers(lgraph,'BN1','add_1/in1');
lgraph = connectLayers(lgraph,'BN2','add_1/in2');
plot(lgraph)
It gives me an error saying, "Unrecognized function or variable 'BN1'."
Can you show me here how to add a batch normalization layer between the Conv. and ReLU layers at positions 7 & 8 and 9 & 10? That way I will get the idea.
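For reference, the error occurs because the layers were assigned to the variables `b1` and `b2`, but `addLayers` was called with the undefined names `BN1` and `BN2`; also, `layerGraph` with no arguments creates an empty graph rather than starting from GoogLeNet. A corrected sketch (the conv/ReLU layer names are taken from the pretrained GoogLeNet graph):

```matlab
% Corrected sketch: pass the variables that were actually created,
% and start from the pretrained GoogLeNet graph, not an empty one.
b1 = batchNormalizationLayer('Name','BN1');
b2 = batchNormalizationLayer('Name','BN2');  % give each layer a name

net = googlenet;
lgraph = layerGraph(net);

lgraph = addLayers(lgraph,b1);
lgraph = addLayers(lgraph,b2);

% Re-route an existing conv -> relu connection through the new layer,
% e.g. the first convolution and its ReLU:
lgraph = disconnectLayers(lgraph,'conv1-7x7_s2','conv1-relu_7x7');
lgraph = connectLayers(lgraph,'conv1-7x7_s2','BN1');
lgraph = connectLayers(lgraph,'BN1','conv1-relu_7x7');

plot(lgraph)  % check that the graph is still fully connected
```

The same disconnect/connect pattern can then be repeated for `b2` at the next conv/ReLU pair.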
