Why does the mini-batch accuracy graph go down during training?

Accepted Answer

Shishir Singhal on 22 May 2020


Hi,
Mini-batch accuracy should generally increase with the number of epochs.
In your case, there can be multiple reasons behind this:
  • Mini-batch size
  • Learning rate
  • Cost function
  • Network architecture
  • Quality of the data, and more
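For instance, the learning rate and the mini-batch size are often the first two knobs to try. The sketch below is only illustrative (the specific values are assumptions, not a prescription): a smaller initial learning rate makes divergence less likely, and a larger mini-batch gives a smoother, less noisy accuracy curve.

```matlab
% Illustrative sketch only: adjust the two settings that most often
% cause a falling mini-batch accuracy curve. Values are assumptions.
options = trainingOptions('sgdm', ...
    'InitialLearnRate',1e-4, ...   % smaller step size; less likely to diverge
    'MiniBatchSize',8, ...         % larger batch; smoother accuracy estimate
    'Shuffle','every-epoch', ...
    'Plots','training-progress');
```

If accuracy still falls with a conservative learning rate, the problem is more likely in the data labels or the network architecture than in the optimizer settings.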
It would help if you provided more information about the NN model you are using, so we can see whether one of these applies to your case.

1 Comment

Raza Ali on 22 May 2020
Thank you for your reply. Please see the details below:
Network = [
    imageInputLayer([256 256 3],"Name","imageinput")
    convolution2dLayer([3 3],32,"Name","conv_1","BiasLearnRateFactor",2,"Padding","same")
    reluLayer("Name","relu_1")
    batchNormalizationLayer("Name","batchnorm")
    convolution2dLayer([3 3],64,"Name","conv_2","BiasLearnRateFactor",2,"Padding","same")
    reluLayer("Name","relu_2")
    transposedConv2dLayer([3 3],2,"Name","transposed-conv","Cropping","same")
    softmaxLayer("Name","softmax")
    dicePixelClassificationLayer("Name","dice-pixel-class")];
options = trainingOptions('sgdm', ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',10, ...
    'LearnRateDropFactor',0.3, ...
    'Momentum',0.9, ...
    'InitialLearnRate',1e-3, ...
    'L2Regularization',0.005, ...
    'MaxEpochs',30, ...
    'MiniBatchSize',2, ...
    'Shuffle','every-epoch', ...
    'VerboseFrequency',2, ...
    'Plots','training-progress');
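Two things stand out in this setup, offered as assumptions rather than a verified fix: with 'MiniBatchSize' of 2, the batch-normalization statistics (and the plotted per-batch accuracy) are extremely noisy, and the single batchNormalizationLayer sits after the ReLU, whereas the conventional ordering is convolution → batch normalization → ReLU. A reordered sketch:

```matlab
% Sketch (assumption, not a verified fix): conventional ordering places
% batch normalization between each convolution and its ReLU.
Network = [
    imageInputLayer([256 256 3],"Name","imageinput")
    convolution2dLayer([3 3],32,"Name","conv_1","Padding","same")
    batchNormalizationLayer("Name","batchnorm_1")
    reluLayer("Name","relu_1")
    convolution2dLayer([3 3],64,"Name","conv_2","Padding","same")
    batchNormalizationLayer("Name","batchnorm_2")
    reluLayer("Name","relu_2")
    transposedConv2dLayer([3 3],2,"Name","transposed-conv","Cropping","same")
    softmaxLayer("Name","softmax")
    dicePixelClassificationLayer("Name","dice-pixel-class")];
```

Increasing 'MiniBatchSize' (as far as GPU memory allows) would also make both the batch-norm statistics and the accuracy curve less noisy.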


More Answers (0)


Asked: 18 May 2020
Last comment: 22 May 2020
