CNN Training progress plots - Validation accuracy Jumps at last iteration

Views: 1 (last 30 days)
Mariam Ahmed on 30 January 2019
Answered: Kenta on 16 July 2020
Dear colleagues,
I'm training a CNN in MATLAB and I noticed the behavior shown in the figures below. In the training progress plots, the validation accuracy jumps at the very last iteration, regardless of the number of epochs used in training. It is confusing. What could be the reason for that?
Thank you.
#Epochs = 5
[training-progress plot]
#Epochs = 10
[training-progress plot]
Another trial with #Epochs = 10
[training-progress plot]
2 Comments
Don Mathis on 19 February 2019
Can you post your network layers and training options?
Mariam Ahmed on 19 February 2019
Yes, here it is:
NumClasses = 2;
nor = batchNormalizationLayer('Name','BN'); % Batch normalization layer
act1 = leakyReluLayer('Name','LeakyRELU'); % Nonlinear activation layer
p = averagePooling2dLayer([2 2],'Name','POOL');
outputLayer = classificationLayer('Name','Classify');
s1 = softmaxLayer('Name','Softmax');
%---Dropout layer
Dlayer = dropoutLayer(0.35,'Name','drop1');
%% ------------------------------------------------------------------------
%---Define layers' activation functions
%% ------------------------------------------------------------------------
actCONV = act1;
actFC1 = act1;
%% ------------------------------------------------------------------------
%---Start creating the CNN structure
%% ------------------------------------------------------------------------
%----Number of filters in Conv layer
n_f = 6;
%----Number of units in FC1 layer
f1_numO_CH = 7;
%---Input Layer
inputLayer = imageInputLayer([inputSize1 inputSize2 numCH],'Name','Input');
%---1st Layer "Conv1"
c1_numF = n_f;
c1 = convolution2dLayer([f ff],c1_numF,'WeightL2Factor',0.7,'Name','Conv1');
c1.Weights = randn([f ff numCH c1_numF]) * 0.01;
c1.Bias = zeros([1 1 c1_numF]);
%---FC1
f1_numO = f1_numO_CH;
numFinal = c1_numF*ConvOUT;%;*((inputSize + 2*(-f+1))-5);
f1 = fullyConnectedLayer(f1_numO,'Name','FC1');
f1.Weights = randn([f1_numO numFinal]) * 0.01;
f1.Bias = zeros([f1_numO 1]);
%---FC2
f2_numO = NumClasses;
f2 = fullyConnectedLayer(f2_numO,'Name','FC2');
f2.Weights = randn([f2_numO f1_numO]) * 0.01;
f2.Bias = zeros([f2_numO 1]);
%% ------------------------------------------------------------------------
%---Define the ConvNet structure
%% ------------------------------------------------------------------------
convnet = [inputLayer; c1; nor; actCONV; Dlayer; f1; actFC1; f2; s1; outputLayer];
opts = trainingOptions('adam', ...
    'MaxEpochs',3, ...
    'MiniBatchSize',2^8, ...
    'Shuffle','every-epoch', ...
    'InitialLearnRate',0.05, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',1, ...
    'LearnRateDropFactor',0.68, ...
    'ValidationData',{Xtest,Ytest}, ...
    'ValidationPatience',Inf, ...
    'Verbose',false, ...
    'Plots','training-progress');
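For completeness, a minimal sketch of how these layers and options would be put together with trainNetwork, and the trained network checked against the validation set directly (Xtrain and Ytrain are assumed here, analogous to the Xtest and Ytest above):

```matlab
% Sketch only: Xtrain/Ytrain are assumed training data, matching Xtest/Ytest.
[net, info] = trainNetwork(Xtrain, Ytrain, convnet, opts);

% Evaluate the returned network on the validation set;
% this reflects the network's state after training has completed.
YPred = classify(net, Xtest);
finalAcc = mean(YPred == Ytest);
fprintf('Final validation accuracy: %.2f%%\n', 100 * finalAcc);
```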
Thank you.

Accepted Answer

Kenta on 16 July 2020
If your network includes a batch normalization layer, the final validation accuracy can differ from the one shown during the training process. When training finishes, the batch normalization statistics (mean and variance) are finalized over the full training set, whereas the values plotted during training were computed with running mini-batch estimates, so the last point on the plot can jump. Hope it helps!
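To illustrate the point (a sketch added here, not part of the original answer): during training, a batch normalization layer normalizes each activation with the statistics of the current mini-batch, while the finalized network normalizes with statistics computed over the whole training set, so the two outputs, and hence the reported accuracies, generally differ:

```matlab
% Sketch: the same activations normalized with mini-batch vs. finalized statistics.
% gamma (scale), beta (offset) and epsilon stand in for the layer's parameters.
gamma = 1; beta = 0; epsilon = 1e-5;

x = randn(1, 256);        % activations of one channel over the training set
mb = x(1:32);             % one mini-batch drawn from it

% During training: normalize with the mini-batch mean/variance
yTrain = gamma * (mb - mean(mb)) ./ sqrt(var(mb) + epsilon) + beta;

% After finalization: normalize the same values with full-set statistics
yFinal = gamma * (mb - mean(x)) ./ sqrt(var(x) + epsilon) + beta;

% yTrain and yFinal generally differ, which is why the validation
% accuracy can jump once the network is finalized at the last iteration.
```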

More Answers (0)