Different deep learning training behavior between MATLAB 2020a and 2021b

Ashley on 25 Feb 2022
Commented: Ashley on 28 Feb 2022
I have been using this code to train semantic segmentation networks:
function train_deeplab(pth,classes,classNames,sz)
    pthTrain = [pth,'training\'];
    pthVal   = [pth,'validation\'];

    % make training datastore
    Trainim    = [pthTrain,'im\'];
    Trainlabel = [pthTrain,'label\'];
    imdsTrain   = imageDatastore(Trainim);
    pxdsTrain   = pixelLabelDatastore(Trainlabel,classNames,classes);
    pximdsTrain = pixelLabelImageDatastore(imdsTrain,pxdsTrain);
    tbl = countEachLabel(pxdsTrain);

    % make validation datastore
    Valim    = [pthVal,'im\'];
    Vallabel = [pthVal,'label\'];
    imdsVal   = imageDatastore(Valim);
    pxdsVal   = pixelLabelDatastore(Vallabel,classNames,classes);
    pximdsVal = pixelLabelImageDatastore(imdsVal,pxdsVal);

    % set training options
    options = trainingOptions('adam',...
        'MaxEpochs',8,...
        'MiniBatchSize',5,...
        'Shuffle','every-epoch',...
        'ValidationData',pximdsVal,...
        'ValidationPatience',6,...
        'InitialLearnRate',0.0005,...
        'LearnRateSchedule','piecewise',...
        'LearnRateDropPeriod',1,...
        'LearnRateDropFactor',0.75,...
        'ValidationFrequency',128,...
        'ExecutionEnvironment','gpu',...
        'Plots','training-progress',...
        'OutputFcn',@(info)savetrainingplot(info,pth));

    % design network: DeepLab v3+ with class-weighted pixel classification
    numclass     = numel(classes);
    imageFreq    = tbl.PixelCount ./ tbl.ImagePixelCount;
    classWeights = median(imageFreq) ./ imageFreq;
    lgraph  = deeplabv3plusLayers([sz sz 3],numclass,"resnet50");
    pxLayer = pixelClassificationLayer('Name','labels','Classes',tbl.Name,'ClassWeights',classWeights);
    lgraph  = replaceLayer(lgraph,"classification",pxLayer);

    % train
    [net, info] = trainNetwork(pximdsTrain,lgraph,options);
    save([pth,'net.mat'],'net','info');
end

% save a png of training progress when finished
function stop = savetrainingplot(info,pthSave)
    stop = false;
    if info.State == 'done'
        exportapp(findall(groot,'Type','Figure'),[pthSave,'training_process_21.png'])
    end
end
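The class weighting in the script above is median frequency balancing: each class weight is the median class frequency divided by that class's own frequency, so rare classes are up-weighted and common classes down-weighted. As a quick illustration of what those weights look like, here is a small Python/NumPy sketch (the pixel counts are made-up sample values standing in for the table that `countEachLabel` returns):

```python
import numpy as np

def median_frequency_weights(pixel_counts, image_pixel_counts):
    """Median frequency balancing, mirroring the MATLAB lines
    imageFreq = PixelCount ./ ImagePixelCount and
    classWeights = median(imageFreq) ./ imageFreq.

    pixel_counts[c]:       total pixels of class c across the dataset
    image_pixel_counts[c]: total pixels in the images containing class c
    """
    freq = np.asarray(pixel_counts, dtype=float) / np.asarray(image_pixel_counts, dtype=float)
    return np.median(freq) / freq

# Example: a dominant class (90% of pixels), a moderate one, and a rare one.
weights = median_frequency_weights([900, 90, 10], [1000, 1000, 1000])
# Rare classes get weights above 1, common classes below 1.
```

With these sample counts the frequencies are 0.9, 0.09, and 0.01, so the weights come out to 0.1, 1.0, and 9.0.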
Since switching from MATLAB 2020a to 2021b, there is something strange happening with the validation loss. My training and validation accuracy are very similar, but my validation loss is orders of magnitude higher than training loss. Here I include a sample network trained with the code above using identical training & validation datasets in MATLAB 2020a vs 2021b to illustrate the problem.
Trained using MATLAB 2020a (training and validation loss/accuracy are similar):
Trained using MATLAB 2021b (validation loss is much higher than training loss while accuracies remain similar):
I appreciate any help!

Answers (1)

yanqi liu on 26 Feb 2022
Yes sir, maybe use rng('default') or rand('seed', 0) to make the run environment the same.
Comments: 1
Ashley on 28 Feb 2022
Hi Yanqi,
I don't think that's the problem. I tried training in 2021b after presetting the random number generator as you suggested (rng('default'); rand('seed',0)), but the validation loss in 2021b still behaves strangely. My validation accuracy remains similar to my training accuracy, but the validation loss is orders of magnitude higher than the training loss:
I don't think this is a discrepancy between different initializations of the network; it may instead be a difference in how MATLAB 2020a and 2021b compute the loss function.
Thank you
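One mechanism consistent with the symptom described above (identical accuracy, very different loss magnitude) is a change in how a weighted cross-entropy loss is normalized, e.g. dividing the summed weighted loss by the number of pixels versus by the sum of the per-pixel class weights. The sketch below is purely illustrative of that idea, not a claim about MATLAB's actual internals: the same predictions produce the same accuracy under both normalizations, but the loss values differ by a constant factor.

```python
import numpy as np

# Weighted pixel-wise cross-entropy under two different normalizations.
rng = np.random.default_rng(0)

num_pixels, num_classes = 1000, 3
class_weights = np.array([0.1, 1.0, 9.0])           # e.g. median-frequency weights
labels = rng.integers(0, num_classes, num_pixels)   # ground-truth class per pixel
logits = rng.normal(size=(num_pixels, num_classes))
logits[np.arange(num_pixels), labels] += 2.0        # make predictions mostly correct

# Softmax probabilities and per-pixel negative log-likelihood of the true class.
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
nll = -np.log(probs[np.arange(num_pixels), labels])
w = class_weights[labels]                           # weight of each pixel's true class

loss_per_pixel  = np.sum(w * nll) / num_pixels      # normalize by pixel count
loss_per_weight = np.sum(w * nll) / np.sum(w)       # normalize by total weight

# Accuracy depends only on the argmax, so it is identical either way.
accuracy = np.mean(probs.argmax(axis=1) == labels)
```

The two losses differ exactly by the factor sum(w)/num_pixels, which can be large when rare classes carry big weights, while the accuracy is untouched.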
