How to use Nadam optimizer in training deep neural networks
Training_Options = trainingOptions('sgdm', ...
    'MiniBatchSize', 32, ...
    'MaxEpochs', 50, ...
    'InitialLearnRate', 1e-5, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', Resized_Validation_Data, ...
    'ValidationFrequency', 40, ...
    'ExecutionEnvironment', 'gpu', ...
    'Plots', 'training-progress', ...
    'Verbose', false);
Answers (1)
Nayan
5 Apr 2023
Hi,
I assume you want to use the "adam" optimizer in place of "sgdm". Note that trainingOptions does not offer a built-in "nadam" solver; "adam" is the closest available choice. Simply replace the "sgdm" keyword with "adam":
options = trainingOptions("adam", ...
    InitialLearnRate=3e-4, ...
    SquaredGradientDecayFactor=0.99, ...
    MaxEpochs=20, ...
    MiniBatchSize=64, ...
    Plots="training-progress")
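Applied to the setup from the question, the same swap (a sketch, assuming the rest of the original options and the Resized_Validation_Data variable are kept unchanged) would look like:

% Same options as in the question, with 'sgdm' replaced by 'adam'
Training_Options = trainingOptions('adam', ...
    'MiniBatchSize', 32, ...
    'MaxEpochs', 50, ...
    'InitialLearnRate', 1e-5, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', Resized_Validation_Data, ...
    'ValidationFrequency', 40, ...
    'ExecutionEnvironment', 'gpu', ...
    'Plots', 'training-progress', ...
    'Verbose', false);

Adam-specific hyperparameters such as 'GradientDecayFactor' and 'SquaredGradientDecayFactor' can be added to this call if the defaults need tuning.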