Adam Optimizer with feedforward neural networks

Views: 15 (last 30 days)
Manos Kav on 30 Apr 2018
Commented: Bob on 18 Nov 2022
Hello, is there any way to use the Adam optimizer to train a neural network with the "train" function? Or is there a way to use this implementation ( https://www.mathworks.com/matlabcentral/fileexchange/61616-adam-stochastic-gradient-descent-optimization ) to train the network?
Thanks in advance.
2 Comments
Abdelwahab Afifi on 14 Jun 2020
Have you got the answer?
Bob on 18 Nov 2022
Did any of you get the answer?


Answers (1)

Hrishikesh Borate on 19 Jun 2020
Hi,
It’s my understanding that you want to use the Adam optimizer to train a neural network. This can be done with the trainNetwork function by setting the appropriate training options.
For example:
% Load the digit training images and their rotation-angle responses
[XTrain,~,YTrain] = digitTrain4DArrayData;

% Simple regression network: one convolutional layer followed by a
% fully connected output
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(12,25)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer];

% Train with the Adam solver
options = trainingOptions('adam', ...
    'InitialLearnRate',0.001, ...
    'GradientThreshold',1, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(XTrain,YTrain,layers,options);

% Evaluate on the test set
[XTest,~,YTest] = digitTest4DArrayData;
YPred = predict(net,XTest);
rmse = sqrt(mean((YTest - YPred).^2))
For more information, refer to trainNetwork.
1 Comment
Abdelwahab Afifi on 19 Jun 2020
'trainNetwork' is used for deep learning networks. But I think he wants to use the Adam optimizer to train a shallow neural network with the 'train' function.
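For completeness: the shallow-network train function does not offer an Adam solver (its training functions are algorithms such as trainlm, trainscg, and traingdx). One possible workaround, sketched below with assumed layer sizes, learning rate, and toy data, is a custom training loop that updates a small feedforward model with adamupdate and dlarray from Deep Learning Toolbox (R2019b or later):
% Toy regression data (illustrative): approximate y = sin(x)
x = linspace(-pi,pi,256);
X = single(x);
Y = single(sin(x));

% Learnable parameters of a 1-10-1 feedforward model, stored as dlarrays
params.W1 = dlarray(0.1*randn(10,1,'single'));
params.b1 = dlarray(zeros(10,1,'single'));
params.W2 = dlarray(0.1*randn(1,10,'single'));
params.b2 = dlarray(zeros(1,1,'single'));

avgGrad = [];          % Adam state, empty for the first iteration
avgSqGrad = [];
learnRate = 0.01;      % assumed value

for iteration = 1:2000
    [loss,grads] = dlfeval(@modelLoss,params,X,Y);
    [params,avgGrad,avgSqGrad] = adamupdate(params,grads, ...
        avgGrad,avgSqGrad,iteration,learnRate);
end

function [loss,grads] = modelLoss(params,X,Y)
% Forward pass of the small feedforward model and mean-squared-error loss
H = tanh(params.W1*X + params.b1);    % hidden layer, 10 tanh units
YPred = params.W2*H + params.b2;      % linear output
loss = mean((YPred - Y).^2,'all');
grads = dlgradient(loss,params);      % gradients w.r.t. the parameter struct
end
The fmin_adam File Exchange submission linked in the question could in principle be used in a similar hand-rolled loop, but its calling interface should be checked against its own documentation.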

