Can we use a learn rate drop factor with the Adam optimizer in DDPG?

Maha Mosalam on 25 Dec 2021
Answered: Yash on 23 Dec 2024
Can we use a learn rate drop factor with the Adam optimizer in DDPG, to decay the learning rate during training steps? The options only provide OptimizerParameters, which does not contain a learn rate drop factor.

Answers (1)

Yash on 23 Dec 2024
The learning rate for training the actor or critic function approximator can be specified as a positive scalar by setting the LearnRate property within rlOptimizerOptions. Additionally, OptimizerParameters allows for the configuration of sub-parameters such as Momentum, Epsilon, GradientDecayFactor, and SquaredGradientDecayFactor. Specifically for the Adam solver, adjustments can be made to the Epsilon, GradientDecayFactor, and SquaredGradientDecayFactor parameters.
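As a rough illustration, the sketch below sets Adam-related options for a DDPG agent's actor and critic via rlOptimizerOptions and passes them to rlDDPGAgentOptions. The exact property names and name=value syntax assume a recent Reinforcement Learning Toolbox release (roughly R2022a or later), so check the documentation for your version.

% Sketch: configuring Adam options for a DDPG agent's actor and critic
% (assumes Reinforcement Learning Toolbox R2022a or later).
criticOpts = rlOptimizerOptions( ...
    LearnRate=1e-3, ...                  % positive scalar learning rate
    GradientThreshold=1, ...
    Algorithm="adam");
criticOpts.OptimizerParameters.GradientDecayFactor = 0.9;          % beta1
criticOpts.OptimizerParameters.SquaredGradientDecayFactor = 0.999; % beta2
criticOpts.OptimizerParameters.Epsilon = 1e-8;

actorOpts = rlOptimizerOptions(LearnRate=1e-4, Algorithm="adam");

% Pass the optimizer options to the DDPG agent options.
agentOpts = rlDDPGAgentOptions( ...
    ActorOptimizerOptions=actorOpts, ...
    CriticOptimizerOptions=criticOpts);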
Adam (Adaptive Moment Estimation) is an adaptive optimization algorithm. It adapts the effective step size for each parameter based on several hyperparameters: the initial learning rate, the exponential decay rate for the first-moment estimates (GradientDecayFactor), the exponential decay rate for the second-moment estimates (SquaredGradientDecayFactor), and Epsilon. By fine-tuning these hyperparameters, you can indirectly modify the effective learning rate, even though a drop-factor schedule is not exposed directly.
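For intuition only, here is a minimal sketch of a standard Adam update loop on a toy quadratic loss; the toolbox performs these updates internally. Here alpha, beta1, beta2, and epsilon play the roles of LearnRate, GradientDecayFactor, SquaredGradientDecayFactor, and Epsilon.

% Standard Adam updates on a toy loss L(w) = w^2 (sketch, not toolbox code).
alpha = 1e-2; beta1 = 0.9; beta2 = 0.999; epsilon = 1e-8;
w = 5;  m = 0;  v = 0;                       % parameter and moment estimates
for t = 1:200
    g = 2*w;                                 % gradient of L(w) = w^2
    m = beta1*m + (1 - beta1)*g;             % first-moment (mean) estimate
    v = beta2*v + (1 - beta2)*g^2;           % second-moment estimate
    mHat = m/(1 - beta1^t);                  % bias-corrected moments
    vHat = v/(1 - beta2^t);
    w = w - alpha*mHat/(sqrt(vHat) + epsilon);  % per-parameter adaptive step
end
disp(w)   % w approaches 0 as the loss is minimized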
For more detailed information on learning rate schedules for stochastic solvers such as sgdm, adam, and rmsprop, see: https://www.mathworks.com/help/deeplearning/ref/trainingoptions.html#bu59f0q-LearnRateSchedule
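For reference, a piecewise drop schedule in that Deep Learning Toolbox workflow looks roughly like the following; note that this applies to network training with trainingOptions, not directly to DDPG agent training.

% Piecewise learning-rate drop schedule (Deep Learning Toolbox workflow).
opts = trainingOptions("adam", ...
    InitialLearnRate=1e-3, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropFactor=0.5, ...   % multiply the learning rate by 0.5...
    LearnRateDropPeriod=10);       % ...every 10 epochs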

