Use ReLU function for lstmLayer

Views: 3 (last 30 days)
Tomohiro Oka on 26 Jul 2019
Answered: slevin Lee on 21 Oct 2022
I would like to change the StateActivationFunction of lstmLayer to the ReLU function, but only 'tanh' and 'softsign' are supported in the Deep Learning Toolbox.
Is there any solution for changing the activation function, or a way to make a custom lstmLayer with ReLU as the state activation?
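For reference, this is all the layer accepts today (the hidden size 128 is arbitrary):

% 'StateActivationFunction' accepts only 'tanh' (default) or 'softsign';
% passing 'relu' throws a validation error.
layer = lstmLayer(128, 'StateActivationFunction', 'softsign');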

Answers (1)

slevin Lee on 21 Oct 2022
GateActivationFunction — Activation function to apply to the gates
  • 'sigmoid' – Use the sigmoid function σ(x) = (1 + e^(-x))^(-1).
  • 'hard-sigmoid' – Use the hard sigmoid function σ(x) = 0 for x < -2.5, 0.2x + 0.5 for -2.5 ≤ x ≤ 2.5, and 1 for x > 2.5.
No ReLU function.
╮(╯▽╰)╭
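If you really need ReLU as the state activation, one workaround is to implement the recurrence yourself (for example inside a custom layer or a dlnetwork model function). Below is a minimal numeric sketch of a single LSTM time step with ReLU substituted for tanh; the gate order (input, forget, cell candidate, output) and weight shapes follow the lstmLayer documentation, while the sizes H and D and the random weights are placeholders for illustration only.

sig = @(x) 1 ./ (1 + exp(-x));     % gate activation (sigmoid)
act = @(x) max(x, 0);              % state activation: ReLU instead of tanh

H = 4;  D = 3;                     % hidden units, input features (toy sizes)
W = randn(4*H, D);                 % input weights, gate order: i, f, g, o
R = randn(4*H, H);                 % recurrent weights
b = zeros(4*H, 1);                 % bias
x = randn(D, 1);                   % input at one time step
h = zeros(H, 1);                   % initial hidden state
c = zeros(H, 1);                   % initial cell state

z  = W*x + R*h + b;                % pre-activations for all four gates
gi = sig(z(1:H));                  % input gate
gf = sig(z(H+1:2*H));              % forget gate
gc = act(z(2*H+1:3*H));            % cell candidate (ReLU instead of tanh)
go = sig(z(3*H+1:4*H));            % output gate
c  = gf.*c + gi.*gc;               % updated cell state
h  = go.*act(c);                   % updated hidden state (ReLU instead of tanh)

To train such a layer you would wrap this step in a custom layer or a model function using dlarray operations. Note that an unbounded ReLU cell state can make LSTM training unstable, which is likely why the toolbox restricts the state activation to 'tanh' and 'softsign'.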
