Use ReLU function for lstmLayer
I would like to change the StateActivationFunction of lstmLayer to the ReLU function, but only 'tanh' and 'softsign' are supported in Deep Learning Toolbox.
Is there any solution for changing the activation function, or a way to make a custom lstmLayer with ReLU as the StateActivationFunction?
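For reference, this is the full extent of what the built-in layer can be configured to do; a minimal sketch using only the documented options (the 128 hidden units here are an arbitrary example value):

% What lstmLayer supports today -- there is no ReLU choice:
% StateActivationFunction: 'tanh' (default) or 'softsign'
% GateActivationFunction:  'sigmoid' (default) or 'hard-sigmoid'
layer = lstmLayer(128, ...                      % 128 hidden units (arbitrary)
    'StateActivationFunction','softsign', ...
    'GateActivationFunction','hard-sigmoid');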
Answers (1)
slevin Lee
21 October 2022
From the lstmLayer documentation, GateActivationFunction — the activation function to apply to the gates — supports only:
- 'sigmoid' – use the sigmoid function, σ(x) = (1 + e^(−x))^(−1).
- 'hard-sigmoid' – use the hard sigmoid function.
There is no ReLU option.
╮(╯▽╰)╭
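If ReLU is genuinely needed as the state activation, one workaround is to implement the recurrence yourself and train it with a custom training loop, since lstmLayer cannot be configured this way. Below is a minimal, unofficial sketch of a single LSTM time step in which relu replaces tanh in the two places the state activation is applied; the names reluLSTMStep, W, R, b are my own, and the gate ordering follows the lstmLayer documentation, so treat the whole function as an assumption to verify:

function [h,c] = reluLSTMStep(x,h,c,W,R,b)
% One LSTM time step with relu in place of tanh as the state activation,
% i.e. applied to both the cell candidate and the hidden-state update.
% This is a sketch, not a toolbox API.
%
% x : (numFeatures x numObs) dlarray     -- input at this time step
% h : (numHidden  x numObs) dlarray      -- hidden state
% c : (numHidden  x numObs) dlarray      -- cell state
% W : (4*numHidden x numFeatures) matrix -- input weights
% R : (4*numHidden x numHidden) matrix   -- recurrent weights
% b : (4*numHidden x 1) vector           -- bias

numHidden = size(R,2);
gates = W*x + R*h + b;

% MATLAB stacks the gate rows in the order: input, forget, candidate, output.
i = sigmoid(gates(1:numHidden,:));
f = sigmoid(gates(numHidden+1:2*numHidden,:));
g = relu(gates(2*numHidden+1:3*numHidden,:));   % tanh -> relu (candidate)
o = sigmoid(gates(3*numHidden+1:4*numHidden,:));

c = f.*c + i.*g;          % cell-state update
h = o.*relu(c);           % tanh -> relu (hidden-state update)
end

You would call this step in a loop over time inside a model function evaluated with dlfeval, taking gradients with dlgradient, rather than inside a layer graph.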