Feedforward net - how to use LeakyReLU or scaled exponential linear unit for the hidden layers?

Views: 1 (last 30 days)
In a multi-layer shallow network built with feedforwardnet, how can I use different activation functions, such as Leaky ReLU or the scaled exponential linear unit (SELU), in the hidden layers? The only function supported by default for the hidden layers seems to be tansig.
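A sketch of one possible approach, assuming an R2018a-era shallow-network toolbox where each layer's activation is set via the `transferFcn` property. The built-in names used below (`poslin` for ReLU, `tansig`) are documented transfer functions; the `leakyrelu` name, the `+leakyrelu` package layout, and its file contents are illustrative assumptions modeled on how built-in transfer functions like tansig are organized, not a confirmed API:

```matlab
% Two hidden layers of 10 neurons each; default transferFcn is tansig.
net = feedforwardnet([10 10]);

% Built-in alternatives can be assigned per layer, e.g. poslin (ReLU):
net.layers{1}.transferFcn = 'poslin';  % ReLU in hidden layer 1
net.layers{2}.transferFcn = 'poslin';  % ReLU in hidden layer 2

% Leaky ReLU and SELU are not built in. One common workaround is to copy
% an existing transfer function (tansig.m and its +tansig package folder
% on the toolbox path) to a new name, e.g. leakyrelu, and edit the apply
% and derivative files (file names here are assumptions based on the
% tansig package layout):
%   +leakyrelu/apply.m : a = max(0,n) + 0.01*min(0,n);
%   +leakyrelu/da_dn.m : d = (n > 0) + 0.01*(n <= 0);
% then assign it like any built-in name:
%   net.layers{1}.transferFcn = 'leakyrelu';
```

Note that a custom transfer function must supply a consistent derivative, or gradient-based training (e.g. trainlm, trainscg) will not behave correctly.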
Comments: 2
Ihsan Ullah on 3 Apr 2019
Did you get an answer to your question? If you have sorted this out, would you please post the code in the comment section?
Thank you


Answers (0)

Release

R2018a
