Feedforward net - how to use LeakyReLU or scaled exponential linear unit for the hidden layers?
In a multi-layer shallow network built with feedforwardnet, how can I use different activation functions, such as Leaky ReLU or the scaled exponential linear unit (SELU), in the hidden layers? The only transfer function supported by default for the hidden layers seems to be tansig.
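One possible direction (a sketch, not an official answer): feedforwardnet only accepts the shallow-network transfer functions (tansig, logsig, purelin, poslin, satlin, elliotsig), so plain ReLU is available as 'poslin', but Leaky ReLU and SELU are not. Leaky ReLU is available in the Deep Learning Toolbox layer workflow via leakyReluLayer. The layer sizes and slope below are illustrative placeholders, and featureInputLayer requires R2020b or later:

```matlab
% Shallow-network route: ReLU is built in as 'poslin'
net = feedforwardnet([10 10]);
net.layers{1}.transferFcn = 'poslin';  % ReLU in the first hidden layer
net.layers{2}.transferFcn = 'poslin';  % ReLU in the second hidden layer

% Deep Learning Toolbox route: Leaky ReLU as a layer
% (4 inputs, 1 output, and slope 0.01 are example values)
layers = [
    featureInputLayer(4)
    fullyConnectedLayer(10)
    leakyReluLayer(0.01)      % Leaky ReLU with negative slope 0.01
    fullyConnectedLayer(1)
    regressionLayer];
```

For SELU there is no dedicated built-in layer; one would have to define a custom layer (or, in newer releases, wrap the SELU formula in a functionLayer), so treat that part as an exercise rather than settled API.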
2 Comments
Ihsan Ullah
3 Apr 2019
Did you get an answer to your question? If you have sorted this out, would you please post the code in the comment section?
Thank you
Answers (0)