How to use a Leaky ReLU/Softmax function in a hidden layer in a Feedforward Neural Network?
Views: 40 (last 30 days)
Hi,
I am using a feedforward neural network with an input layer, one hidden layer, and an output layer. I want to change the hidden layer's transfer function to a leaky ReLU, but the usual command (shown below for the poslin transfer function) does not work:
net.layers{1}.transferFcn = 'poslin'; % this command works for poslin
Please suggest the command for changing the transfer function of layer 1 to a leaky ReLU. Kindly also suggest the command for changing the output layer's transfer function to a softmax in a feedforward neural network.
Thank you,
Ihsan
Answers (1)
Abhishek Tiwari
10 Jul 2022
Hi,
The following are all of the built-in transfer functions available for shallow (feedforwardnet-style) networks. Note that softmax is in this list, but a leaky ReLU is not:
% compet - Competitive transfer function.
% elliotsig - Elliot sigmoid transfer function.
% hardlim - Positive hard limit transfer function.
% hardlims - Symmetric hard limit transfer function.
% logsig - Logarithmic sigmoid transfer function.
% netinv - Inverse transfer function.
% poslin - Positive linear transfer function.
% purelin - Linear transfer function.
% radbas - Radial basis transfer function.
% radbasn - Radial basis normalized transfer function.
% satlin - Positive saturating linear transfer function.
% satlins - Symmetric saturating linear transfer function.
% softmax - Soft max transfer function.
% tansig - Symmetric sigmoid transfer function.
% tribas - Triangular basis transfer function.
net.layers{1}.transferFcn = 'poslin';
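Since softmax appears in the list above, the output-layer part of the question can be handled with the same command pattern. A minimal sketch, assuming a feedforwardnet with a single hidden layer (so the output layer is layer 2); the hidden-layer size of 10 is a placeholder:
net = feedforwardnet(10);               % 10 hidden neurons (placeholder size)
net.layers{1}.transferFcn = 'poslin';   % hidden layer: positive linear (ReLU)
net.layers{2}.transferFcn = 'softmax';  % output layer: softmax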
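A leaky ReLU, however, is not among the built-in transfer functions, so there is no string that can simply be assigned to transferFcn for it. One possible workaround (a suggestion, not part of the original answer) is to rebuild the network with the layer-based deep learning API, where leakyReluLayer and softmaxLayer are available. A minimal sketch; all sizes below are placeholders:
numFeatures = 8;                        % placeholder input size
numHidden   = 20;                       % placeholder hidden-layer size
numClasses  = 3;                        % placeholder number of classes
layers = [
    featureInputLayer(numFeatures)      % numeric feature input
    fullyConnectedLayer(numHidden)      % hidden-layer weights
    leakyReluLayer(0.01)                % leaky ReLU, scale 0.01 for negative inputs
    fullyConnectedLayer(numClasses)     % output-layer weights
    softmaxLayer                        % softmax output
    classificationLayer];               % classification loss, for use with trainNetwork
A network defined this way is trained with trainNetwork and trainingOptions rather than train.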