Why does MATLAB not provide an activations function for Recurrent Neural Networks?

5 views (last 30 days)
Maad Ebrahim on 11 June 2018
Edited: Stuart Whipp on 11 November 2018
MATLAB has a function called "activations" that returns the activations of a specific layer of a SeriesNetwork: https://www.mathworks.com/help/nnet/ref/activations.html
However, it does not work with RNN sequence networks. So, is there a way to obtain the activations of a specific layer of an RNN/LSTM network?
Thanks in advance.
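For reference, a minimal sketch of the kind of call that works for an image-based SeriesNetwork, and the equivalent call I would like for a sequence network (the variable names convnet, im, lstmNet, seq and the layer names are placeholders):
% Works for an image-based SeriesNetwork:
feat = activations(convnet, im, 'fc7');
% The equivalent call on an LSTM/RNN sequence network is not supported:
% act = activations(lstmNet, seq, 'lstm_1');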
1 Comment
Stuart Whipp on 10 November 2018
Edited: Stuart Whipp on 11 November 2018
Can confirm this works with ReLU, LSTM & BiLSTM layers (also using a custom regression output). As a simple workaround, why not slice your network at the desired layer and then run the predict command? There is no weight update, so the result should be identical to extracting the activations from that layer. Copy and paste this into a .m file, hope it helps :)
function activations = myActivations(net, data, layer_no)
% Return the activations of layer layer_no by slicing the network at that
% layer and running predict on the truncated network.
if ~isnumeric(layer_no)
    warning('layer_no (3rd argument) should be an integer, representing the index of the layer to activate')
elseif or(layer_no > (size(net.Layers,1)-1), layer_no < 2)
    warning(['layer_no exceeds network size, select a number between 2 and ', num2str(size(net.Layers,1)-1)])
end
if string(class(net.Layers(size(net.Layers,1)))) == "nnet.cnn.layer.RegressionOutputLayer"
    % Pretty straightforward when it is a regression network: keep layers
    % 1..layer_no and reuse the existing regression output layer.
    net_new = net.Layers([1:layer_no, size(net.Layers,1)]);
    net_new = SeriesNetwork(net_new);
elseif layer_no == (size(net.Layers,1)-1)
    warning(['layer_no exceeds network size, select a number between 2 and ', num2str(size(net.Layers,1)-2), '. For softmax, use multiple output arguments with classify()'])
else
    % We have to cut off the classification output and replace it with a
    % regression layer to convert the layers back to a 'valid system' for
    % the predict command.
    net_cut = net.Layers(1:layer_no);
    layers = [ ...
        net_cut
        regressionLayer];  % has to be a regression layer in order to be a 'valid system'
    net_new = SeriesNetwork(layers);
end
activations = predict(net_new, data);
end
I've also come across this, though as I understand it you'd need the activations from the preceding layer to view the LSTM layer's outputs (unless it's the first layer in your network, in which case you'd use the net's inputs): https://uk.mathworks.com/matlabcentral/fileexchange/64863-getlstmoutput
My method above should work from the neural net's inputs regardless of which layer's output you want, since it feeds forward through all layers (apart from the ones after layer_no, which are discarded).
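For example, a usage sketch (assuming a trained sequence network net and a set of test sequences XTest, both placeholder names):
% Extract the feed-forward output of layer 2 (e.g. an LSTM layer) for XTest:
layerActs = myActivations(net, XTest, 2);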


Answers (0)
