Hidden Layer Activations in NN Toolbox

I'm looking for a non-manual way to compute the layer activations of an arbitrary neural network created with the Neural Network Toolbox.
Consider the following example for a detailed problem description:
[x,t] = simplefit_dataset; % toy data
net = fitnet(5); % initialize network
net.inputs{1}.processFcns = {}; % don't pre-process data
net.outputs{2}.processFcns = {}; % don't post-process data
net = train(net,x,t); % train network
The output of the network can be obtained using the sim function or manually:
testX = 0.5; % test value
testYnntbx = sim(net,testX) % automatic computation of network output
testYmanual = net.LW{2} ... % manual computation of network output
*(tansig(net.IW{1}*testX+net.b{1})) ...
+net.b{2}
The activations of the neurons in the hidden layer are:
testAmanual = tansig(net.IW{1}*testX+net.b{1})
I'm looking for a way to get the layer activations without manually specifying the equations, analogous to the sim function.

Accepted Answer

Greg Heath on 8 Jun 2013

You can create a second net with no hidden layer, and then make the output layer of the second net the same as the hidden layer of the first net.
net2 = fitnet([]);
will create a net with no hidden layer.
If the target matrix is not 5-dimensional, create a 5-dimensional target so that you can configure the correct topology. If t is 1-dimensional, use
net2 = configure(net2,x, repmat(t,5,1));
Now you can replace the random initial weights of net2 with the hidden weights of net1.
I have not tried this. Therefore, there may be some details for you to work out.
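Putting those steps together, a minimal sketch might look like the following. It is untested, assumes the trained net, x, t, and testX from the question, and assumes that in the zero-hidden-layer net2 the single layer is layer 1 (so its weights sit in net2.IW{1} and its transfer function in net2.layers{1}.transferFcn):

```matlab
% Sketch (untested): a one-layer net whose output layer mimics
% the hidden layer of the original net.
net2 = fitnet([]);                         % net with no hidden layer
net2 = configure(net2, x, repmat(t,5,1));  % 5-dim target -> 5 output neurons
net2.inputs{1}.processFcns  = {};          % match the original: no pre-processing
net2.outputs{1}.processFcns = {};          % ...and no post-processing
net2.layers{1}.transferFcn = 'tansig';     % same transfer fcn as net's hidden layer
net2.IW{1} = net.IW{1};                    % copy hidden-layer weights from net
net2.b{1}  = net.b{1};                     % copy hidden-layer biases
testA = sim(net2, testX);                  % hidden-layer activations of net
```

If this works, testA should match the manually computed tansig(net.IW{1}*testX+net.b{1}) from the question.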
Another approach might be to duplicate net1 (net3 = net1) and then remove the output layer of net3. I can't see how to do this, but it may be possible.
Hope this helps.
Thank you for formally accepting my answer
Greg

1 Comment

Ahmed on 10 Jun 2013
Replicating the reduced network is a viable workaround. However, every change in the original network potentially requires manually re-defining the reduced network, which makes it a serious source of errors.


More Answers (0)


Question asked: 7 Jun 2013

