Hidden Layer Activations in NN Toolbox
I'm looking for a non-manual way to compute the layer activations of an arbitrary neural network created with the Neural Network Toolbox.
Consider the following example for a detailed description of the problem:
[x,t] = simplefit_dataset; % toy data
net = fitnet(5); % initialize network
net.inputs{1}.processFcns = {}; % don't pre-process data
net.outputs{2}.processFcns = {}; % don't post-process data
net = train(net,x,t); % train network
The output of the network can be obtained using the sim function or manually:
testX = 0.5; % test value
testYnntbx = sim(net,testX) % automatic computation of network output
testYmanual = net.LW{2} ... % manual computation of network output
*(tansig(net.IW{1}*testX+net.b{1})) ...
+net.b{2}
The activations of the neurons in the hidden layer are:
testAmanual = tansig(net.IW{1}*testX+net.b{1})
I'm looking for a way to get the layer activations without manually specifying the equations, analogous to the sim function.
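One generic possibility is to read the weights, biases, and transfer functions from the network object and propagate the input layer by layer. The sketch below does this for a simple layer chain (each layer fed only by the previous one, no delays, no input/output processing functions); the helper name `layerActivations` is made up for illustration:

```matlab
function a = layerActivations(net, x)
% Sketch: compute activations of every layer of a simple feedforward net.
% Assumes a plain chain topology and that processFcns have been cleared.
a = cell(net.numLayers, 1);
for i = 1:net.numLayers
    if i == 1
        z = net.IW{1}*x + net.b{1};          % input weights into layer 1
    else
        z = net.LW{i,i-1}*a{i-1} + net.b{i}; % layer weights from layer i-1
    end
    a{i} = feval(net.layers{i}.transferFcn, z); % e.g. tansig, purelin
end
end
```

For the example network above, `a = layerActivations(net, 0.5)` should give the hidden activations in `a{1}` and the network output in `a{2}`.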
Accepted Answer
Greg Heath
8 June 2013
You can create a second net with no hidden layer, then set the output layer of the second net to match the hidden layer of the first net.
net2 = fitnet([]);
will create a net with no hidden layer.
If the target matrix is not 5-dimensional, create a 5-dimensional target so that you can configure the correct topology. If t is 1-dimensional, use
net2 = configure(net2,x, repmat(t,5,1));
Now you can replace the random initial weights of net2 with the hidden weights of net1.
I have not tried this. Therefore, there may be some details for you to work out.
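Putting the steps above together, an equally untested sketch might look like this (the transfer-function assignment is an assumption to match the hidden layer of the original net):

```matlab
% Sketch of the procedure above: build a single-layer net whose output
% layer mirrors the hidden layer of the trained net. Untested, as noted.
net2 = fitnet([]);                        % net with no hidden layer
net2 = configure(net2, x, repmat(t,5,1)); % 5 outputs = 5 hidden neurons
net2.inputs{1}.processFcns  = {};         % match original: no processing
net2.outputs{1}.processFcns = {};
net2.IW{1} = net.IW{1};                   % copy hidden-layer weights
net2.b{1}  = net.b{1};                    % copy hidden-layer biases
net2.layers{1}.transferFcn = 'tansig';    % hidden layer's transfer function
testAnet2 = sim(net2, 0.5)                % hidden activations for testX
```

If this works, `testAnet2` should equal `testAmanual` from the question.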
Another approach might be to duplicate net1 (net3 = net1) and then remove the output layer of net3. I can't see how to do this, but it may be possible.
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (0)