Strange neural network output

minomic on 22 Jun 2015
Commented: minomic on 23 Jun 2015
Hi, I am trying to use the Neural Network Toolbox but I am having trouble calculating the output of a network. Let me explain the problem: I have defined a very simple ANN with one hidden layer and linear activation functions. So if I have an input x, I expect the output of the hidden layer to be
h = w * x + b
where w is the weight matrix and b the bias vector. Then I expect the network output to be
o = w' * h + b'
where w' is the weight matrix between the hidden layer and the output and b' the corresponding bias vector (the prime simply labels the second layer; it is not a transpose).
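In MATLAB terms, for a single input column x, this is the computation I expect (just a sketch; W1, b1, W2, b2 stand for the w, b, w', b' above, since a prime is not a valid variable name):
h = W1*x + b1; % hidden layer: input weights times the input plus the hidden bias
o = W2*h + b2; % output layer: layer weights times the hidden output plus the output bias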
Now the problem is that if I do
o = net(x)
this doesn't happen. Here is my code:
net = feedforwardnet([layer1], 'traincgp'); % one hidden layer of size layer1
net = configure(net, Dtrain, Dtrain);
net.trainParam.epochs = 0; % "train" for 0 epochs so the weights stay as I set them
net.IW{1,1} = weights12; % input -> hidden weights
net.LW{2,1} = my_weights; % hidden -> output weights
net.b{1} = bias12; % hidden biases
for ii = 1:numel(net.layers)
net.layers{ii}.transferFcn = 'purelin'; % linear activations everywhere
end
net = train(net, Dtrain, Dtrain);
As you can see, I am training for 0 epochs since this is just a test, and I am using Dtrain both as input and target because I am training an autoencoder. As I said, the problem is that computing the output by hand as above gives one result, while
output = net(input)
gives another. What should I do to get the same result?

Accepted Answer

Greg Heath on 23 Jun 2015
Edited: Greg Heath on 23 Jun 2015
Just modify the following
close all, clear all, clc, tic
[ x, t ] = simplefit_dataset;
[ I N ] = size(x), [O N ] = size(t)
net = fitnet;
net.input.processFcns = { 'removeconstantrows' }; % drop the default mapminmax input normalization
net.output.processFcns = { 'removeconstantrows' }; % drop the default mapminmax output normalization
rng('default')
net = train(net,x,t);
NMSE1 = mse(t-net(x))/var(t) % 1.7057e-05
IW = net.IW{1,1} % [ 10 1 ]
b1 = net.b{1} % [ 10 1 ]
b2 = net.b{2} % [ 1 1 ]
LW = net.LW{2,1} % [ 1 10 ]
B1 = repmat(b1,I,N) % [ 10 94 ]
B2 = repmat(b2,O,N) % [1 94 ]
y = B2+LW*tanh(B1+IW*x); % manual forward pass: tansig hidden layer, linear output, [ 1 94 ]
NMSE2 = mse(t-y)/var(t) % 1.7057e-05, matches NMSE1
Hope this helps.
Thank you for formally accepting my answer
I will let you figure out how to handle
1. The default mapminmax normalization (a rough sketch of this is included below)
2. Multiple inputs and outputs.
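For point 1, here is a hedged sketch of what handling the default mapminmax processing by hand might look like (not part of the original answer; it assumes a fitnet-style net with one tansig hidden layer, a single input column x, and that the processing settings live in net.inputs{1}.processSettings and net.outputs{2}.processSettings):
inIdx = find(strcmp(net.inputs{1}.processFcns, 'mapminmax')); % locate the mapminmax settings
outIdx = find(strcmp(net.outputs{2}.processFcns, 'mapminmax'));
inSet = net.inputs{1}.processSettings{inIdx};
outSet = net.outputs{2}.processSettings{outIdx};
xn = mapminmax('apply', x, inSet); % normalize the raw input the way train/sim do
yn = net.LW{2,1}*tanh(net.IW{1,1}*xn + net.b{1}) + net.b{2}; % forward pass in normalized space
y = mapminmax('reverse', yn, outSet); % map the output back to target units
y should then match net(x), up to 'removeconstantrows', which is the identity when no rows are constant.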
1 Comment
minomic on 23 Jun 2015
Thank you for your answer. In the meantime I managed to solve the problem by avoiding 'feedforwardnet' and building the network from scratch with the 'network' function. Anyway, I am going to accept this answer since I am sure it works as well.
Cheers,
minomic
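For reference, a minimal sketch of what building such a two-layer linear network directly with 'network' might look like (an assumption, not minomic's actual code; inputSize, hiddenSize and outputSize are hypothetical, while weights12, my_weights and bias12 are the names from the question):
net = network(1, 2); % 1 input, 2 layers, no connections yet
net.inputConnect(1,1) = 1; % the input feeds layer 1
net.layerConnect(2,1) = 1; % layer 1 feeds layer 2
net.outputConnect(2) = 1; % layer 2 is the output layer
net.biasConnect = [1; 1]; % both layers have a bias
net.inputs{1}.size = inputSize;
net.layers{1}.size = hiddenSize;
net.layers{2}.size = outputSize;
net.layers{1}.transferFcn = 'purelin'; % linear activations, as in the question
net.layers{2}.transferFcn = 'purelin';
net.IW{1,1} = weights12; % input -> hidden weights
net.LW{2,1} = my_weights; % hidden -> output weights
net.b{1} = bias12; % hidden biases
net.b{2} = zeros(outputSize, 1); % output biases (set as needed)
With no processing functions attached, net(x) is then exactly net.LW{2,1}*(net.IW{1,1}*x + net.b{1}) + net.b{2}.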
