How to manually calculate a Neural Network output?

Views: 61 (last 30 days)
Jeff Chang on 2 May 2018
Commented: DarZim on 4 Jan 2024
Dear everyone,
I am exploring the Neural Network Toolbox and would like to calculate the output manually by hand. I used one of the examples provided by MATLAB with the following code. Unfortunately, my output is incorrect. Does anyone know why? Thanks.
%% Below is the sample code from MATLAB
[x, y] = crab_dataset;
size(x) % 6 x 200
size(y) % 2 x 200
setdemorandstream(491218382);
net = patternnet(10);
[net, tr_info] = train(net, x, y);
testX = x(:, tr_info.testInd);
testT = y(:, tr_info.testInd);
testY = net(testX);
testIndices = vec2ind(testY);
[error_rate, conf_mat] = confusion(testT, testY);
fprintf('Percentage Correct Classification : %f%%\n', 100*(1 - error_rate));
fprintf('Percentage Incorrect Classification : %f%%\n', 100*error_rate);
%% Manually calculate the output by hand
% nFeatures = 6
% nSamples = 200
% nHiddenNode = 10
% nClass = 2
% input layer => x (6x200)
% hidden layer => h = sigmoid(w1.x + b1)
% = (10x6)(6x200) + (10x1)
% = (10x200)
%
% output layer => yhat = w2.h + b2
% = (2x200)
w1 = net.IW{1}; % (10x6)
w2 = net.LW{2}; % (2x10)
b1 = net.b{1}; % (10x1)
b2 = net.b{2}; % (2x1)
h = sigmoid(w1*x + b1);
yhat = w2*h + b2;
[testY' yhat']
[vec2ind(testY)' vec2ind(yhat)']

Accepted Answer

JESUS DAVID ARIZA ROYETH on 2 May 2018
You missed several normalization parameters. Here is the solution:
[x, y] = crab_dataset;
size(x) % 6 x 200
size(y) % 2 x 200
setdemorandstream(491218382);
net = patternnet(10);
[net, tr_info] = train(net, x, y);
xoffset=net.inputs{1}.processSettings{1}.xoffset;
gain=net.inputs{1}.processSettings{1}.gain;
ymin=net.inputs{1}.processSettings{1}.ymin;
w1 = net.IW{1}; % (10x6)
w2 = net.LW{2}; % (2x10)
b1 = net.b{1}; % (10x1)
b2 = net.b{2};
% Input 1: apply the mapminmax normalization used during training
y1 = bsxfun(@times,bsxfun(@minus,x,xoffset),gain);
y1 = bsxfun(@plus,y1,ymin);
% Layer 1: tansig (tanh) activation
a1 = 2 ./ (1 + exp(-2*(repmat(b1,1,size(x,2)) + w1*y1))) - 1;
% Output layer: softmax (numerically stable form)
n = repmat(b2,1,size(x,2)) + w2*a1;
nmax = max(n,[],1);
n = bsxfun(@minus,n,nmax);
num = exp(n);
den = sum(num,1);
den(den == 0) = 1;
y2 = bsxfun(@rdivide,num,den); % y2 == net(x)
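To sanity-check the manual computation, you can compare y2 with the network's own output; the difference should be near machine precision (a quick check, not part of the original answer):
outputnet = net(x);              % toolbox forward pass
max(max(abs(y2 - outputnet)))    % manual vs. built-in output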
2 Comments
Jeff Chang on 2 May 2018 (edited 1 Oct 2018)
Thank you very much. I realize the three things I missed were (see the sketch after this list):
  1. I should normalize the input to [-1, 1]
  2. I should use the tanh activation (instead of the sigmoid activation) on the hidden layer
  3. I should apply softmax on the output layer
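Putting those three fixes together, here is a minimal sketch of the corrected manual forward pass. It assumes the mapminmax settings sit in net.inputs{1}.processSettings{1} (as in the accepted answer) and uses the toolbox functions mapminmax, tansig, and softmax; on releases before R2016b, replace the implicit expansion with bsxfun/repmat.
ps = net.inputs{1}.processSettings{1};    % mapminmax settings (assumed index, as above)
xn = mapminmax('apply', x, ps);           % 1. normalize inputs to [-1, 1]
h = tansig(net.IW{1}*xn + net.b{1});      % 2. tanh (tansig) hidden layer
yhat = softmax(net.LW{2}*h + net.b{2});   % 3. softmax output layer
max(max(abs(yhat - net(x))))              % should be close to zero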
Sadaf Jabeen on 14 Mar 2022
How can I compute the same with a linear activation function and no bias values?


More Answers (3)

Amir Qolami on 12 Apr 2020 (edited 12 Apr 2020)

In{i} and Out{i} are the inputs and outputs of the i-th layer (the hidden layers and the output layer). There are two rescalings: one applied to the inputs before the first layer and one applied after the output layer.

function output = NET(net,inputs)

    % Transposed weights, biases, and transfer-function names for each layer
    w = cellfun(@transpose,[net.IW{1},net.LW(2:size(net.LW,1)+1:end)],'UniformOutput',false);
    b = cellfun(@transpose,net.b','UniformOutput',false);
    tf = cellfun(@(x)x.transferFcn,net.layers','UniformOutput',false);
    %% mapminmax on inputs
    if strcmp(net.inputs{1}.processFcns{:},'mapminmax')
        xoffset = net.inputs{1}.processSettings{1}.xoffset;
        gain = net.inputs{1}.processSettings{1}.gain;
        ymin = net.inputs{1}.processSettings{1}.ymin;
        In0 = bsxfun(@plus,bsxfun(@times,bsxfun(@minus,inputs,xoffset),gain),ymin);
    else
        In0 = inputs;
    end
    %% forward pass through the layers (samples are rows here)
    In = cell(1,length(w));     Out = In;
    In{1} = In0'*w{1}+b{1};
    Out{1} = feval(tf{1},In{1});
    for i = 2:length(w)
        In{i} = Out{i-1}*w{i}+b{i};
        Out{i} = feval(tf{i},In{i});
    end
    %% reverse mapminmax on outputs
    if strcmp(net.outputs{end}.processFcns{:},'mapminmax')
        gain = net.outputs{end}.processSettings{:}.gain;
        ymin = net.outputs{end}.processSettings{:}.ymin;
        xoffset = net.outputs{end}.processSettings{:}.xoffset;
        output = bsxfun(@plus,bsxfun(@rdivide,bsxfun(@minus,Out{end},ymin),gain),xoffset);
    else
        output = Out{end};
    end

end
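A usage sketch (illustrative); note that Out{end}, and hence the returned output, is arranged samples-by-outputs, i.e. transposed relative to net(inputs):
[x, y] = crab_dataset;
net = patternnet(10);
net = train(net, x, y);
yManual = NET(net, x);    % manual forward pass (one row per sample)
yBuiltin = net(x);        % toolbox forward pass (one column per sample), for comparison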

2 Comments
Soumitra Sitole on 20 Apr 2022 (edited 20 Apr 2022)
Thanks, this also worked for a relatively deep regression network.
DarZim on 4 Jan 2024
Why does this function return 3 values as output instead of 1?



Jeff Chang on 2 May 2018 (edited 2 May 2018)
Besides, is it possible to use deepDreamImage() to visualize the hidden layer in this example?

Shounak Mitra on 8 Oct 2018
Unfortunately, using deepDreamImage() is not possible in this case.
