User-defined function misbehaving...

Views: 1 (last 30 days)
Shivang Patel on 12 March 2015
Edited: Shivang Patel on 12 March 2015
Hi, I'm working with a BPNN (back-propagation neural network), and as reference code I'm using the "ANOOP ACADEMIA" blog's BPNN code.
My problem is that inside the for loop, after the function executes, the values are not updated and the results stay the same. Where is the problem? Everything looks fine to me, so I'm confused about where my mistake is. The code is below:
% Assigning the number of hidden neurons in the hidden layer
m = 2;
errorValue_theshold = 100;
errorValue = errorValue_theshold + 1; % only so the loop starts
delta_V = 0;
delta_W = 0;
[l,b] = size(data);
[n,a] = size(target);
V = rand(l,m); % weight matrix from input to hidden
W = rand(m,n); % weight matrix from hidden to output
count = 0;
itration = 100;
for count = 1:itration
    [errorValue, delta_V, delta_W] = trainNeuralNet(data,target,V,W,delta_V,delta_W);
    count = count + 1;
    fprintf('Error : %f\n', errorValue);
    fprintf('#itration : %d \t', count);
    Error_Mat(count) = errorValue;
    W = W + delta_W;
    V = V + delta_V;
    if errorValue < errorValue_theshold
        fprintf('\n\nFinal Error : %f\n', errorValue);
        fprintf('#totalItration : %d\n', count);
        Error_Mat(count) = errorValue;
    end
end
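For comparison, here is a minimal NumPy translation of the same loop (illustrative only — the matrix sizes, seed, and variable names below are my own assumptions, not the blog's code). It runs the forward/backward pass with the same momentum-style updates (eta = 0.6, alpha = 1) and lets you check whether the weights actually change between iterations:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(data, target, V, W, dV, dW, eta=0.6, alpha=1.0):
    """One forward/backward pass; returns error and momentum-updated deltas."""
    hidden = sigmoid(V.T @ data)           # hidden-layer activations
    out = sigmoid(W.T @ hidden)            # network output
    err = np.sqrt(np.sum((target - out) ** 2))
    d = (target - out) * out * (1 - out)   # output-layer delta
    dW = alpha * dW + eta * (hidden @ d.T)
    d_star = (W @ d) * hidden * (1 - hidden)  # hidden-layer delta
    dV = alpha * dV + eta * (data @ d_star.T)
    return err, dV, dW

# Hypothetical toy data (the real sizes depend on your data/target matrices)
rng = np.random.default_rng(0)
data = rng.random((3, 5))     # 3 inputs, 5 samples
target = rng.random((2, 5))   # 2 outputs, 5 samples
V = rng.random((3, 2))        # input -> hidden weights
W = rng.random((2, 2))        # hidden -> output weights
dV = np.zeros_like(V)
dW = np.zeros_like(W)
V0 = V.copy()                 # snapshot to verify the weights move

for it in range(100):
    err, dV, dW = train_step(data, target, V, W, dV, dW)
    W = W + dW
    V = V + dV
```

If the NumPy version updates its weights but the MATLAB version does not, that points at how the deltas are returned and applied rather than at the math itself.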
The function code is:
function [errorValue, delta_V, delta_W] = trainNeuralNet(Input, Output, V, W, delta_V, delta_W)
Output_of_InputLayer = Input;
Input_of_HiddenLayer = V' * Output_of_InputLayer; % netj
[m,n] = size(Input_of_HiddenLayer);
Output_of_HiddenLayer = 1./(1+exp(-Input_of_HiddenLayer));
Input_of_OutputLayer = W' * Output_of_HiddenLayer;
clear m n;
[m,n] = size(Input_of_OutputLayer);
Output_of_OutputLayer = 1./(1+exp(-Input_of_OutputLayer));
difference = Output - Output_of_OutputLayer;
square = difference.*difference;
errorValue = sqrt(sum(square(:)));
clear m n
[n,a] = size(Output);
for i = 1:n
    for j = 1:a
        d(i,j) = (Output(i,j)-Output_of_OutputLayer(i,j))*Output_of_OutputLayer(i,j)*(1-Output_of_OutputLayer(i,j));
    end
end
Y = Output_of_HiddenLayer * d';
if nargin == 4
    delta_W = zeros(size(W));
    delta_V = zeros(size(V));
end
etta = 0.6; alpha = 1;
delta_W = alpha.*delta_W + etta.*Y;
error = W*d;
clear m n
[m,n] = size(error);
for i = 1:m
    for j = 1:n
        d_star(i,j) = error(i,j)*Output_of_HiddenLayer(i,j)*(1-Output_of_HiddenLayer(i,j));
    end
end
X = Input * d_star';
delta_V = alpha * delta_V + etta * X;
end
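As a side note, the two nested loops in the function only compute element-wise products, so each can be collapsed into a single vectorised expression. A small NumPy check of that equivalence (the matrices here are random stand-ins, not your data):

```python
import numpy as np

rng = np.random.default_rng(1)
Output = rng.random((2, 5))    # stands in for the target matrix
Out_out = rng.random((2, 5))   # stands in for Output_of_OutputLayer

# Element-wise double loop, mirroring the posted function
d_loop = np.empty_like(Output)
for i in range(Output.shape[0]):
    for j in range(Output.shape[1]):
        d_loop[i, j] = ((Output[i, j] - Out_out[i, j])
                        * Out_out[i, j] * (1 - Out_out[i, j]))

# Vectorised equivalent (in MATLAB: d = (Output-Out).*Out.*(1-Out);)
d_vec = (Output - Out_out) * Out_out * (1 - Out_out)

assert np.allclose(d_loop, d_vec)
```

The same collapse applies to the d_star loop; vectorising makes the update rule easier to inspect when debugging why the deltas come back unchanged.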
Thanks in advance :)

Answers (0)
