Deep Learning Custom Layer learning parameters update

Mathieu Chêne on 12 Jan 2022
Commented: Mathieu Chêne on 14 Jan 2022
Hello,
I am working on a deep learning project in which I try to classify data from a CSV file. I tried to use a custom layer, but when I train the network my loss function stays roughly constant, as if the weights were never updated.
Do you know what could be the reason for this behavior?
I am confident the dataset is fine, because when I use a fullyConnectedLayer instead of my custom layer the training works perfectly and testing gives 100% accuracy.
Here are the predict and backward functions from my custom layer, where Weights is a learnable parameter:
function Z = predict(layer, X)
    % Z = predict(layer, X) forwards the input data X through the layer
    % and outputs the result Z. Each column of X is one observation.
    W = layer.Weights;
    numObs = size(X, 2);   % renamed from "numel" to avoid shadowing the built-in
    % Initialize output
    Z = zeros(layer.OutputSize, numObs, "single");
    % Weighted addition
    for k = 1:numObs
        for j = 1:layer.OutputSize
            for i = 1:layer.InputSize
                Z(j,k) = Z(j,k) + W(j,i)*X(i,k);
            end
        end
    end
end
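
For reference, the triple loop above computes an ordinary matrix product, so an equivalent (and typically much faster) predict body can be written in one line. This is only a sketch, assuming the same Weights property and the same column-per-observation layout of X as in the looped version:

function Z = predict(layer, X)
    % Vectorized equivalent of the loop:
    % (OutputSize x InputSize) * (InputSize x numObs) -> (OutputSize x numObs)
    Z = layer.Weights * X;
end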
function [dLdX, dLdWeight] = backward(layer, X, ~, dLdZ, ~)
    % Backward pass: returns the loss gradients with respect to the
    % layer input X and to the learnable parameter Weights.
    W = layer.Weights;
    % Initialization
    dLdWeight = zeros(size(W), "single");
    dLdX = zeros(size(X), "single");
    % Backward operation
    for k = 1:size(X,2)
        for j = 1:layer.OutputSize
            for i = 1:layer.InputSize
                dLdWeight(j,i) = dLdWeight(j,i) + X(i,k)*dLdZ(j,k);
                dLdX(i,k) = dLdX(i,k) + W(j,i)*dLdZ(j,k);
            end
        end
    end
end
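
Similarly, the accumulation loops in backward reduce to two matrix products. Again just a sketch, under the same column-per-observation assumption:

function [dLdX, dLdWeight] = backward(layer, X, ~, dLdZ, ~)
    % dLdWeight(j,i) = sum_k X(i,k)*dLdZ(j,k)  ->  dLdZ * X.'
    % dLdX(i,k)      = sum_j W(j,i)*dLdZ(j,k)  ->  W.' * dLdZ
    dLdWeight = dLdZ * X.';
    dLdX = layer.Weights.' * dLdZ;
end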
Thank you in advance for your future help.
Mathieu

Accepted Answer

yanqi liu on 13 Jan 2022
Yes, sir, maybe add a dropoutLayer or batchNormalizationLayer to the model.
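
For example, a minimal sketch of where such layers could sit in the layer array. Everything here is a placeholder assumption: numFeatures, numClasses, the dropout probability, and myCustomLayer (including its constructor arguments) stand in for the asker's actual values and custom layer class:

layers = [
    featureInputLayer(numFeatures)
    myCustomLayer(numFeatures, numClasses)  % hypothetical constructor for the custom layer
    batchNormalizationLayer                 % suggested addition (either or both)
    dropoutLayer(0.2)                       % suggested addition (either or both)
    softmaxLayer
    classificationLayer];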
1 Comment
Mathieu Chêne on 14 Jan 2022
Thank you for your answer.
I tried it with a dropoutLayer and it seems to work: my accuracy increases and my loss decreases.
Thank you,
Mathieu
