mean squared logarithmic error regression layer

9 views (last 30 days)
VICTOR CATALA on 14 Jun 2019
Commented: Erdem AKAGUNDUZ on 16 Mar 2020
I'm trying to write an MSLE regression layer.
Here is my code:
classdef msleRegressionLayer < nnet.layer.RegressionLayer
    % Custom regression layer with mean-squared-logarithmic-error loss.
    methods
        function layer = msleRegressionLayer(name)
            % layer = msleRegressionLayer(name) creates a
            % mean-squared-logarithmic-error regression layer and specifies
            % the layer name.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = 'Mean squared logarithmic error';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the MSLE loss between
            % the predictions Y and the training targets T.

            % Calculate MSLE.
            R = size(Y,1);
            %meanAbsoluteError = sum(abs(Y-T),3)/R;
            msle = sum((log10((Y+1)/(T+1))).^2,1)/R;

            % Take mean over mini-batch.
            N = size(Y,2);
            loss = sum(msle,2)/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % Returns the derivatives of the MSLE loss with respect to the
            % predictions Y.
            R = size(Y,1);
            N = size(Y,2);
            dLdY = 2*(log10(Y+1)-log10(T+1))./(N*R).*1./(Y+1).*ln(10);
        end
    end
end
In this case, the size of x_train is 1024 x 500000 and the size of Y_train is 1 x 500000.
Any help is welcome.
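For reference, the loss an MSLE layer like this is meant to compute, and its derivative with respect to the predictions, are (using the base-10 logarithm and averaging over both the response dimension R and the mini-batch size N, as in the code above):

$$\text{MSLE} = \frac{1}{NR}\sum_{n=1}^{N}\sum_{r=1}^{R}\bigl(\log_{10}(Y_{rn}+1) - \log_{10}(T_{rn}+1)\bigr)^{2},\qquad \frac{\partial\,\text{MSLE}}{\partial Y_{rn}} = \frac{2\bigl(\log_{10}(Y_{rn}+1) - \log_{10}(T_{rn}+1)\bigr)}{NR\,(Y_{rn}+1)\ln 10}$$

Two details worth checking against the code: the ratio (Y+1)/(T+1) inside the logarithm has to be taken element-wise (./ in MATLAB), and the derivative divides by ln 10 (log(10) in MATLAB; there is no built-in ln function).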
2 Comments
Greg Heath on 20 Jul 2019
How do you prevent Y < -1?
Greg
VICTOR CATALA on 22 Jul 2019
I'm afraid I'm not preventing it in any way. But my inputs and targets are between -1 and +1, and I'm not having problems with this issue.
Is there any way to make the MSLE regression layer robust under any circumstances?
Thanks
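One common guard, sketched below, is to clamp the values away from -1 before taking the logarithm; epsVal and the max-based clamping are illustrative choices, not something from the layer as posted:

    % Sketch: clamp so 1+Y (and 1+T) stay strictly positive before the log.
    epsVal = 1e-6;               % illustrative margin, not from the original code
    Yc = max(Y, -1 + epsVal);    % clamp predictions
    Tc = max(T, -1 + epsVal);    % clamp targets
    msle = sum((log10((Yc+1)./(Tc+1))).^2, 1)/R;   % then compute MSLE as before

Using the same clamped Yc in backwardLoss keeps the forward and backward passes consistent.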


Accepted Answer

VICTOR CATALA on 27 Jun 2019
A new attempt with new code. Can anybody help me, please?
You can find below the error I get when checking it with checkLayer.
Thanks.
classdef msleRegressionLayer < nnet.layer.RegressionLayer
    % Custom regression layer with mean-squared-logarithmic-error loss.
    methods
        function layer = msleRegressionLayer(name)
            % layer = msleRegressionLayer(name) creates a
            % mean-squared-logarithmic-error regression layer and specifies
            % the layer name.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = 'Mean squared logarithmic error';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the MSLE loss between
            % the predictions Y and the training targets T.

            % Calculate MSLE.
            R = size(Y,3);
            %meanAbsoluteError = sum(abs(Y-T),3)/R;
            msle = sum((log10((Y+1)./(T+1))).^2,3)/R;

            % Take mean over mini-batch.
            N = size(Y,4);
            loss = sum(msle)/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % Returns the derivatives of the MSLE loss with respect to the
            % predictions Y.
            R = size(Y,3);
            N = size(Y,4);
            dLdY = 2/(N*R)*(log10(Y+1)-log10(T+1))./(Y+1)*2.3;
        end
    end
end
---------- the error -----------------
validInputSize = [1 1 64];
checkLayer(layer,validInputSize,'ObservationDimension',2);
Skipping GPU tests. No compatible GPU device found.
Running nnet.checklayer.OutputLayerTestCase
.......... ..
================================================================================
Verification failed in nnet.checklayer.OutputLayerTestCase/gradientsAreNumericallyCorrect.
----------------
Test Diagnostic:
----------------
The derivative 'dLdY' for 'backwardLoss' is inconsistent with the numerical gradient. Either 'dLdY' is incorrectly computed, the function is non-differentiable at some input points, or the error tolerance is too small.
---------------------
Framework Diagnostic:
---------------------
IsEqualTo failed.
--> NumericComparator failed.
--> The numeric values are not equal using "isequaln".
--> OrTolerance failed.
--> RelativeTolerance failed.
--> The error was not within relative tolerance.
--> AbsoluteTolerance failed.
--> The error was not within absolute tolerance.
--> Failure table (First 50 out of 128 failed indices):
Index Subscripts Actual Expected Error RelativeError RelativeTolerance AbsoluteTolerance
_____ __________ _____________________ _____________________ _____________________ ________________ _________________ _________________
1 (1,1,1) -0.00806319293483354 -0.00152252182827559 -0.00654067110655795 4.29594570342937 1e-06 1e-06
2 (1,2,1) 0.0173782998288026 0.00328143466576087 0.0140968651630418 4.29594570634949 1e-06 1e-06
3 (1,1,2) -0.000536075337220683 -0.000101223719209941 -0.000434851618010742 4.29594586530502 1e-06 1e-06
4 (1,2,2) -0.97350710635317 -0.183821201907031 -0.789685904446138 4.29594571384386 1e-06 1e-06
5 (1,1,3) 0.00314813776896237 0.000594442987858323 0.00255369478110404 4.29594567227477 1e-06 1e-06
6 (1,2,3) -0.0109473364182703 -0.00206711643882523 -0.00888021997944509 4.29594570129385 1e-06 1e-06
7 (1,1,4) 0.00149895300173575 0.000283037831116026 0.00121591517061973 4.29594576041422 1e-06 1e-06
8 (1,2,4) 0.189706697589761 0.0358211182337341 0.153885579356027 4.29594571425488 1e-06 1e-06
9 (1,1,5) -0.000759470262320229 -0.000143405975454442 -0.000616064286865788 4.29594572271854 1e-06 1e-06
10 (1,2,5) 0.0115914058384099 0.00218873199532239 0.00940267384308754 4.29594571796925 1e-06 1e-06
11 (1,1,6) -0.00429538447148508 -0.000811070335832431 -0.00348431413565265 4.29594571730523 1e-06 1e-06
12 (1,2,6) -0.237408804435658 -0.0448284059673098 -0.192580398468348 4.29594571372409 1e-06 1e-06
13 (1,1,7) 0.00382155634732165 0.000721600361561078 0.00309995598576057 4.29594572133288 1e-06 1e-06
14 (1,2,7) -0.0275850520943908 -0.00520871126418585 -0.022376340830205 4.29594571387756 1e-06 1e-06
15 (1,1,8) 0.00292153684611514 0.000551655363172808 0.00236988148294233 4.29594569571865 1e-06 1e-06
16 (1,2,8) 0.0740510826741266 0.0139825985159458 0.0600684841581808 4.29594571350087 1e-06 1e-06
17 (1,1,9) -0.0104108932409724 -0.00196582326057465 -0.00844506998039772 4.29594569856146 1e-06 1e-06
18 (1,2,9) 0.0132085385178397 0.00249408495291653 0.0107144535649232 4.29594571443684 1e-06 1e-06
19 (1,1,10) -0.00229277978949077 -0.000432931132650673 -0.0018598486568401 4.29594574419481 1e-06 1e-06
20 (1,2,10) -0.515298873789789 -0.0973006336639219 -0.417998240125867 4.29594571366966 1e-06 1e-06
21 (1,1,11) 0.00591756616472338 0.0011173766651645 0.00480018949955888 4.29594571750985 1e-06 1e-06
22 (1,2,11) -0.00225352021569196 -0.000425517999275717 -0.00182800221641624 4.29594569331431 1e-06 1e-06
23 (1,1,12) 0.00286539916887221 0.000541055238305354 0.00232434393056686 4.29594571128627 1e-06 1e-06
24 (1,2,12) 0.0938210921435482 0.0177156446092186 0.0761054475343297 4.29594571426022 1e-06 1e-06
25 (1,1,13) -0.00379892004189501 -0.000717326092262283 -0.00308159394963273 4.29594571126513 1e-06 1e-06
26 (1,2,13) 0.00664367928439937 0.00125448402224708 0.00538919526215228 4.29594571678875 1e-06 1e-06
27 (1,1,14) -0.00344246347234258 -0.000650018646076931 -0.00279244482626565 4.29594572881707 1e-06 1e-06
28 (1,2,14) -0.0302486510493165 -0.0057116618414583 -0.0245369892078582 4.29594571403994 1e-06 1e-06
29 (1,1,15) 0.0048216065051209 0.000910433520793594 0.00391117298432731 4.29594571706682 1e-06 1e-06
30 (1,2,15) -0.0305915745986099 -0.00577641393087521 -0.0248151606677347 4.29594571384445 1e-06 1e-06
31 (1,1,16) 0.00160936632297574 0.000303886484857304 0.00130547983811844 4.29594570068312 1e-06 1e-06
32 (1,2,16) 0.059404660114581 0.0112170069918458 0.0481876531227352 4.29594571508828 1e-06 1e-06
33 (1,1,17) -0.00915464080016408 -0.00172861303325306 -0.00742602776691102 4.2959457229916 1e-06 1e-06
34 (1,2,17) 0.015670205358192 0.00295890596029161 0.0127112993979004 4.2959457206432 1e-06 1e-06
35 (1,1,18) -0.000875767637440754 -0.000165365679492272 -0.000710401957948483 4.29594556820772 1e-06 1e-06
36 (1,2,18) -11.6539102635645 -2.20053431307148 -9.453375950493 4.29594571388351 1e-06 1e-06
37 (1,1,19) 0.00491728204875702 0.000928499326045815 0.0039887827227112 4.29594573826797 1e-06 1e-06
38 (1,2,19) -0.013789474064208 -0.0026037793424202 -0.0111856947217878 4.29594571995902 1e-06 1e-06
39 (1,1,20) 0.00186390528226548 0.000351949469505017 0.00151195581276046 4.29594570745307 1e-06 1e-06
40 (1,2,20) 0.270225796929937 0.051025031512098 0.219200765417839 4.29594571373987 1e-06 1e-06
41 (1,1,21) -0.00117047221113473 -0.00022101287359287 -0.000949459337541863 4.29594585196368 1e-06 1e-06
42 (1,2,21) 0.0138322916427576 0.00261186431676503 0.0112204273259925 4.29594571738312 1e-06 1e-06
43 (1,1,22) -0.00483892489696674 -0.000913703645467027 -0.00392522125149971 4.29594570512345 1e-06 1e-06
44 (1,2,22) -0.128282362013619 -0.0242227486727094 -0.10405961334091 4.2959457139622 1e-06 1e-06
45 (1,1,23) 0.00265318416415 0.000500984018712759 0.00215220014543724 4.2959457089413 1e-06 1e-06
46 (1,2,23) -0.022490715456664 -0.00424677983447767 -0.0182439356221864 4.29594571257783 1e-06 1e-06
47 (1,1,24) 0.00345107124244956 0.000651643999021533 0.00279942724342803 4.29594571212422 1e-06 1e-06
48 (1,2,24) 0.0665220779961111 0.0125609440870582 0.0539611339090529 4.29594571355908 1e-06 1e-06
49 (1,1,25) -0.00867601143506371 -0.00163823647275124 -0.00703777496231247 4.29594571929735 1e-06 1e-06
50 (1,2,25) 0.00943054716364061 0.00178071069472335 0.00764983646891726 4.29594570953353 1e-06 1e-06
Actual Value:
1x2x64 double
Expected Value:
1x2x64 double
------------------
Stack Information:
------------------
In C:\Program Files\MATLAB\R2019a\toolbox\nnet\cnn\+nnet\+checklayer\OutputLayerTestCase.m (OutputLayerTestCase.gradientsAreNumericallyCorrect) at 165
================================================================================
.
Done nnet.checklayer.OutputLayerTestCase
__________
Failure Summary:
Name Failed Incomplete Reason(s)
=================================================================================================================
nnet.checklayer.OutputLayerTestCase/gradientsAreNumericallyCorrect X Failed by verification.
Test Summary:
12 Passed, 1 Failed, 0 Incomplete, 4 Skipped.
Time elapsed: 3.8134 seconds.
>>
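A reading of the failure table: the relative error is essentially the same (about 4.296) for every element, so the analytic dLdY is a constant multiple (about 5.30x) of the numerical gradient. That factor equals 2.3*log(10): the derivative of log10(Y+1) contributes a factor 1/ln(10), so the trailing *2.3 in backwardLoss should be a division by log(10). A corrected backwardLoss along those lines (a sketch; the actual fix applied later in the thread is not shown) would be:

    function dLdY = backwardLoss(layer, Y, T)
        % Derivative of the MSLE loss with respect to the predictions Y.
        R = size(Y,3);
        N = size(Y,4);
        % Divide by log(10) (the natural log of 10, about 2.3026)
        % rather than multiplying by 2.3.
        dLdY = 2/(N*R)*(log10(Y+1) - log10(T+1))./((Y+1)*log(10));
    end

With that change the analytic and numerical gradients should agree within checkLayer's default tolerances.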
3 Comments
VICTOR CATALA on 8 Jul 2019
Thanks Joss. It works now.
Regards
Erdem AKAGUNDUZ on 16 Mar 2020
Hello Victor,
Nice job with the MSLE loss layer, and thanks.
I have a question, actually, and I hope you can help me.
Why do we divide by the mini-batch size (N = size(Y,4)) in the backwardLoss function?
I know the examples in the MATLAB help also do this, but I don't understand it, so I am looking for an answer.
For example:
if the output of the network (that goes into the loss function) is 224x224x1xN,
then we expect the size of dLdY to be the same, 224x224x1xN.
So why do we divide this gradient by N? We did NOT sum over the gradients in the mini-batch dimension. So why average along that dimension?
Thank you very much.
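For what it's worth, the 1/N comes from forwardLoss itself: the loss it returns is a single scalar that has already been averaged over the mini-batch (loss = sum(msle)/N), and backwardLoss returns the derivative of that same scalar with respect to every element of Y. Differentiating an average puts the 1/N into each element:

$$\frac{\partial}{\partial Y_{n}}\left(\frac{1}{N}\sum_{m=1}^{N} f(Y_{m})\right) = \frac{f'(Y_{n})}{N}$$

So dLdY keeps the full 224x224x1xN size, but every element carries the 1/N (and 1/R) factor introduced by the averaging in forwardLoss; it is the loss that is averaged, not the gradient array that is summed.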


More Answers (0)
