Neural network training fails when target values are small. Mapminmax issue?

16 views (last 30 days)
Roberto on 28 Aug 2016
Commented: bah on 24 Oct 2022
When I try to train a network with very small targets, training stops at epoch 0 (i.e., it does not begin at all) because the gradient is already too small. I understand that very small targets could imply a very small gradient, but the mapminmax preprocessing function is active and should map the targets into [-1,1], avoiding exactly this kind of problem. So what's going on?
Here's some code:
First I define a really small sine wave:
in = [0:0.1:10];
out = sin(in)/1e10;
then I create and configure a network
net = fitnet([15]);
net = configure(net,in,out);
The mapminmax function seems to be active and properly configured:
net.outputs{1,2}.processSettings{1,2}
ans =
name: 'mapminmax'
xrows: 1
xmax: 9.9957e-11
xmin: -9.9992e-11
xrange: 1.9995e-10
yrows: 1
ymax: 1
ymin: -1
yrange: 2
no_change: 0
gain: 1.0003e+10
xoffset: -9.9992e-11
but the training fails (it stops at epoch 0):
[net,tr] = train(net,in,out);
tr.stop
ans =
Minimum gradient reached.
tr.num_epochs
ans =
0
The learning completely failed (the plot of the network's output is omitted here).
But if I manually apply mapminmax to the targets, everything works:
net = configure(net,in,mapminmax(out,-1,1));
[net,tr] = train(net,in,mapminmax(out,-1,1));
tr.stop
ans =
Minimum gradient reached.
tr.num_epochs
ans =
377
And the network actually learned the sine function (plot omitted).
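For completeness, the manual-scaling workaround above also needs the inverse mapping at prediction time, since the trained network now outputs values in [-1,1] rather than the original tiny units. A minimal sketch using mapminmax's settings structure (variable names are my own):

```matlab
% Scale the targets by hand and keep the settings needed to invert the map
[outScaled, ps] = mapminmax(out, -1, 1);

net = fitnet(15);
net = configure(net, in, outScaled);
[net, tr] = train(net, in, outScaled);

% The network predicts in [-1,1]; undo the scaling to recover original units
predScaled = net(in);
pred = mapminmax('reverse', predScaled, ps);
```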
Any ideas?
3 comments
bah on 24 Oct 2022
Did MATLAB solve this, or is the issue still present?


Accepted Answer

Greg Heath on 1 Sep 2016
You have to change the defaults for BOTH the MSE goal AND the minimum gradient: they are on the scale of the UNNORMALIZED data. For simple problems I tend to use the average BIASED target variance estimate to get
MSEgoal = mean(var(target',1))/100
MinGrad = MSEgoal/100
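Applied to the tiny sine example from the question, these settings might be used roughly as follows (a sketch; net.trainParam.goal and net.trainParam.min_grad are the standard fitnet/trainlm stopping parameters):

```matlab
in  = 0:0.1:10;
out = sin(in)/1e10;

net = fitnet(15);
net = configure(net, in, out);

% Scale the stopping criteria to the (unnormalized) target variance
MSEgoal = mean(var(out', 1)) / 100;   % biased variance estimate
MinGrad = MSEgoal / 100;

net.trainParam.goal     = MSEgoal;
net.trainParam.min_grad = MinGrad;

[net, tr] = train(net, in, out);
```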
On more serious problems I consider BOTH the UNBIASED mean target variance estimate for O-dimensional targets AND the loss of degrees of freedom caused by using the same data BOTH to estimate the Nw unknown weights AND to estimate the performance:
MSEgoal = 0.01 * max(0, Ndof) * mean(var(target', 0)) / Ntrneq
where
Ntrneq = Ntrn*O % No of training equations
Ndof = Ntrneq - Nw % No of degrees of freedom
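The quantities above might be computed like this (Ntrn, O, and Nw follow Greg's notation; reading Nw from the network's numWeightElements property is my assumption):

```matlab
[O, Ntrn] = size(target);        % O outputs, Ntrn training cases
Nw = net.numWeightElements;      % total number of weights and biases
Ntrneq = Ntrn * O;               % number of training equations
Ndof   = Ntrneq - Nw;            % degrees of freedom left for error estimation

% Goal is zero when there are no degrees of freedom to spare
MSEgoal = 0.01 * max(0, Ndof) * mean(var(target', 0)) / Ntrneq;
```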
For details search both the NEWSGROUP and ANSWERS using
greg MSEgoal MinGrad
Hope this helps
Thank you for formally accepting my answer
Greg
4 comments
Greg Heath on 2 Sep 2016
You are right.
THIS IS A BUG.
I alerted MathWorks before, and whatever fixed value they had for MSEgoal was changed to 0. I don't recall whether they changed MinGrad or not.
Regardless, I use my own MSEgoal and MinGrad because of dissatisfaction with the MATLAB defaults.
By the way, if you are using NARNET or NARXNET, the scale factor should probably be 0.005 or 0.001 instead of 0.01, because closing the loop requires that open-loop performance be exceptionally good.
Greg
Roberto on 2 Sep 2016
Thank you very much for your help. By the way, in R2014a the defaults for the MSE goal and minimum gradient are 0 and 1e-7, respectively. I'll try your formulas for these parameters. Thank you again!


More Answers (0)
