neural network infinite gradient problem, waiting for input, NaN

3 views (last 30 days)
turok on 1 Apr 2012
Edited: Greg Heath on 14 Aug 2016
Hello,
I am trying to build an algorithm that makes many calls to neural network training and saves some results to an array for later processing. In fewer than 10 iterations, the neural network interface reports an infinite gradient, and the program freezes with the message "waiting for input".
I have tried clearing the variables before running, lowering the minimum performance gradient, changing the data set, and trying different training methods, but nothing has worked so far.
I would be glad to hear any suggestions, as this has been driving me crazy for weeks.
The code is something like this (it is for a hybrid GA-backpropagation):
for counter = 1:pop_size
    run_back_propagation;                  % script that trains the net and sets 'performance'
    population(counter,20) = performance;  % store the result for this individual
    clear net tr                           % discard the trained network and training record
end
I am using traingdm and one hidden layer.
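One common cause of the Inf-gradient freeze with traingdm is a learning rate that is too large for the data scale, combined with the training GUI waiting for user input once it stops. Below is a minimal sketch of what `run_back_propagation` might do, with guards added; the variables `x` and `t` (inputs and targets, one sample per column) and the layer size are assumptions, not taken from the original post:

```matlab
% Hypothetical inline version of run_back_propagation with divergence guards.
for counter = 1:pop_size
    net = feedforwardnet(10, 'traingdm');   % one hidden layer (10 neurons assumed)
    net.trainParam.showWindow = false;      % no GUI, so nothing can sit "waiting for input"
    net.trainParam.epochs     = 500;
    net.trainParam.lr         = 0.01;       % a large lr can drive the gradient to Inf

    [net, tr] = train(net, x, t);
    performance = perform(net, t, net(x));

    if ~isfinite(performance)               % mark diverged runs instead of freezing
        performance = Inf;
    end
    population(counter, 20) = performance;
    clear net tr
end
```

Disabling `showWindow` does not fix the divergence itself, but it stops the training window from blocking the loop, so the GA can discard the bad individual and continue.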
3 comments
Islam on 2 May 2012
Actually, I seem to have run into the same problem, and from what I could find there are many others asking basically the same question. It's not exactly clear, but it seems to happen whenever one or more of the matrices used in the inner workings of the training procedure gets NaNs or Infs, which can happen if the input is too large or the search rapidly diverges instead of converging (as when the learning rate is too large). Just a guess, though!
In all cases, the symptom is that MATLAB simply goes into debug mode and displays "waiting for input", which is a confusing combination. Also, when I tried stepping into the code, the problem seemed to occur in some unavailable (non-MATLAB or locked?) part of the code.
Another observation: pressing F5 a few times gets past the "stuck" condition (though of course the training fails), and pressing Shift+F5 terminates the training immediately and returns to the command prompt.
Any help would be greatly appreciated.
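One way to test the guess above before training is to verify that no NaN/Inf ever reaches the solver and to rescale the inputs so that large magnitudes cannot overflow. The matrices `x` and `t` below are hypothetical input/target data (one sample per column):

```matlab
% Sanity-check the data, then rescale it -- a sketch, assuming x and t exist.
assert(all(isfinite(x(:))), 'input contains NaN or Inf');
assert(all(isfinite(t(:))), 'targets contain NaN or Inf');

[xn, xs] = mapminmax(x);   % rescale each input row to [-1, 1]
[tn, ts] = mapminmax(t);   % same for the targets
% Train on xn/tn; map network outputs back with mapminmax('reverse', yn, ts).
```

If the assertions pass and training still diverges, the learning rate (rather than the data) is the more likely culprit.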
Dhruv Thakkar on 8 Aug 2016
Edited: Greg Heath on 14 Aug 2016
I am using a logistic regression backpropagation algorithm to train my network. After some iterations I get NaN. Does this mean the learning rate (lambda) is low and the model is overfitting?
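NaN in logistic-regression backprop usually comes from `log(0)` in the cross-entropy loss when the sigmoid saturates to exactly 0 or 1, which points to a too-large step or exploding weights rather than a low learning rate or overfitting. A minimal sketch of a clamped loss, where `X`, `w`, and `y` are hypothetical design matrix, weights, and 0/1 labels:

```matlab
z = X * w;                         % linear scores
p = 1 ./ (1 + exp(-z));            % sigmoid predictions
eps0 = 1e-12;
p = min(max(p, eps0), 1 - eps0);   % clamp so log() stays finite
loss = -mean(y .* log(p) + (1 - y) .* log(1 - p));
```

If the NaN disappears after clamping, the fix is to lower the learning rate or regularize the weights, not to train longer.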

Answers (0)