Out of memory issue while training a Neural Network (NN): array exceeds maximum array size preference in backpropJacobianStatic

8 views (last 30 days)
Hello, this is my first time asking a question here, so I will try to be brief and clear!
I am currently trying to train a NN with 2 hidden layers of 256 neurons each; my input and output data sets are both of size 22*size(trainSet). This amounts to 77334 weights + biases, which shouldn't be a problem, since I have seen posts where people train much larger NNs. The issue is that when I call the train function, somewhere inside the MATLAB code (in backpropJacobianStatic) there is a matrix multiplication that creates an array of size 77334*77334 (77334 being the number of weights), and that takes all the memory, causing an out-of-memory error (picture of the issue below):
My question is the following: is there a way not to create this matrix of size numberOfWeights*numberOfWeights that takes all the memory during training? I don't really understand why we would need to store this array, since we only need a 1*77334 array to store the weights, no?
Thank you in advance for your answers, and if you have any questions or if I wasn't clear, feel free to ask for more information!
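For reference, a rough back-of-the-envelope sketch of where the 77334 parameters come from and why the default Levenberg-Marquardt step runs out of memory (the variable names below are only illustrative):
% Parameter count for a 22-256-256-22 feedforward network
nIn = 22; nH1 = 256; nH2 = 256; nOut = 22;
nWeights = nIn*nH1 + nH1 ...      % layer 1 weights + biases
         + nH1*nH2 + nH2 ...      % layer 2 weights + biases
         + nH2*nOut + nOut;       % output layer weights + biases -> 77334
% trainlm (Levenberg-Marquardt) forms an nWeights-by-nWeights matrix (roughly J'*J)
bytesNeeded = nWeights^2 * 8;     % double precision
fprintf('Approx. memory for J''*J: %.1f GB\n', bytesNeeded/1e9)   % about 47.8 GB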

Accepted Answer

Timothee Fichot on 21 Jul 2020
Edited: Timothee Fichot on 21 Jul 2020
If someone else has the same issue, here is how I solved it: just don't use the default training method, which is Levenberg-Marquardt backpropagation (trainlm). This method requires computing the Jacobian matrix, which in my case was too large. To avoid that you can use e.g. trainscg (or any other solver that does not require the Jacobian); it only needs the gradient, which is much smaller!
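A minimal sketch of what that looks like, assuming the inputs X and targets T are 22-by-N matrices:
% Use scaled conjugate gradient instead of the default trainlm
net = feedforwardnet([256 256], 'trainscg');
% equivalently: net = feedforwardnet([256 256]); net.trainFcn = 'trainscg';
[net, tr] = train(net, X, T);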

Additional Answers (1)

Greg Heath on 17 Jul 2020
A single hidden layer is sufficient.
Hope this helps
Thank you for formally accepting my answer
Greg
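In code, the one-hidden-layer version would look something like this (a sketch; keeping 256 hidden neurons as in the question, the parameter count drops to 22*256 + 256 + 256*22 + 22 = 11542, so the nWeights-by-nWeights matrix needs only about 1 GB):
% Single hidden layer of 256 neurons (default training function is trainlm)
net = feedforwardnet(256);
[net, tr] = train(net, X, T);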
1 Comment
Timothee Fichot on 17 Jul 2020
Hello Greg,
Indeed, using only one layer will solve my issue, but I'm looking for a way to use 2 layers.
Thank you for your answer though!
