Why doesn't mini-batch size make any difference in the training speed of my neural network?

Views: 2 (last 30 days)
J R on 14 Jul 2018
Commented: Tran Vinh on 9 Dec 2019
I have created a neural network just to run some benchmarks, and it seems that the training time is not affected by the 'MiniBatchSize' option.
I have tried the following code with batch=32 and batch=1000.
I have 1491 data sequences, each of length 10, in the training data.
In both cases training takes 60 seconds (for 500 epochs).
I tried the same architecture in Python using Keras with TensorFlow, and there the results changed significantly with the batch size: 17 s for batch=1000 compared to 140 s for batch=32.
In MATLAB I also get the same training time regardless of the training algorithm ('sgdm' / 'rmsprop' / 'adam').
Why is this happening?
Am I doing something wrong?
inputSize = 10;
lstm_neurons = 100;
maxEpochs = 500;
batch = 32;
learningrate = 0.01;  % not defined in the original post; placeholder value

layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(inputSize)
    lstmLayer(lstm_neurons,'OutputMode','sequence')
    lstmLayer(lstm_neurons,'OutputMode','sequence')
    fullyConnectedLayer(1)
    regressionLayer];

options = trainingOptions('adam', ...
    'ExecutionEnvironment','gpu', ...
    'MaxEpochs',maxEpochs, ...
    'Verbose',0, ...
    'InitialLearnRate',learningrate, ...
    'MiniBatchSize',batch);

temp_nets = trainNetwork(x_train,y_train,layers,options);
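To isolate the effect of batch size, one way to benchmark is to loop over several 'MiniBatchSize' values and time each training run with tic/toc. This is a minimal sketch, assuming x_train, y_train, layers, maxEpochs, and learningrate are already defined as in the code above:

```matlab
% Timing sketch: compare wall-clock training time across batch sizes.
for batch = [32 1000]
    options = trainingOptions('adam', ...
        'ExecutionEnvironment','gpu', ...
        'MaxEpochs',maxEpochs, ...
        'Verbose',0, ...
        'InitialLearnRate',learningrate, ...
        'MiniBatchSize',batch);
    tic;
    trainNetwork(x_train, y_train, layers, options);
    fprintf('MiniBatchSize = %4d: %.1f s\n', batch, toc);
end
```

If the reported times are close for both batch sizes, the bottleneck is likely somewhere other than the per-batch gradient computation (for example, per-epoch overhead or data transfer), rather than the option being ignored.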
1 Comment
Tran Vinh on 9 Dec 2019
Hi J R,
Did you find a solution yet? I have the same issue. If you or anyone has found a solution, please share. :D
Thank you


Answers (0)


Release: R2018a
