Why is the peepholeLSTMLayer implemented in a tutorial much slower than the built-in lstmLayer?

6 views (last 30 days)
Why is this implementation of peepholeLSTMLayer (https://au.mathworks.com/help/deeplearning/ug/define-custom-recurrent-deep-learning-layer.html) so much slower than the built-in lstmLayer?
What can be done to speed it up? For example, can it be compiled into binary code?

Answers (1)

Hiro Yoshino on 31 August 2023
I suppose that is because the implementation of interest is a custom model, while the built-in LSTM is optimized for computation.
MATLAB has kept improving its performance over the years (see this), so I would guess the same applies to the built-in capabilities in MATLAB.
As for speeding things up, you may choose a CPU for the LSTM computation (see Tips).
You can also see this to speed up your custom training.
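For the custom-training-loop case, one concrete option from the function-acceleration documentation is dlaccelerate. A minimal sketch, assuming a custom loss function named modelLoss and the usual dlnetwork training-loop variables (the names net, X, and T here are illustrative):

```matlab
% Wrap the model loss function so repeated dlfeval calls reuse a cached trace.
accfun = dlaccelerate(@modelLoss);

% Clear stale traces whenever modelLoss is edited.
clearCache(accfun)

% Use the accelerated function inside the training loop as usual.
[loss,gradients] = dlfeval(accfun,net,X,T);
```

Note that dlaccelerate only helps when the same function is evaluated repeatedly with inputs of the same size and type, which is the typical pattern in a training loop.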
Hope these help you.
Comments: 1
Artem Lensky on 1 September 2023 (edited 1 September 2023)
Hi Hiro,
Thanks for the prompt reply. Yes, I train the GRU/LSTM and peephole LSTM on a CPU. The peephole layer is not just slower; it is slower by a factor of about 100 compared to the standard LSTM. Luckily, this time I am not using custom training loops; the model is trained with the built-in training function. The model is extremely simple, e.g. one layer with 8 peephole LSTM units. The dimension of the input signals is 5 by (25k~30k).
1  'sequenceInputLayer'  Sequence Input          Sequence input with 5 dimensions
2  'rnn_1'               peepholeLSTMLayer       Peephole LSTM with 8 hidden units
3  'fc'                  Fully Connected         3 fully connected layer
4  'softmax'             Softmax                 softmax
5  'classoutput'         Classification Output   crossentropyex
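For reference, a network like the one listed above can be built with a layer array along these lines (a sketch; it assumes the peepholeLSTMLayer class from the linked tutorial is on the path and accepts the number of hidden units as its first argument with an optional Name argument):

```matlab
layers = [
    sequenceInputLayer(5)                    % 5-dimensional input signals
    peepholeLSTMLayer(8,Name="rnn_1")        % custom layer from the tutorial
    fullyConnectedLayer(3,Name="fc")         % 3 output classes
    softmaxLayer(Name="softmax")
    classificationLayer(Name="classoutput")  % cross-entropy loss
    ];
```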
I ran the profiler (just 13 iterations of training); see below for what I got. Any ideas how I can speed it up? Perhaps updating the tutorial code or compiling it to binary code, e.g. MEX. There must be something; it is just too slow. Thanks again!


Release

R2023a
