What does SequenceLength property in the training options for an LSTM layer really mean and why is it there?
18 views (last 30 days)
Hello,
I am trying to understand the terminology used when creating an lstmLayer in MATLAB. I finally understood that the numHiddenUnits parameter is the number of LSTM "cells", and the higher it is, the "longer" the network is. So, as I see it, with e.g. numHiddenUnits = 100 the network always takes 100 time steps of the data in each training iteration. Following that logic, I cannot find any use for SequenceLength.
0 Comments
Accepted Answer
Ieuan Evans
27 September 2018
Hi,
Indeed, the software "unrolls" the layer to the length given by 'SequenceLength'. The network is stateful, so it also updates the network state between the split sequences.
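To make the splitting concrete, here is an illustrative calculation (the lengths 350 and 100 are assumed for the example, not taken from the thread):

```matlab
% Illustration only: one 350-step observation trained with 'SequenceLength' = 100.
seqSteps = 350;
chunkLen = 100;
numChunks = ceil(seqSteps / chunkLen);             % 4 sub-sequences per pass
lastChunk = seqSteps - (numChunks - 1) * chunkLen; % 50 steps, padded up to 100
% The hidden state is carried from each sub-sequence to the next (stateful).
```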
More Answers (2)
Ieuan Evans
25 September 2018
Edited: Ieuan Evans, 25 September 2018
Hi,
The number of hidden units corresponds to the amount of information remembered between time steps (the hidden state). The hidden state can contain information from all previous time steps, regardless of the sequence length. If the number of hidden units is too large, then the layer might overfit to the training data.
The hidden state does not limit the number of time steps that are processed in an iteration. To split your sequences into smaller sequences for training, use the 'SequenceLength' option in trainingOptions.
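As a sketch of this distinction: the hidden-state size is set on the layer, while the sub-sequence length is set in trainingOptions. The input size of 3 features and the regression head below are assumed for illustration only.

```matlab
numHiddenUnits = 100;         % size of the hidden state vector,
                              % NOT the number of time steps processed
layers = [ ...
    sequenceInputLayer(3)     % 3 features per time step (assumed)
    lstmLayer(numHiddenUnits)
    fullyConnectedLayer(1)
    regressionLayer];
```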
If you specify the sequence length as a positive integer, then the software pads the sequences in each mini-batch to have the same length as the longest sequence, and then splits them into smaller sequences of the specified length. If splitting occurs, then the function creates extra mini-batches.
You can use this option if the full sequences do not fit in memory. Alternatively, try reducing the number of sequences per mini-batch by setting the 'MiniBatchSize' option to a lower value.
If you specify the sequence length as a positive integer, then the software processes the smaller sequences in consecutive iterations. The network also updates the network state between the split sequences.
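A minimal sketch of these options (the solver, mini-batch size, and chunk length are assumed example values, not recommendations):

```matlab
options = trainingOptions('adam', ...
    'MiniBatchSize', 16, ...     % fewer sequences per mini-batch reduces memory use
    'SequenceLength', 100, ...   % pad to the longest, then split into 100-step chunks
    'SequencePaddingValue', 0);  % value used for the padded time steps
```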
MB Sylvest
11 March 2019
Dear Ieuan Evans
It is not fully clear to us how LSTM is implemented in MATLAB. LSTM is well established in Keras, so for us to use MATLAB we really need some more information. Could you please clarify the following? I think it is best done with an example: https://uk.mathworks.com/help/deeplearning/examples/time-series-forecasting-using-deep-learning.html
Could you clarify:
1) What is the apparent mini-batch size in the above example, given that changing the mini-batch size has no effect? My guess is that there is only one mini-batch, equal to the full length of the (longest) sequence?
2) In the above example, is this a stateful implementation? And is it always a stateful implementation?
Kind regards Mads
1 Comment
Ieuan Evans
14 March 2019
Hi,
In the forecasting example, there is only one observation (a single time series) so the mini-batch size setting has no effect. If the total number of observations is less than or equal to the mini-batch size, then the network processes all the observations in a single iteration.
When specifying the SequenceLength option as an integer, the network is stateful. The network does not reset the state when mini-batches have been created by splitting the observations. The network resets the state when the mini-batch contains a new set of observations.
When the SequenceLength option is 'shortest' or 'longest', then the network is stateless. The network resets the state when the mini-batch contains a new set of observations.
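The two behaviours described above might be contrasted like this (all options other than 'SequenceLength' omitted for brevity):

```matlab
% Stateful: an integer splits long sequences; the state carries across the splits.
optsStateful  = trainingOptions('adam', 'SequenceLength', 100);

% Stateless: 'longest' (the default) or 'shortest' never splits a sequence;
% the state is reset whenever a mini-batch contains a new set of observations.
optsStateless = trainingOptions('adam', 'SequenceLength', 'longest');
```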