LSTM (more input time steps than hidden units): how does MATLAB handle this?

Hello,
I'm wondering what happens if an input sequence has more time steps than the LSTM layer has hidden units.
For example: what happens if my input sequences have 5 time steps but the network's LSTM layer has only 2 hidden units? How can the network learn?
Does it just feed in the first two time steps? What happens to the last three?
Code example:
numfeatures = 3;   % e.g. 3 features per time step
layers = [ ...
    sequenceInputLayer(numfeatures)
    lstmLayer(2,'OutputMode','last')
    fullyConnectedLayer(1)
    regressionLayer];
Can someone help?
Thanks.

Accepted Answer

Asvin Kumar on 19 May 2020
You’ve shown in your diagram that an LSTM unrolls, with each cell connected to the next (except the last cell, of course). The connection between two cells carries forward a vector of data, and the length of that vector is determined by the 'NumHiddenUnits' property passed when calling lstmLayer.

As mentioned here, the number of hidden units does not directly have anything to do with the sequence length of the data. An LSTM unrolls to the length of the input signal as required. People choose a number of hidden units sufficient to capture the information in the typical sequence length for their use case.

Have a look at this example; there’s no mention of the sequence length whatsoever. This is a common misconception, and I hope that clarifies it.
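To make that concrete, here is a minimal sketch (it assumes the Deep Learning Toolbox and uses made-up random data and example values such as numfeatures = 3 and 20 training sequences). It shows that the same 2-hidden-unit LSTM layer trains on sequences of 5 time steps just as well as on sequences of 50 time steps; the layer simply unrolls to whatever length each sequence has:

numfeatures = 3;   % features per time step (example value)
layers = [ ...
    sequenceInputLayer(numfeatures)
    lstmLayer(2,'OutputMode','last')   % 2 hidden units = size of the state vector, not the sequence length
    fullyConnectedLayer(1)
    regressionLayer];

% Two toy data sets with different sequence lengths: 5 steps and 50 steps.
XShort = arrayfun(@(~) rand(numfeatures,5),  1:20, 'UniformOutput', false)';
XLong  = arrayfun(@(~) rand(numfeatures,50), 1:20, 'UniformOutput', false)';
Y      = rand(20,1);   % one regression target per sequence

options = trainingOptions('adam','MaxEpochs',5,'Verbose',false);

% Both calls run without error: the unrolled length follows the data,
% and every time step (5 or 50) is fed through the same LSTM cell.
netShort = trainNetwork(XShort, Y, layers, options);
netLong  = trainNetwork(XLong,  Y, layers, options);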

Additional Answers (0)
