Potential data dimension mismatch in lstmLayer with OutputMode set to 'sequence'?

4 views (last 30 days)
Liangwu Yan on 11 January 2023
Answered: Ben on 16 March 2023
From the lstmLayer doc page (https://www.mathworks.com/help/deeplearning/ref/nnet.cnn.layer.lstmlayer.html), when the output mode is set to 'sequence' (the default), the layer outputs the hidden state at every time step, i.e. the complete sequence.
While reading the MATLAB example Sequence-to-Sequence Regression Using Deep Learning (https://www.mathworks.com/help/deeplearning/ug/sequence-to-sequence-regression-using-deep-learning.html), I got confused about the data dimensions between the lstmLayer() and the fullyConnectedLayer() that follows it.
My question is: since the sequence lengths vary (as the example's bar plots of sequence lengths show), the number of time steps the LSTM is unrolled over differs from sequence to sequence (per the usual RNN definition). So for different sequence lengths, the complete sequence output by lstmLayer() has a different size. The lstmLayer is followed by a fullyConnectedLayer, which seems to imply that the sizes of its weights and bias would have to change. How can this work? Moreover, suppose a very long sequence comes in at prediction time; then the complete sequence output by the LSTM would be extremely long, which is not compatible with the weight and bias matrices?
Your answer would be greatly appreciated, thank you! :)
From a newbie in RNN

Answers (1)

Ben on 16 March 2023
The LSTM layer and the fully connected layer use the same weights and biases for all of the sequence elements. The LSTM works by using its weights and biases to do two things: update the internal states HiddenState and CellState from the previous time step, and compute the output at the current time step. In particular, it can compute these values using only the values at the current and the previous time step, so it doesn't need to maintain a history of states for every time step in the sequence.
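
To make this concrete, here is a minimal sketch, assuming the dlnetwork workflow and R2021a+ name=value syntax (on older releases use lstmLayer(numHiddenUnits,'OutputMode','sequence')). It feeds two sequences of different lengths through the same network:

% Per-time-step regression over sequences of any length.
numFeatures = 3;      % channels per time step
numHiddenUnits = 8;   % LSTM hidden state size
numResponses = 1;     % outputs per time step

layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits, OutputMode="sequence")
    fullyConnectedLayer(numResponses)];
net = dlnetwork(layers);

% Two observations with different sequence lengths ("CBT" = channel x batch x time).
xShort = dlarray(rand(numFeatures, 1, 10),  "CBT");
xLong  = dlarray(rand(numFeatures, 1, 500), "CBT");

size(predict(net, xShort))      % 1 x 1 x 10  -> one response per time step
size(predict(net, xLong))       % 1 x 1 x 500 -> same network, longer output
size(net.Layers(3).Weights)     % 1 x 8, independent of sequence length

The fully connected layer's weight matrix is numResponses-by-numHiddenUnits (here 1-by-8) and is applied independently to the hidden state at every time step, so a 500-step sequence simply yields 500 outputs from the same 1-by-8 matrix.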
