
How are the two layers "sequenceInputLayer(num_channels) & bilstmLayer(HU,OutputMode="sequence")" connected to each other?

12 views (last 30 days)
Hello, I would like to know how the connection between a sequenceInputLayer and an lstmLayer or bilstmLayer is implemented.
Usually BiLSTM layers have separate weight matrices for each channel of the input.
In a typical BiLSTM network with a sequenceInputLayer followed by a bilstmLayer, each unit of the bilstmLayer would be connected to every channel of the sequenceInputLayer. That is, each unit receives input from all channels, so at each time step it sees the entire feature vector.
Is this correct? Please feel free to point me to the documentation on this topic.
Thank you very much and best regards
Chris

Answers (2)

Debadipto on 23 Apr 2024
Yes, your understanding of how a BiLSTM (bidirectional long short-term memory) layer connects to a sequence input layer is generally correct. When a sequence input layer is followed by a BiLSTM layer, the input sequence is fed into both the forward and the backward LSTM component of the BiLSTM. Each unit in these components receives the full feature vector at every time step; the two components simply process the sequence in opposite directions, the forward LSTM from start to end and the backward LSTM from end to start.
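To make the connection concrete, here is a minimal sketch of one forward-LSTM gate at a single time step. The variable names are illustrative, not the toolbox's internal code; the point is that the gate's weight matrix is dense, so every hidden unit mixes all input channels:

```matlab
% Minimal sketch of one forward-LSTM gate at a single time step.
% Variable names are illustrative; this is not the toolbox's internal code.
inputSize      = 3;   % number of input channels
numHiddenUnits = 4;

x_t    = randn(inputSize, 1);       % full channel vector at time step t
h_prev = zeros(numHiddenUnits, 1);  % hidden state from the previous step

W_i = randn(numHiddenUnits, inputSize);       % dense input weights (input gate)
R_i = randn(numHiddenUnits, numHiddenUnits);  % recurrent weights
b_i = zeros(numHiddenUnits, 1);

% Every row of W_i spans all input channels, so every hidden unit
% receives the entire feature vector x_t:
i_t = 1 ./ (1 + exp(-(W_i*x_t + R_i*h_prev + b_i)));  % input-gate activation
```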
Regarding the documentation, the specifics of how these connections are implemented can vary depending on the software or framework you are using, so the documentation of the framework in question is the best reference.
1 Comment
Christian Holz on 26 Apr 2024
Edited: Christian Holz on 26 Apr 2024
Thank you very much for your answer.
I agree with your description, but unfortunately I cannot find any reference in the MathWorks documentation. A corresponding description of the implementation would confirm our assumption.



Ieuan Evans on 26 Apr 2024
Hi Christian,
For BiLSTM layers in MATLAB, the weights for each of the input channels, and for both the forward and backward parts of the layer, are concatenated into a single matrix. For example, the InputWeights property is an 8*NumHiddenUnits-by-InputSize matrix, where NumHiddenUnits and InputSize are the number of hidden units and the number of input channels, respectively.
In this case, the input weight matrix is a concatenation of the eight input weight matrices for the components (gates) of the bidirectional LSTM layer. The eight matrices are concatenated vertically in this order (the sketch after this list verifies the resulting sizes):
  • Input gate (Forward)
  • Forget gate (Forward)
  • Cell candidate (Forward)
  • Output gate (Forward)
  • Input gate (Backward)
  • Forget gate (Backward)
  • Cell candidate (Backward)
  • Output gate (Backward)
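You can verify these dimensions yourself with a small network. In this sketch the channel and hidden-unit counts are illustrative, while the InputWeights, RecurrentWeights, and Bias properties and their sizes are as documented for bilstmLayer:

```matlab
% Build a small network and inspect the BiLSTM weight sizes.
numChannels    = 3;   % illustrative number of input channels
numHiddenUnits = 5;   % illustrative number of hidden units

layers = [
    sequenceInputLayer(numChannels)
    bilstmLayer(numHiddenUnits, OutputMode="sequence")
    ];

net  = dlnetwork(layers);   % initializes the learnable parameters
lstm = net.Layers(2);

size(lstm.InputWeights)      % 8*NumHiddenUnits-by-InputSize       -> 40x3
size(lstm.RecurrentWeights)  % 8*NumHiddenUnits-by-NumHiddenUnits  -> 40x5
size(lstm.Bias)              % 8*NumHiddenUnits-by-1               -> 40x1
```

Because each of the 8*NumHiddenUnits rows of InputWeights spans all InputSize columns, every gate of every hidden unit is connected to every input channel, which is exactly the assumption in the question.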
For a diagram that shows how data flows through a BiLSTM layer, see https://uk.mathworks.com/help/deeplearning/ug/create-bilstm-function.html
