Complete transformer model (Encoder + Decoder + Interconnections)

Views: 34 (last 30 days)
WIll Serrano, 2 August 2024
Last comment: WIll Serrano, 5 October 2024
Hello
I am wondering whether there is already a MATLAB keyboard warrior who has coded (in MATLAB) a full transformer model:
  1. Inputs: Input Embedding + Positional Encoding
  2. Encoder: Multihead Attention + Add & Normalisation + Feedforward + Add & Normalisation
  3. Outputs: Output Embedding + Positional Encoding
  4. Decoder: Masked Multihead Attention + Add & Normalisation + Multihead Attention + Add & Normalisation + Feedforward + Add & Normalisation
  5. Final: Linear and Softmax.
Including all the interconnections between them.
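For reference, the core blocks in steps 1 and 2 (sinusoidal positional encoding, scaled dot-product attention, and an Add & Normalisation step) can be sketched in base MATLAB as below. All dimensions, weights, and variable names here are illustrative assumptions, not a finished implementation; multi-head splitting, masking, and the feedforward sublayer are omitted.

```matlab
% Minimal single-head transformer-block sketch in base MATLAB.
% Sizes are arbitrary and weights are random, for illustration only.
dModel  = 8;          % embedding dimension (assumed)
nTokens = 5;          % sequence length (assumed)

% 1. Sinusoidal positional encoding (sin on even, cos on odd columns)
pos    = (0:nTokens-1)';
k      = 0:2:dModel-1;
angles = pos ./ (10000 .^ (k / dModel));   % implicit expansion, R2016b+
PE = zeros(nTokens, dModel);
PE(:, 1:2:end) = sin(angles);
PE(:, 2:2:end) = cos(angles);

% 2. Scaled dot-product attention (one head)
X  = randn(nTokens, dModel) + PE;          % embedded input + positions
Wq = randn(dModel); Wk = randn(dModel); Wv = randn(dModel);
Q = X * Wq;  K = X * Wk;  V = X * Wv;
scores = Q * K' / sqrt(dModel);
A = exp(scores - max(scores, [], 2));      % row-wise softmax,
A = A ./ sum(A, 2);                        % numerically stabilised
attnOut = A * V;

% 3. Add & Normalisation (residual connection + per-token layer norm)
Z = X + attnOut;
Z = (Z - mean(Z, 2)) ./ std(Z, 0, 2);
```

Stacking the feedforward sublayer, the decoder's masked attention, and the final linear + softmax on top of this pattern, with the residual "Add & Normalisation" repeated after each sublayer, gives the full architecture in the list above.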
Thank you
Will

Answers (1)

Yash Sharma, 5 August 2024
Hi Will,
You can take a look at the following File Exchange submission.
  Comments: 2
WIll Serrano, 7 August 2024
Hello Yash
Thank you for your answer.
I read that one; it is based on a pre-trained transformer and does not directly represent the transformer components. It also provides the same functionality as a standard LSTM for text classification.
Transformers with attention are generally acknowledged to be superior to LSTM-based deep learning; however, I have yet to verify this myself.
Thank you
Will
WIll Serrano, 5 October 2024
Since nobody seems to have answered, I have cracked the code myself.

