How to obtain the word embedding vector for each word in a sentence using pre-trained BERT in MATLAB
Views: 8 (last 30 days)
Hello,
I have a question on how to obtain the word embedding vector for each word in a sentence using pre-trained BERT in MATLAB. I successfully loaded BERT and tokenized the words in the sentence, but I couldn't find example code on the MathWorks website for getting each word's embedding vector, as with word2vec.
[net,tokenizer] = bert;
str = "Bidirectional Encoder Representations from Transformers";
words = wordTokenize(tokenizer,str)
% Then what...?
I would be grateful if anyone could help with this.
Answers (1)
Ganesh
31 December 2023
I understand that you want to generate word embeddings with a BERT model in MATLAB. The first step is to use the "encode" function, which follows the same pattern as your code:
[net,tokenizer] = bert;
str = "Bidirectional Encoder Representations from Transformers";
tokenCodes = encode(tokenizer,str); % numeric token codes, not embedding vectors
Note that "encode" returns token codes (integer indices into the tokenizer's vocabulary), not embedding vectors: each word is mapped to one or more integer IDs. To turn these codes into a contextual embedding vector for each token, pass them through the network with "predict". You can also use the "decode" function to map token codes back to text.
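To go from token codes to per-token vectors, run the codes through the network. Below is a minimal sketch, assuming the Text Analytics Toolbox (R2023b or later) interface of the "bert" function, that "encode" also returns segment indices as a second output, and that the network's inputs are the token codes, attention mask, and segment indices; check net.InputNames to confirm the actual names and order on your release.

```matlab
% Sketch: per-token contextual embeddings from pre-trained BERT.
% Assumes the R2023b+ Text Analytics Toolbox "bert" interface.
[net,tokenizer] = bert;
str = "Bidirectional Encoder Representations from Transformers";

% Encode the text into numeric token codes and segment indices.
[tokenCodes,segments] = encode(tokenizer,str);

% Format a single observation as dlarrays with channel (C) and time (T) dims.
X    = dlarray(tokenCodes{1},"CT");
seg  = dlarray(segments{1},"CT");
mask = dlarray(ones(size(X)),"CT"); % all tokens attended (no padding)

% Input order is an assumption here; verify against net.InputNames.
Y = predict(net,X,mask,seg);

% Y has one column per token (e.g., 768-by-NumTokens for BERT-base);
% each column is the contextual embedding of the corresponding token.
size(Y)
```

Because BERT's tokenizer is subword-based, one word may span several token codes (and thus several columns of Y); to get a single vector per word, you could average the columns belonging to that word's subword tokens.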
Kindly refer to the MathWorks documentation for the "bert", "encode", and "decode" functions to learn more. Note that using the "bert" model in MATLAB requires the Text Analytics Toolbox.
Hope this helps!