Text Analytics Toolbox Model for BERT-Large Network
Pretrained BERT-Large Network for MATLAB
Downloads: 135
Updated:
2025/10/15
BERT-Large is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. The model has 24 self-attention layers and a hidden size of 1024.
To load a BERT-Large model, you can run the following code:
[net, tokenizer] = bert(Model="large");
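Once loaded, the tokenizer converts raw text into the token codes the network expects. The sketch below is a minimal example assuming the documented `encode` method of the returned tokenizer; the exact downstream use of the codes depends on your NLP task.

```matlab
% Load the pretrained BERT-Large network and its tokenizer
[net, tokenizer] = bert(Model="large");

% Encode a sentence into token codes (includes [CLS]/[SEP] markers)
str = "Text analytics with BERT.";
tokenCodes = encode(tokenizer, str);
```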
MATLAB Release Compatibility
Created with:
R2023b
Compatible with R2023b to R2026a
