
importNetworkFromONNX error opening LLM

9 views (last 30 days)
joel brinton on 4 December 2023
Commented: joel brinton on 22 December 2023
I'm trying to import an ONNX model into MATLAB. First, I exported a Large Language Model (LLM) to ONNX from Hugging Face using the Optimum library. Next, I imported it into MATLAB with the Deep Learning Toolbox Converter for ONNX Model Format. However, during import I get the following error:
>> net = importNetworkFromONNX("model.onnx")
Error using reshape
Number of elements must not change. Use [] as one of the size inputs to automatically calculate the appropriate size for that dimension.
Error in nnet.internal.cnn.onnx.fcn.GraphTranslation/getInitializerValue (line 129)
data = reshape(data, dimVec); % Apply MATLAB shape
...
Error in importNetworkFromONNX (line 77)
Network = nnet.internal.cnn.onnx.importNetworkFromONNX(modelfile, varargin{:});
I think the issue has to do with the lack of support for the external data file, which is required for models greater than 2 GB. ONNX models are built on protobuf, which has a 2 GB limit. When a model exceeds 2 GB, ONNX moves the weights into a separate raw data file, in this case called model.onnx_data. I've noticed that importNetworkFromONNX() doesn't even attempt to open the associated onnx_data file before it aborts.
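As a quick sanity check on the Python side (a minimal sketch, assuming only the onnx package is installed and the file names match the ones above), the initializers that live in the external data file can be listed like this:

import onnx
from onnx.external_data_helper import uses_external_data

# Load just the graph structure; don't pull the external weight file into memory.
model = onnx.load("model.onnx", load_external_data=False)

# Every initializer whose payload is stored outside the .onnx protobuf.
external = [t.name for t in model.graph.initializer if uses_external_data(t)]
print(len(external), "initializers stored in external data")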
How can we get large ONNX model support into MATLAB? I've scoured the comments already and, for some reason, no one else seems to have run into this issue.
thanks!

Answers (1)

Ashutosh Thakur on 22 December 2023
Hi Joel,
I can understand that you are facing issues while importing an ONNX model into MATLAB.
Here are a few suggestions that may help:
  • If possible, split the large model into multiple smaller sub-models so that each stays under the 2 GB limit, then run them sequentially, passing the output of one as the input of the next (see the sketch after this list).
  • Based on the error message, it also looks as though the reshape operation is being applied incorrectly; the issue may lie in how this particular ONNX network uses the Reshape operator.
  • If the problem persists, I suggest you reach out to MathWorks Technical Support here: https://www.mathworks.com/support/contact_us.html
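As a rough illustration of the first point, here is a minimal Python sketch using onnx.utils.extract_model to carve out a smaller sub-model. The tensor names below are placeholders; inspect your own graph (for example with Netron) to pick real split points, and note that handling a model with external data may require a recent onnx version:

import onnx.utils

# Extract the subgraph between the chosen boundary tensors into its own file.
# "input_ids" and "/model/layers.0/output_0" are hypothetical names for illustration.
onnx.utils.extract_model(
    "model.onnx",           # full model (may reference model.onnx_data)
    "submodel_part1.onnx",  # smaller piece to try importing into MATLAB
    input_names=["input_ids"],
    output_names=["/model/layers.0/output_0"],
)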
I hope this helps!
Thanks.
1 Comment
joel brinton on 22 December 2023
Thank you Ashutosh. I will try to see if I can export just one layer of the LLM and keep it under the 2 GB limit. I hadn't considered that yet.
I think the error occurs because some of the Reshape inputs are not being loaded (due to the external data file not being supported). Here is the ONNX node description of the Reshape operation:
node {
  output: "/model/Constant_25_output_0"
  name: "/model/Constant_25"
  op_type: "Constant"
  attribute {
    name: "value"
    t {
      dims: 1
      data_type: 7
      name: "/model/Constant_25_attr::value"
      raw_data: "\377\377\377\377\377\377\377\377"
    }
    type: TENSOR
  }
}
node {
  input: "/model/Concat_3_output_0"
  input: "/model/Constant_25_output_0"
  output: "/model/Reshape_2_output_0"
  name: "/model/Reshape_2"
  op_type: "Reshape"
}
As you can see, the Reshape operation itself is used in a standard way. But the dimension-vector input comes from raw_data, which I'm assuming is not supported. Keeping the model under 2 GB should prevent my ONNX exporter from using raw_data.
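Incidentally, data_type 7 is INT64 and the eight 0xFF bytes in raw_data decode to the single int64 value -1, so the Reshape target shape is just [-1] (flatten). A quick way to confirm, sketched in Python with numpy:

import numpy as np

# The eight bytes from the Constant node's raw_data field above.
raw = b"\xff\xff\xff\xff\xff\xff\xff\xff"

# ONNX data_type 7 is INT64; raw_data is little-endian.
print(np.frombuffer(raw, dtype=np.int64))  # [-1]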
If importing the model one layer at a time doesn't work, I'll reach out for support.
