exportONNXNetwork/importONNXNetwork functions not working correctly

Views: 2 (last 30 days)
Luke Hubbard on 27 March 2023
Commented: Sivylla Paraskevopoulou on 4 April 2023
I'm having trouble exporting to ONNX. As a test case, I tried https://www.mathworks.com/help/deeplearning/ug/classify-sequence-data-using-lstm-networks.html. I added an export/import section between the training and testing blocks and got a data format error on import.
exportONNXNetwork doesn't raise any issues, but importONNXNetwork does.
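For reference, a minimal sketch of the round trip in question (the file name matches the import call below; the variable net is assumed to be the trained network from the linked example, and is not taken verbatim from my script):

% Export the trained LSTM from the sequence-classification example.
% "net" is assumed to be the trained network variable from that example.
exportONNXNetwork(net,'onnxEx1.onnx')

% Re-importing without specifying data formats triggers the data format error.
netImported = importONNXNetwork('onnxEx1.onnx')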
If I import with the following option set,
importONNXNetwork('onnxEx1.onnx','OutputDataFormats','TBC')
it works fine. All is well, right? Except that onnxruntime in other environments doesn't work: trying to load the ONNX file with onnxruntime in another environment (Python, Java) results in a 'ShapeInferenceError' complaining that the tensor must have rank 3.
It must be that exportONNXNetwork is failing to define something that is required. Are there any known workarounds for this issue? I've tried this locally with R2022b and in MATLAB Online, which uses R2023a. Both give me the same issue.
1 Comment
Sivylla Paraskevopoulou on 4 April 2023
Does it help to specify the BatchSize name-value argument when you use the exportONNXNetwork function?
Also, if you are planning to use the exported network in TensorFlow, you can use the exportNetworkToTensorFlow function to export directly to TensorFlow without first converting to ONNX format.
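For example, something like the following (a sketch only; the batch size of 1 and the TensorFlow package name are assumed values, not taken from your post):

% Export with a fixed batch size so the ONNX model has a fully defined input shape.
exportONNXNetwork(net,'onnxEx1.onnx','BatchSize',1)

% Alternatively, export straight to a TensorFlow model package and skip ONNX.
exportNetworkToTensorFlow(net,'onnxEx1_tf')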


Answers (0)

