How do I solve this IndexError in ONNX Model Predict block?

3 views (last 30 days)
翼 on 20 Jan 2025
Commented: 24 Jan 2025
I created a sample Simulink model with the ONNX Model Predict block.
I want to run the simulation with a simple ONNX model that I created with PyTorch.
I also set the Input and Output tabs of the block parameters as shown in the images below.
Input tab
Output tab
Then I ran the simulation, but I got this error.
Even though the number of inputs matches the ONNX model, this happens...
Does this ring a bell? I couldn't figure out what I'm missing...
I'd appreciate it if you could give me your advice. How do I solve this?
MATLAB System block 'untitled/ONNX Model Predict/ONNX Model Block' error when calling 'getOutputSizeImpl' method of
'nnet.pycoexblks.
Call to the Python model predict() function 'py.ONNXModelBlock.predict(...)' failed.
The Python error message is: == START OF PYTHON ERROR MESSAGE ==
Python error: IndexError: list index out of range
== END OF PYTHON ERROR MESSAGE ==.
Terminal width or dimension error.
' Output Terminal 1' in 'untitled/ONNX Model Predict/In7' is a 1-dimensional vector with 1 element.
My Python code to create a simple ONNX model (PyTorch)
This is how I created my ONNX model. I just wanted to start with something simple first.
import torch
import torch.nn as nn

class EmptyModel(nn.Module):
    def __init__(self):
        super(EmptyModel, self).__init__()
        # No trainable parameters, but add a linear layer to match Simulink requirements
        self.linear = nn.Linear(7, 2, bias=False)
        with torch.no_grad():
            self.linear.weight.fill_(0.0)

    def forward(self, x):
        # Returns the first two elements of the input as is, without any computation
        return x[:, :2]

model = EmptyModel()
dummy_input = torch.randn(1, 7, dtype=torch.float32)
torch.onnx.export(
    model,
    dummy_input,
    "empty_model.onnx",
    export_params=True,
    opset_version=11,
    do_constant_folding=True,
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={
        "input": {0: "batch_size"},
        "output": {0: "batch_size"},
    },
)
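As a side note, the exported graph can be inspected with onnxruntime (assuming it is installed in the same Python environment) to confirm how many inputs and outputs it actually exposes; the shapes shown in the comments are just what the export settings above should produce:

import numpy as np
import onnxruntime as ort

# Inspect the exported model: how many inputs/outputs does the graph expose?
sess = ort.InferenceSession("empty_model.onnx")
print([(i.name, i.shape) for i in sess.get_inputs()])    # e.g. [('input', ['batch_size', 7])]
print([(o.name, o.shape) for o in sess.get_outputs()])   # e.g. [('output', ['batch_size', 2])]

# Run one prediction with a dummy [1, 7] batch.
x = np.random.randn(1, 7).astype(np.float32)
print(sess.run(None, {"input": x})[0].shape)             # (1, 2)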
Sorry that Japanese is included in my attached images...
I look forward to your answer.
Best,

Answers (2)

Don Mathis on 21 Jan 2025
Edited: Don Mathis on 21 Jan 2025
Your PyTorch model (and ONNX model) actually takes only 1 input of shape [N,7], not 7 separate scalar inputs. The Simulink block passes your 7 inputs as 7 separate inputs to the ONNX model. To fix this, you could either concatenate your 7 inputs into a vector, or make a PyTorch model that takes 7 separate inputs.
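For example, a minimal sketch of the second option (a model with 7 separate inputs and 2 separate outputs) might look like the following; the class name, toy computation, and file name are only placeholders, not the asker's actual model:

import torch
import torch.nn as nn

class SevenInputModel(nn.Module):
    # Takes 7 separate tensors and returns 2 separate outputs,
    # matching the 7 input ports and 2 output ports of the Simulink block.
    def forward(self, x1, x2, x3, x4, x5, x6, x7):
        y1 = x1 + x2                      # toy computation; replace with the real network
        y2 = x3 * x4 + x5 + x6 + x7
        return y1, y2

model = SevenInputModel()
dummy_inputs = tuple(torch.randn(1, 1, dtype=torch.float32) for _ in range(7))
torch.onnx.export(
    model,
    dummy_inputs,
    "seven_input_model.onnx",
    opset_version=11,
    input_names=[f"in{i+1}" for i in range(7)],
    output_names=["out1", "out2"],
)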
  2 Comments
翼 on 22 Jan 2025
I made a PyTorch model that takes 7 separate inputs and produces 2 separate outputs.
This is how I solved it. Thank you!
I'm just curious about something:
I suppose that if I have a larger PyTorch model and use it in the "ONNX Model Predict" block, the simulation performance of the whole Simulink model might lag.
For instance, say you have a vehicle control model like the image below.
Then you replace the "VCU" model block with an ONNX Model Predict block.
It's like switching to an AI surrogate model (built with PyTorch or TensorFlow, etc.).
This is off topic, but what are your thoughts on that?
Do you have any ideas for speeding up the simulation of the ONNX model?
Best,
Don Mathis on 22 Jan 2025
The best you can do to speed up the ONNX block is to make sure your Python installation runs as fast as possible:
  • In MATLAB, use pyenv("ExecutionMode","InProcess").
  • If you have a GPU:
    • make sure your Python environment has onnxruntime-gpu installed,
    • make sure it supports CUDAExecutionProvider,
    • make sure that in Python (outside MATLAB) you see a speedup when using the GPU vs the CPU (see the sketch below).
You could also try using the PyTorch Model Predict block directly, instead of converting to ONNX. There may not be a speed difference, but it may be easier to get the GPU working in PyTorch.
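For that last GPU-vs-CPU check, a rough timing sketch in plain Python could look like this (the run count is arbitrary, and a model as small as empty_model.onnx will not show any GPU benefit, so substitute the real model):

import time
import numpy as np
import onnxruntime as ort

print(ort.get_available_providers())    # should include 'CUDAExecutionProvider' if onnxruntime-gpu is set up

x = {"input": np.random.randn(1, 7).astype(np.float32)}

for providers in (["CPUExecutionProvider"], ["CUDAExecutionProvider"]):
    sess = ort.InferenceSession("empty_model.onnx", providers=providers)
    sess.run(None, x)                    # warm-up call
    t0 = time.perf_counter()
    for _ in range(1000):
        sess.run(None, x)
    print(providers[0], time.perf_counter() - t0)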



Don Mathis on 22 Jan 2025
Did you get a warning message before the error message that said something like this?
"Warning: Number of inputs specified on the Inputs tab must match the number of inputs specified in the Python model file. "
  3 Comments
Don Mathis on 24 Jan 2025
I expected the software to warn you that your Simulink model was passing too many inputs to your ONNX block.
翼 on 24 Jan 2025
Thank you for telling me.
I didn't realize that. I wonder if there is a better way to handle this....
Do you have any ideas for avoiding passing too many inputs to the ONNX block?
What would you do?


Categories

Find more on Deep Learning with Simulink in Help Center and File Exchange

Release

R2024b
