Error exporting trained neural network model to ONNX format

Faraz Amjad on 1 Feb 2021
Answered: Shashank Gupta on 4 Feb 2021
Hello everyone,
I have trained a model using the Deep Learning Toolbox in MATLAB 2020b. From the toolbox, I exported the following network to the workspace:
net =
Neural Network
name: 'Function Fitting Neural Network'
userdata: (your custom info)
dimensions:
numInputs: 1
numLayers: 2
numOutputs: 1
numInputDelays: 0
numLayerDelays: 0
numFeedbackDelays: 0
numWeightElements: 157
sampleTime: 1
connections:
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
subobjects:
input: Equivalent to inputs{1}
output: Equivalent to outputs{2}
inputs: {1x1 cell array of 1 input}
layers: {2x1 cell array of 2 layers}
outputs: {1x2 cell array of 1 output}
biases: {2x1 cell array of 2 biases}
inputWeights: {2x1 cell array of 1 weight}
layerWeights: {2x2 cell array of 1 weight}
functions:
adaptFcn: 'adaptwb'
adaptParam: (none)
derivFcn: 'defaultderiv'
divideFcn: 'dividerand'
divideParam: .trainRatio, .valRatio, .testRatio
divideMode: 'sample'
initFcn: 'initlay'
performFcn: 'mse'
performParam: .regularization, .normalization
plotFcns: {'plotperform', 'plottrainstate', 'ploterrhist',
'plotregression', 'plotfit'}
plotParams: {1x5 cell array of 5 params}
trainFcn: 'trainbr'
trainParam: .showWindow, .showCommandLine, .show, .epochs,
.time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
.mu_inc, .mu_max
weight and bias values:
IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
methods:
adapt: Learn while in continuous use
configure: Configure inputs & outputs
gensim: Generate Simulink model
init: Initialize weights & biases
perform: Calculate performance
sim: Evaluate network outputs given inputs
train: Train network with examples
view: View diagram
unconfigure: Unconfigure inputs & outputs
evaluate: outputs = net(inputs)
To export the above network to ONNX format, I am using the ONNX converter add-on's exportONNXNetwork function.
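For reference, the export call is essentially the following (the output filename is just a placeholder):
exportONNXNetwork(net, 'fitnet.onnx')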
However, when I try to export the model, it throws the following error:
Error using nnet.internal.cnn.onnx.exportONNXNetwork>iValidateNetwork (line 62)
First argument must be a SeriesNetwork, DAGNetwork, dlnetwork, or layerGraph.
Error in nnet.internal.cnn.onnx.exportONNXNetwork>iValidateInputs (line 53)
NNTNetwork = iValidateNetwork(NNTNetwork);
Error in nnet.internal.cnn.onnx.exportONNXNetwork (line 29)
[NNTNetwork, Filename, NetworkName, OpsetVersion] = iValidateInputs(NNTNetwork, Filename, defaultOpset, varargin{:});
Error in exportONNXNetwork (line 40)
nnet.internal.cnn.onnx.exportONNXNetwork(Network, filename, varargin{:});
It seems that the add-on does not recognize the format of the network, even though I would have expected it to count as a series network, since it is a simple multilayer perceptron. Is there any workaround for this? I am trying to export the model to ONNX format so that it can be used from Python, and I do not see any other way to export it.
Any help would be much appreciated,
Thank You.

1 Answer

Shashank Gupta on 4 Feb 2021
Hi Faraz,
You are using a shallow neural network, and that network object is not one of the types listed in the error message, which is why you are getting the error. MATLAB currently does not have a straightforward way to convert shallow neural networks to ONNX; the exportONNXNetwork function only works with the deep learning network APIs (SeriesNetwork, DAGNetwork, dlnetwork, layerGraph). There are a couple of workarounds I can suggest. My first suggestion is to re-create the shallow network using the trainNetwork workflow and then pass that network to exportONNXNetwork; a sketch is shown below. Secondly, if you only need the model for deployment, there are other references you can use that are not specific to ONNX. Link is here.
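A minimal sketch of that first workaround, assuming X is an N-by-numFeatures matrix of predictors and Y is an N-by-1 vector of targets (the hidden-layer size, activation choice, and training options here are only illustrative, not a faithful recreation of your fitnet):
numFeatures = size(X, 2);
layers = [
    featureInputLayer(numFeatures)     % numeric feature input
    fullyConnectedLayer(10)            % hidden layer; size is illustrative
    tanhLayer                          % rough analogue of the shallow net's tansig
    fullyConnectedLayer(1)             % single regression output
    regressionLayer];
options = trainingOptions('adam', 'MaxEpochs', 200, 'Verbose', false);
deepNet = trainNetwork(X, Y, layers, options);   % returns a SeriesNetwork
exportONNXNetwork(deepNet, 'fitnet.onnx')        % now accepted by the exporter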
I hope this helps you.
Cheers.
