Unable to specify BatchSize in exportONNXNetwork
Hello!
I want to export a simple neural network from MATLAB to the ONNX format, so I use the exportONNXNetwork function. The documentation lists a name-value argument "BatchSize", which lets you fix the network's batch size; I need it set to 1.
exportONNXNetwork(net, 'network.onnx', "BatchSize", 1)
Unfortunately, I cannot set this name-value argument as described in the documentation (https://www.mathworks.com/help/deeplearning/ref/exportonnxnetwork.html): it is not defined anywhere inside the exportONNXNetwork function, and the call ends up throwing an error in iValidateInputs:
Error using nnet.internal.cnn.onnx.exportONNXNetwork>iValidateInputs (line 49)
'BatchSize' is not a recognized parameter. For a list of valid name-value pair arguments, see the documentation for this function.
Error in nnet.internal.cnn.onnx.exportONNXNetwork (line 29)
[NNTNetwork, Filename, NetworkName, OpsetVersion] = iValidateInputs(NNTNetwork, Filename, defaultOpset, varargin{:});
Error in exportONNXNetwork (line 38)
nnet.internal.cnn.onnx.exportONNXNetwork(Network, filename, varargin{:});
Error in eval_mlp (line 76)
exportONNXNetwork(net, 'network.onnx', "BatchSize", 1)
How can I export a neural network from MATLAB to the ONNX format with a fixed batch size?
1 Comment
Sivylla Paraskevopoulou
9 May 2022
The name-value argument BatchSize was added to the exportONNXNetwork function in R2022a. Which MATLAB version are you using?
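If the export script also has to run on releases older than R2022a, here is a minimal sketch (not an official workaround) that only passes BatchSize when the installed release supports it; it uses verLessThan and relies on R2022a corresponding to MATLAB version 9.12:
% Sketch: only pass BatchSize when the installed release supports it.
% R2022a corresponds to MATLAB version 9.12, so verLessThan('matlab','9.12')
% is true on older releases, where exportONNXNetwork does not accept BatchSize.
if verLessThan('matlab', '9.12')
    % Older release: export without a fixed batch size
    % (the batch dimension stays dynamic in the resulting ONNX file).
    exportONNXNetwork(net, 'network.onnx');
else
    % R2022a or newer: fix the batch size to 1 as documented.
    exportONNXNetwork(net, 'network.onnx', 'BatchSize', 1);
end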
Answers (0)