How do I export a neural network from MATLAB?

167 views (last 30 days)
MathWorks Support Team on 15 Feb 2017
Edited: Jon Cherrie on 21 Dec 2023
I have a neural network which I trained using MATLAB. I want to export the network so I can use it with other frameworks, for example Caffe. How do I do that?

Accepted Answer

MathWorks Support Team on 8 Dec 2023
Edited: MathWorks Support Team on 20 Dec 2023
You can export a Deep Learning Toolbox network or layer graph to TensorFlow and ONNX using the "exportNetworkToTensorFlow" and "exportONNXNetwork" functions, respectively.  For more information about exporting networks to external deep learning platforms, please see the following documentation page:
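For example, assuming a trained network in a workspace variable named "net" (the file and folder names below are placeholders chosen for this sketch):
% Assumes a trained network (SeriesNetwork, DAGNetwork, or dlnetwork) in "net";
% the output names are placeholders.
exportONNXNetwork(net, "trainedNet.onnx");            % write an ONNX model file
exportNetworkToTensorFlow(net, "trainedNetPackage");  % create a Python package containing a TensorFlow model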
Alternatively, you could export via the MATLAB Compiler SDK.
Using the MATLAB Compiler SDK, you can save the trained network as a MAT file, and write a MATLAB function that loads the network from the file, performs the desired computation, and returns the network's output.
You can then compile your MATLAB function into a shared library to be used in your C/C++, .NET, Java, or Python project.
You can find more information about MATLAB Compiler SDK in the following link:
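A deployable wrapper function might look like the sketch below; the MAT-file name "trainedNet.mat", the variable "net", and the function name are placeholders for illustration:
function Y = predictWithTrainedNet(X)
% Sketch of a deployable function for MATLAB Compiler SDK.
% "trainedNet.mat" and the variable "net" are placeholders.
persistent net
if isempty(net)
    data = load("trainedNet.mat", "net");   % load the trained network once
    net  = data.net;
end
Y = predict(net, X);                        % return the network's output for the input X
end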
Furthermore, the objects that MATLAB uses to represent neural networks are transparent, and you can therefore access all the information that describes your trained network.
For example, a trained convolutional neural network is represented by a "SeriesNetwork" object, and you can inspect the weights and biases of its layers directly:
convnet.Layers(2).Weights   % learnable weights of the second layer (e.g., a convolution layer)
convnet.Layers(2).Bias      % biases of the same layer
Then, using for example Caffe's MATLAB interface, you should be able to save a convolutional neural network as a Caffe model. The code for the MATLAB interface is in the following link and includes a classification demo that shows you how to use the interface.
Please note that the above code is not developed or supported by MathWorks Technical Support. If you have any questions about how to use the code, please contact the project's developers.
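If you need the raw parameters for a manual conversion (for example, to build the Caffe model yourself), one possible sketch is below; the variable "convnet" and the MAT-file name are placeholders:
% Collect the learnable weights and biases of each layer into a struct array
% and save them to a MAT file for use outside MATLAB.
layers = convnet.Layers;
params = struct([]);
for k = 1:numel(layers)
    if isprop(layers(k), "Weights") && ~isempty(layers(k).Weights)
        params(end+1).Name  = layers(k).Name;      %#ok<AGROW>
        params(end).Weights = layers(k).Weights;
        if isprop(layers(k), "Bias")
            params(end).Bias = layers(k).Bias;
        end
    end
end
save("layerParams.mat", "params");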

More Answers (2)

Maria Duarte Rosa on 25 Jun 2018
Edited: Jon Cherrie on 21 Dec 2023
The exportONNXNetwork function in Deep Learning Toolbox Converter for ONNX Model Format allows one to export a trained deep learning network to the ONNX™ (Open Neural Network Exchange) model format. The ONNX model can then be imported into other deep learning frameworks that support ONNX model import.
To use the network with TensorFlow, use the exportNetworkToTensorFlow function that is part of the Deep Learning Toolbox Converter for TensorFlow Models.
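If the receiving framework is particular about the ONNX operator set, exportONNXNetwork also accepts an "OpsetVersion" option. A minimal sketch, assuming a trained network in "net"; the file name is a placeholder and the range of supported opset versions depends on your release:
% Export to a specific ONNX opset version; check your release's documentation
% for the supported range.
exportONNXNetwork(net, "trainedNet.onnx", "OpsetVersion", 9);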
1 Comment
michael scheinfeild on 6 Aug 2018
I still have not had success importing it into C++ from ONNX; there are many compilation issues.



michael scheinfeild on 14 Apr 2019
After testing ONNX, I found that the output of the convolutions is not the same as in MATLAB.
