MATLAB to OpenVINO (Intel-Inteference)

Version 1.0.0 (1.84 MB) by Kevin Chng
Deploy and optimise your trained model for Intel processors
Downloads: 206
Updated: 18 Feb 2019

View License

Overview :

If you train your deep learning network in MATLAB, you can use OpenVINO to accelerate your solution on Intel®-based accelerators (CPUs, GPUs, FPGAs, and VPUs). However, this script does not compare OpenVINO with MATLAB's own deployment options (MATLAB Coder, HDL Coder); instead, it only gives you a rough idea, from a technical perspective, of how to complete the MATLAB-to-OpenVINO workflow.

Refer to the link below to understand OpenVINO:
https://software.intel.com/en-us/openvino-toolkit

Highlights :
Deep Learning and Prediction
How to export deep learning model to ONNX format
How to deploy a simple classification application in OpenVINO R4 (third-party software)
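As a minimal sketch of the export step, assuming a trained or pretrained network is available in the workspace (exporting requires the Deep Learning Toolbox Converter for ONNX Model Format support package; the network and file name below are illustrative):

```matlab
% Load a pretrained network as an example; any trained SeriesNetwork
% or DAGNetwork from Deep Learning Toolbox works the same way.
net = squeezenet;

% Export the network to ONNX format. OpenVINO's Model Optimizer can
% then convert the .onnx file into its Intermediate Representation
% (IR: .xml + .bin), which the Inference Engine loads for deployment.
exportONNXNetwork(net, 'squeezenet.onnx');
```

The generated .onnx file is then fed to OpenVINO's Model Optimizer on the command line; the exact script name and flags depend on the OpenVINO release, so consult the toolkit documentation for your version.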

Product Focus :
MATLAB
Deep Learning Toolbox
OpenVINO R4 (Third-party Software)

Written on 28 January 2018

Cite As

Kevin Chng (2024). MATLAB to OpenVINO (Intel-Inteference) (https://www.mathworks.com/matlabcentral/fileexchange/70330-matlab-to-openvino-intel-inteference), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2018b
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Sequence and Numeric Feature Data Workflows

Version Published Release Notes
1.0.0