yas-sim/openvino-onnx-importer-api

Demonstrates how to use the ONNX importer API in the Intel OpenVINO toolkit. The API allows users to load an ONNX model and run inference with the OpenVINO Inference Engine.


Overview

The ONNX importer API was introduced in version 2020.2 of the Intel(R) Distribution of OpenVINO(TM) toolkit. It allows users to load an ONNX model and convert it into an nGraph model. The nGraph model can then be imported into an Inference Engine CNNNetwork, so the familiar Inference Engine API can be used to run the model. This lets an ONNX model run on any OpenVINO-supported accelerator, such as a CPU, integrated GPU, VPU (Myriad), or FPGA.
This project demonstrates how to use the ONNX importer API.

ONNX model -(ONNX importer API)-> nGraph model -> CNNNetwork (an Inference Engine object)
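
The core of this flow fits in a few lines of C++. The following is a minimal sketch, not the project's exact source; the include paths and signatures reflect the 2020.2-era API and may differ slightly between OpenVINO releases:

#include <fstream>
#include <memory>

#include <inference_engine.hpp>
#include <ngraph/frontend/onnx_import/onnx.hpp>  // header location may vary by release

int main() {
    // ONNX model -> nGraph function, via the ONNX importer API
    std::ifstream model_file("model.onnx", std::ios::binary);
    std::shared_ptr<ngraph::Function> ng_func =
        ngraph::onnx_import::import_onnx_model(model_file);

    // nGraph function -> CNNNetwork (an Inference Engine object)
    InferenceEngine::CNNNetwork network(ng_func);

    // From here the familiar Inference Engine workflow applies; any
    // supported device name ("CPU", "GPU", "MYRIAD", ...) works here
    InferenceEngine::Core ie;
    InferenceEngine::ExecutableNetwork exec_net = ie.LoadNetwork(network, "CPU");
    InferenceEngine::InferRequest request = exec_net.CreateInferRequest();
    // ... fill input blobs, call request.Infer(), read output blobs ...
    return 0;
}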


1. Prerequisites

  • Intel(R) Distribution of OpenVINO(TM) toolkit 2020.2 or later (the ONNX importer API is not available in earlier releases)

2. Preparing required files (ONNX model, class label text file, input image)

This sample program supports image classification models such as GoogLeNet, ResNet, SqueezeNet, and MobileNet.

Linux

# Download an ONNX model (ResNet-50)
wget https://github.com/onnx/models/raw/master/vision/classification/resnet/model/resnet50-v2-7.onnx
cp resnet50-v2-7.onnx model.onnx
# Download a class label text file
wget https://raw.githubusercontent.com/HoldenCaulfieldRye/caffe/master/data/ilsvrc12/synset_words.txt
# Copy an input image
cp ${INTEL_OPENVINO_DIR}/deployment_tools/demo/car.png .

Windows

# Download an ONNX model (ResNet-50)
bitsadmin /transfer download https://github.com/onnx/models/raw/master/vision/classification/resnet/model/resnet50-v2-7.onnx %CD%\resnet50-v2-7.onnx
copy resnet50-v2-7.onnx model.onnx
# Download a class label text file
bitsadmin /transfer download https://raw.githubusercontent.com/HoldenCaulfieldRye/caffe/master/data/ilsvrc12/synset_words.txt %CD%\synset_words.txt
# Copy an input image
copy "%INTEL_OPENVINO_DIR%\deployment_tools\demo\car.png" .

3. How To Build

Linux

mkdir -p build
cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make
cd ..

Windows

mkdir build
cd build
cmake -G "Visual Studio 16 2019" -DCMAKE_BUILD_TYPE=Release ..
msbuild onnx-importer.sln /p:Configuration=Release
cd ..
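
For reference, a minimal CMakeLists.txt for a project like this could look as follows. This is a hedged sketch, not the repository's actual build file; main.cpp is a placeholder source name, and while the find_package names match the config packages shipped with the 2020.x toolkit, the exact variable names can differ by release:

cmake_minimum_required(VERSION 3.10)
project(onnx-importer)

# Inference Engine and nGraph config packages ship with OpenVINO;
# run the toolkit's setupvars script first so CMake can find them
find_package(InferenceEngine REQUIRED)
find_package(ngraph REQUIRED)

add_executable(onnx-importer main.cpp)
target_link_libraries(onnx-importer PRIVATE
    ${InferenceEngine_LIBRARIES} ${NGRAPH_LIBRARIES})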

4. How to Run

Linux

$ ./build/onnx-importer

Windows

C:> build\Release\onnx-importer.exe

5. Test Environment

  • Ubuntu 18.04 / Windows 10 1909
  • OpenVINO 2020.2 / 2020.3 LTS

Example output log

C:>build\release\onnx-importer.exe
Loading class label file : synset_words.txt
Importing an ONNX model : model.onnx
Converting an nGraph model into CNNNetwork
1 : 817 : 1112.83% n04285008 sports car, sport car
2 : 511 : 1082.38% n03100240 convertible
3 : 479 : 999.68% n02974003 car wheel
4 : 436 : 996.005% n02814533 beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon
5 : 656 : 875.367% n03770679 minivan

(Input image: car.png)
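
Each log line shows the rank, class ID, score, and the matching entry from synset_words.txt. The following sketch shows one way such a top-5 listing can be produced from a raw score vector; the function and variable names are illustrative, not taken from the project source:

#include <algorithm>
#include <fstream>
#include <iostream>
#include <numeric>
#include <string>
#include <vector>

// Illustrative helper: print the top-N classes from a raw score vector
// using the synset label file downloaded in step 2.
void print_top_n(const std::vector<float>& scores, size_t n) {
    std::ifstream label_file("synset_words.txt");
    std::vector<std::string> labels;
    std::string line;
    while (std::getline(label_file, line)) labels.push_back(line);

    // Rank class indices by descending score
    std::vector<size_t> idx(scores.size());
    std::iota(idx.begin(), idx.end(), 0);
    std::partial_sort(idx.begin(), idx.begin() + n, idx.end(),
        [&](size_t a, size_t b) { return scores[a] > scores[b]; });

    for (size_t i = 0; i < n; ++i) {
        std::cout << i + 1 << " : " << idx[i] << " : "
                  << scores[idx[i]] << " : " << labels[idx[i]] << std::endl;
    }
}

A caller would copy the model's output blob into a std::vector<float> and then invoke print_top_n(scores, 5).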

