
OpenVINO Inference Engine Node API addon (InferenceEngineJS)

WARNING: The main repository now lives in an Intel organization. All development has moved there; this repository was archived as a duplicate.

Prerequisites

  1. An installed OpenVINO package
  2. Node.js v12
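
You can check which Node.js version is active in your shell, for example:

node --version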

Build

Install dependencies:

npm install

Run the following command in a terminal to set up the environment for working with the Inference Engine:

source ${INTEL_OPENVINO_DIR}/bin/setupvars.sh
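
After sourcing the script you can verify that the environment is set, for example by checking that the OpenVINO directory variable is defined:

echo ${INTEL_OPENVINO_DIR}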

Note: To work from an IDE, add the library paths that setupvars.sh sets to the $LD_LIBRARY_PATH environment variable.
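
For example (a sketch only; the exact library directories depend on your OpenVINO version, so copy them from setupvars.sh):

export LD_LIBRARY_PATH=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/lib/intel64:$LD_LIBRARY_PATH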

Build the addon.

You can build the addon with node-gyp or cmake.

To build the addon with node-gyp you should:

  1. Set the ${INTEL_OPENVINO_DIR} environment variable to the path of your OpenVINO package (see the example after this list).
  2. Run the following command in the terminal:
    npm run build
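
For step 1, on a default Linux installation the OpenVINO package is usually located under /opt/intel/openvino, so the variable can be set, for example, like this (adjust the path to your installation):

export INTEL_OPENVINO_DIR=/opt/intel/openvino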

To build the addon with cmake you should:

  1. Set the NODE_PATH environment variable to the path of your Node.js 12 installation. Usually this is ~/.nvm/versions/node/v12* (see the example after this list).
  2. Run cmake to configure the build:
    mkdir cmake-build && cd cmake-build && cmake ../
  3. Build the addon:
    cmake --build . --target InferenceEngineAddon -- -j 6
  4. Now you can use the JS wrapper. To run a sample, execute:
    npm run sample:hello-query-device
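
Putting the cmake steps together, the full flow might look like this (the Node.js path and version below are only an example; use the path from your own nvm installation):

export NODE_PATH=~/.nvm/versions/node/v12.22.12
mkdir cmake-build && cd cmake-build && cmake ../
cmake --build . --target InferenceEngineAddon -- -j 6
cd .. && npm run sample:hello-query-device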