onnxjs-node is a Node.js binding of ONNXRuntime that works seamlessly with ONNX.js.
Install the latest stable version:
```sh
npm install onnxjs-node
```
NOTE: binary files will be pulled from GitHub during the `npm install` process.
| OS      | Arch | CPU/GPU | NAPI version | ONNXRuntime version |
|---------|------|---------|--------------|---------------------|
| Windows | x64  | CPU     | v3           | v0.4.0              |
| Linux   | x64  | CPU     | v3           | v0.4.0              |
| macOS   | x64  | CPU     | v3           | v0.4.0              |
| Windows | x64  | GPU     | v3           | v0.4.0              |
| Linux   | x64  | GPU     | v3           | v0.4.0              |
There are 2 options to import `onnxjs-node`:

- Option 1 - replace `onnxjs` by `onnxjs-node`:

  ```js
  //const onnx = require('onnxjs');
  const onnx = require('onnxjs-node');

  // use 'onnx'
  // ...
  ```

- Option 2 - add a single line to require `onnxjs-node`:

  ```js
  const onnx = require('onnxjs');
  require('onnxjs-node'); // this line can be put on the top as well

  // use 'onnx'
  // ...
  ```
After `onnxjs-node` is imported, the default inference session class of ONNX.js is overwritten. Any existing ONNX.js code will continue to work, and models will run on the ONNXRuntime backend.
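For example, a minimal sketch of existing ONNX.js code that now runs on the ONNXRuntime backend after the import (the model path `./model.onnx`, the input shape, and the output handling below are hypothetical, not part of this package):

```js
// Option 2 style: keep the ONNX.js import and add onnxjs-node.
const onnx = require('onnxjs');
require('onnxjs-node');

async function main() {
  // Unchanged ONNX.js code; the session is now backed by ONNXRuntime.
  const session = new onnx.InferenceSession();
  await session.loadModel('./model.onnx'); // hypothetical model path

  // Hypothetical input: a single float32 tensor of shape [1, 4].
  const input = new onnx.Tensor(new Float32Array([1, 2, 3, 4]), 'float32', [1, 4]);
  const outputMap = await session.run([input]);

  // Print the first output tensor's shape and data.
  const output = outputMap.values().next().value;
  console.log(output.dims, output.data);
}

main();
```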
After `onnxjs-node` is imported, the ONNXRuntime backend is used by default. However, it is possible to fall back to another backend by specifying the session option `backendHint`:

```js
session = new onnx.InferenceSession({backendHint: 'wasm'}); // use WebAssembly backend
```
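For instance, a sketch of picking a backend per session (assuming ONNX.js's 'wasm' and 'cpu' backends remain usable in Node.js alongside the ONNXRuntime default):

```js
const onnx = require('onnxjs');
require('onnxjs-node');

// Default: ONNXRuntime backend.
const rtSession = new onnx.InferenceSession();

// Fall back to ONNX.js's WebAssembly backend.
const wasmSession = new onnx.InferenceSession({ backendHint: 'wasm' });

// Fall back to ONNX.js's pure-JavaScript CPU backend.
const cpuSession = new onnx.InferenceSession({ backendHint: 'cpu' });
```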
- ONNX.js Home
- ONNXRuntime
  - Nuget package: Microsoft.ML.OnnxRuntime
  - Nuget package: Microsoft.ML.OnnxRuntime.Gpu
Copyright (c) fs-eire. All rights reserved.
Licensed under the MIT License.