This repository was archived by the owner on Nov 16, 2023. It is now read-only.

Benchmarks

This subproject benchmarks ONNX.js performance against other leading in-browser AI inference frameworks.

Frameworks

  • TensorFlow.js
  • Keras.js
  • WebDNN
  • ONNX.js

Backends

(Not all backends are supported by every framework.)

  • WebGL
  • WebAssembly
  • CPU

Browsers

(Not all framework/backend combinations are supported by all browsers.)

  • Chrome (WebGL 2)
  • Edge (WebGL 1)

Instructions

Before running the benchmark tests, download all the sub-folders (containing the model files and corresponding test data) under https://github.com/Microsoft/onnxjs-demo/tree/data/data/benchmark and place them in ./benchmark/data.
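One way to fetch the data is sketched below. It assumes git is available and that the data lives on the `data` branch of the onnxjs-demo repository under `data/benchmark`, as the URL above suggests:

```shell
# Shallow-clone only the data branch of onnxjs-demo (assumed branch name: "data").
git clone --depth 1 --branch data https://github.com/Microsoft/onnxjs-demo.git onnxjs-demo-data

# Copy the benchmark model/test-data folders into ./benchmark/data.
mkdir -p benchmark/data
cp -r onnxjs-demo-data/data/benchmark/. benchmark/data/

# Remove the temporary clone.
rm -rf onnxjs-demo-data
```

A manual download of the same folders through the GitHub web UI works equally well.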

  1. Ensure that the ONNX.js project (the parent) is already installed and built:
npm ci
npm run build
  2. Change to the benchmark subfolder, then install and build there:
cd benchmark
npm ci
npm run build
  3. Run tests (Chrome):
npm run test
  4. Run tests (Edge):

Note that the Edge tests are likely to crash the browser. A recommended workaround is to comment out all frameworks and backends except one, then repeat the run for each of the others. See the definition of BenchmarkImageNetData in src/index.js.

npm run test-edge