The Open Neural Network Exchange (ONNX) is an open standard for representing machine learning models. With ONNX, AI developers can more easily move models between state-of-the-art tools and choose the combination that works best for them. ONNX is developed and supported by a community of partners including AWS, Facebook Open Source, Microsoft, AMD, IBM, and Intel AI, among others.

ONNX.js

On November 29th, 2018, Microsoft entered the arena of AI on browsers with the announcement of ONNX.js, its open-source library to run ONNX models in the browser. This is yet another option for web developers to run ML models in the browser and build amazing user experiences on the web.
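The typical flow with ONNX.js is to create a session, load a model, and run it on a tensor. Here is a minimal sketch, assuming the `onnx` global exposed by the ONNX.js browser script; the model path and input shape are illustrative, not from the library:

```javascript
// Runs one inference and returns the first output's raw data.
// ONNX.js's session.run() resolves to a Map of output name -> Tensor.
async function runInference(session, inputTensor) {
  const outputMap = await session.run([inputTensor]);
  return outputMap.values().next().value.data; // typed array of scores
}

// In the browser this would be wired up roughly like so (not executed here);
// './my-model.onnx' and the 1x3x224x224 shape are placeholder assumptions.
async function main() {
  const session = new onnx.InferenceSession();          // default backend
  await session.loadModel('./my-model.onnx');           // fetch + parse model
  const input = new onnx.Tensor(
    new Float32Array(1 * 3 * 224 * 224), 'float32', [1, 3, 224, 224]);
  const scores = await runInference(session, input);
  console.log(scores);
}
```

Postprocessing (e.g., picking the top class) is left to the application, since the output is just a flat typed array.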

Why ONNX.js? — Lightning Performance! ⚡️

With Keras.js and TensorFlow.js already in the field, Microsoft had to come up with a solution that delivers better results along with a good developer experience. IMHO, Microsoft succeeded in the performance arena to a large extent. Here are a few things that make ONNX.js stand out.

ONNX.js can run on both CPU and GPU.

For running on the CPU, WebAssembly is adopted to execute models at near-native speed. Furthermore, ONNX.js utilizes Web Workers to provide a “multi-threaded” environment that parallelizes data processing. This is a really great feature, as neither Keras.js nor TensorFlow.js supports WebAssembly on any browser.

For running on the GPU, ONNX.js adopts WebGL, a popular standard for accessing GPU capabilities.

Here are the results of benchmarking done by Microsoft. Read more about it here.

Despite having such outstanding performance attributes, ONNX.js lacks some basic utility functions, such as converting an image to a tensor, which are available in TensorFlow.js. Since it is an open-source library, we can expect the community to add such utilities for developers soon.
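Until then, writing the conversion by hand is straightforward. The sketch below (the function name is my own) turns the RGBA pixel buffer of a canvas `ImageData` into a normalized, channel-first `Float32Array` that can then be wrapped in an `onnx.Tensor`:

```javascript
// Converts RGBA pixel data (a Uint8ClampedArray from canvas ImageData) into a
// Float32Array laid out as [1, 3, H, W] (NCHW), with values scaled to [0, 1].
// The alpha channel is dropped.
function imageDataToTensorData(pixels, width, height) {
  const plane = width * height;
  const out = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    out[i] = pixels[i * 4] / 255;                  // R plane
    out[plane + i] = pixels[i * 4 + 1] / 255;      // G plane
    out[2 * plane + i] = pixels[i * 4 + 2] / 255;  // B plane
  }
  return out;
}

// In the browser:
//   const data = imageDataToTensorData(imgData.data, w, h);
//   const tensor = new onnx.Tensor(data, 'float32', [1, 3, h, w]);
```

Note that some models expect per-channel mean/std normalization instead of a plain [0, 1] scaling; adjust the division accordingly for your model.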