Run machine learning models, taking advantage of hardware acceleration. Currently, JavaScript libraries must parse a model themselves and call WebGL, WebGPU, or WASM for the individual compute operations. A model loader API gains performance by letting the browser optimize the model as a whole and potentially take advantage of new ML processors such as TPUs.
Motivation
Model Loader is a proposed web API to load a custom, pre-trained machine learning model in a standard format, compile it for the available hardware, and apply it to example data in JavaScript in order to perform inference, such as classification, regression, or ranking. The goal is to make it as easy as possible for web developers to use a custom, pre-built machine learning model in their web apps, across devices and browsers.

Performing inference locally can:
- Preserve privacy, by not shipping user data across the network
- Improve performance, by eliminating network latency and taking advantage of hardware acceleration, including specialized hardware not available through WebGL, WebGPU, or WASM
- Provide a fallback if network access is unavailable, possibly using a smaller, lower-quality model

Unlike the Shape Detection API, the model loader APIs are generic. Application-specific libraries and APIs could be built on top of them.
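To make the load/compile/infer flow concrete, here is a minimal sketch of how such an API might be used. The API is still being incubated, so the names below (`createModelLoader`, `load`, `compute`) are assumptions for illustration, not a shipped interface; a small stub stands in for the browser-provided loader so the example is self-contained.

```javascript
// Hypothetical stand-in for the proposed browser API: the real API is
// expected to hang off `navigator`, but its exact shape is not final.
// The stubbed "model" here simply doubles its input so the flow runs end to end.
const navigatorML = {
  async createModelLoader() {
    return {
      // In the real API, `load` would fetch a model in a standard format
      // and compile it for the available hardware.
      async load(_modelUrl) {
        return {
          // `compute` runs inference on example data and returns the outputs.
          async compute(inputs) {
            return { y: inputs.x.map((v) => v * 2) };
          },
        };
      },
    };
  },
};

async function main() {
  const loader = await navigatorML.createModelLoader();
  const model = await loader.load('https://example.com/model'); // placeholder URL
  const outputs = await model.compute({ x: [1, 2, 3] });
  console.log(outputs.y); // [2, 4, 6]
}

main();
```

The key point is the division of labor: the page supplies a model and example data, while the browser owns parsing, compilation, and hardware selection.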
Specification
Specification being incubated in a Community Group
Status in Chromium
In developer trial (Behind a flag)
(tracking bug)
Consensus & Standardization
- No signal
- No signal
- No signal
Owners
Intent to Prototype url
Intent to Prototype thread

Last updated on 2022-05-12