Tensorflow
The popular WebAssembly System Interface (WASI) provides a design pattern for sandboxed WebAssembly programs to securely access native host functions. The WasmEdge Runtime extends the WASI model to support access to native Tensorflow libraries from WebAssembly programs. This design provides the security, portability, and ease of use of WebAssembly together with native speed for Tensorflow.
You need to install WasmEdge and Rust.
Build
Check out the example source code.
Use Rust Cargo to build the WebAssembly target.
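For a typical setup, the build command looks like the following. The `wasm32-wasi` target name is inferred from the output path used in the run step below; adjust the command to match your own project layout.

```shell
# Build the Rust program into a WebAssembly module for the WASI target.
# The compiled module appears under target/wasm32-wasi/release/.
cargo build --target wasm32-wasi --release
```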
Run
The wasmedge-tensorflow-lite utility is the WasmEdge build that includes the Tensorflow and Tensorflow Lite extensions.
To make Tensorflow inference run much faster, you can AOT-compile the WebAssembly program down to native machine code, and then run that native code in the WasmEdge sandbox.
```bash
$ wasmedgec target/wasm32-wasi/release/classify.wasm classify.wasm
$ wasmedge-tensorflow-lite classify.wasm < grace_hopper.jpg
It is very likely a <a href='https://www.google.com/search?q=military uniform'>military uniform</a> in the picture
```
Code walkthrough
It is fairly straightforward to use the WasmEdge Tensorflow API. You can see the entire source code in .
First, it reads the trained TFLite model file (ImageNet) and its label file. The label file maps numeric output from the model to English names for the classified objects.
Then, the program runs the TFLite model with its required input tensor (i.e., the flat image in this case), and receives the model output. In this case, the model output is an array of numbers. Each number corresponds to the probability of an object name in the label text file.
Let’s find the object with the highest probability, and then look up the name in the labels file.
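The argmax-and-lookup step can be sketched as below. The label names and the probability values here are made-up stand-ins for the real ImageNet data, and `best_label` is an illustrative helper, not part of the WasmEdge API.

```rust
// Sketch of the "find the best class" step: each line of the label file
// names the class at that index, and the model emits one quantized
// probability score (0-255) per class.
fn best_label(labels: &str, probs: &[u8]) -> (String, u8) {
    let names: Vec<&str> = labels.lines().collect();
    // Find the index with the highest probability score.
    let (idx, &score) = probs
        .iter()
        .enumerate()
        .max_by_key(|&(_, &p)| p)
        .expect("model output must not be empty");
    (names[idx].to_string(), score)
}

fn main() {
    let labels = "sea snake\nmilitary uniform\nespresso"; // stand-in label file
    let probs = [12u8, 251, 3]; // stand-in quantized TFLite output
    let (name, score) = best_label(labels, &probs);
    println!("It is very likely a {} ({}/255)", name, score);
}
```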
Finally, it prints the result to STDOUT.
All the tutorials below use the WasmEdge Rust API for Tensorflow to create AI inference functions. Those Rust functions are then compiled to WebAssembly and deployed together with WasmEdge on the cloud.
Serverless functions
The following tutorials showcase how to deploy WebAssembly programs (written in Rust) on public cloud serverless platforms. The WasmEdge Runtime runs inside a Docker container on those platforms. Each serverless platform provides APIs to get data into and out of the WasmEdge runtime through STDIN and STDOUT.
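The STDIN/STDOUT contract described above can be sketched as follows. The `handle` function and its output format are hypothetical placeholders for the real inference logic, not the actual platform API.

```rust
use std::io::{self, Read, Write};

// Hypothetical handler: the serverless platform pipes the request body
// (e.g. image bytes) into STDIN and returns whatever the program writes
// to STDOUT as the response.
fn handle(input: &[u8]) -> String {
    // A real function would run model inference here; this sketch just
    // reports the payload size.
    format!("received {} bytes", input.len())
}

fn main() -> io::Result<()> {
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf)?; // request comes in on STDIN
    io::stdout().write_all(handle(&buf).as_bytes())?; // response goes out on STDOUT
    Ok(())
}
```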
- Netlify Functions
- Tencent Serverless Functions (in Chinese)
The following tutorials showcase how to deploy WebAssembly functions (written in Rust) on the Second State FaaS. Since the FaaS service is running on Node.js, you can follow the same tutorials for running those functions in your own Node.js server.
Service mesh
Data streaming framework
The following tutorials showcase how to deploy WebAssembly functions (written in Rust) as embedded handler functions in data streaming frameworks for AIoT.
- starts the WasmEdge Runtime to process image data as the data streams in from a camera in a smart factory.