TensorFlow Lite in Faasm

TensorFlow Lite is well suited to running inference tasks in a resource-constrained serverless context.

It is written in C/C++, hence we can build TensorFlow Lite to WebAssembly using the standard Faasm toolchain.

Faasm currently only supports the C/C++ API, but building the Python API using Faasm's Python support should be possible and is on the to-do list.

Compiling TensorFlow Lite to WebAssembly

To do this, make sure you've checked out this project and updated all the git submodules, then, from the Faasm CLI:

  inv libs.tflite

The build output ends up at third-party/tensorflow/tensorflow/lite/tools/make/gen.

Building a C/C++ function with TF Lite

A function implementing image classification is included in the examples. This is based on the example in the TF Lite repo. The data and model for the example are stored in this repo.

To run the example function, you need to run a local Faasm cluster (as described in the README), then:

  # Upload files and state
  inv data.tf-upload data.tf-state

  # Upload the function (takes a few seconds)
  inv upload tf image

  # Invoke
  inv invoke tf image

Eigen Fork

To support WASM SIMD instructions I've hacked about with Eigen on a fork. It seems to work but isn't well tested.

TF Lite will be compiled against the version of Eigen downloaded as part of its third-party deps, so if you need to change it and rebuild, you'll need to run:

  cd third-party/tensorflow/tensorflow/lite/tools/make
  rm -r downloads/eigen/
  ./download_dependencies.sh