TensorFlow

The interpreter supports the WasmEdge TensorFlow Lite inference extension so that your JavaScript can run an ImageNet model for image classification. This article will show you how to use the TensorFlow Rust SDK for WasmEdge from your JavaScript program.

Here is an example in JavaScript. You can find the full code in example_js/tensorflow_lite_demo/.

  import {Image} from 'image';
  import * as std from 'std';
  import {TensorflowLiteSession} from 'tensorflow_lite';

  // Load the image and convert it to the 192x192 RGB pixel buffer the model expects
  let img = new Image('food.jpg');
  let img_rgb = img.to_rgb().resize(192, 192);
  let rgb_pix = img_rgb.pixels();

  // Create a TensorFlow Lite session from the model file and run inference
  let session = new TensorflowLiteSession(
    'lite-model_aiy_vision_classifier_food_V1_1.tflite');
  session.add_input('input', rgb_pix);
  session.run();

  // Read the output tensor and find the class index with the highest score
  let output = session.get_output('MobilenetV1/Predictions/Softmax');
  let output_view = new Uint8Array(output);
  let max = 0;
  let max_idx = 0;
  for (var i in output_view) {
    let v = output_view[i];
    if (v > max) {
      max = v;
      max_idx = i;
    }
  }

  // Look up the label text for the winning class index
  let label_file = std.open('aiy_food_V1_labelmap.txt', 'r');
  let label = '';
  for (var i = 0; i <= max_idx; i++) {
    label = label_file.getline();
  }
  label_file.close();

  print('label:');
  print(label);
  print('confidence:');
  print(max / 255);

To run the JavaScript in the WasmEdge runtime, you can do the following on the CLI to rebuild the QuickJS engine with TensorFlow support and then run the JavaScript program with the TensorFlow API.

  $ cargo build --target wasm32-wasi --release --features=tensorflow
  ... ...
  $ cd example_js/tensorflow_lite_demo
  $ wasmedge-tensorflow-lite --dir .:. ../../target/wasm32-wasi/release/wasmedge_quickjs.wasm main.js
  label:
  Hot dog
  confidence:
  0.8941176470588236

Note: the --dir .:. option on the command line gives WasmEdge permission to read the local directory in the file system, which contains the main.js program as well as the image, model, and label files it opens.

Note

  • The --features=tensorflow compiler flag builds a version of the QuickJS engine with the WasmEdge TensorFlow extensions.
  • The wasmedge-tensorflow-lite program is part of the WasmEdge package. It is the WasmEdge runtime with the TensorFlow extension built in (see the quick check after this list).
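
If you are not sure whether these TensorFlow-enabled tools are available on your machine, a quick sanity check is to look them up on your PATH. This is a minimal sketch; the install location depends on how you installed WasmEdge.

  # Both tools should resolve to the WasmEdge installation directory
  $ which wasmedge-tensorflow-lite wasmedgec-tensorflow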

You should now see the name of the food item recognized by the TensorFlow Lite ImageNet model.

Make it faster

The above TensorFlow inference example takes 1–2 seconds to run. That is acceptable in web application scenarios but could be improved. Recall that WasmEdge is the fastest WebAssembly runtime today due to its ahead-of-time (AOT) compiler optimization. WasmEdge provides a wasmedgec utility to compile the wasm file and add a native machine code section to it for much faster performance.

The following example uses the extended versions of wasmedge and wasmedgec that support the WasmEdge TensorFlow extension.

  $ cd example_js/tensorflow_lite_demo
  $ wasmedgec-tensorflow ../../target/wasm32-wasi/release/wasmedge_quickjs.wasm wasmedge_quickjs.wasm
  $ wasmedge-tensorflow-lite --dir .:. wasmedge_quickjs.wasm main.js
  label:
  Hot dog
  confidence:
  0.8941176470588236

You can see that the image classification task now completes within 0.1 seconds. That is at least a 10x improvement!
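
To measure the improvement on your own machine, you can time both invocations with the standard time shell command. This is only a sketch of how one might compare the interpreted and AOT-compiled runs; the absolute numbers will vary with your hardware.

  $ cd example_js/tensorflow_lite_demo
  # Interpreted run against the original wasm file
  $ time wasmedge-tensorflow-lite --dir .:. ../../target/wasm32-wasi/release/wasmedge_quickjs.wasm main.js
  # AOT-compiled run against the wasm file produced by wasmedgec-tensorflow
  $ time wasmedge-tensorflow-lite --dir .:. wasmedge_quickjs.wasm main.js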