TensorFlow Lite is an open-source deep learning framework for on-device inference on Android, iOS, and embedded Linux devices.
The framework ships with a collection of example TensorFlow Lite applications.
TensorFlow Lite models can run on the CPU, on the GPU, or via the Android Neural Networks API (NNAPI).
TensorFlow Lite is a set of tools built around two main components: the TensorFlow Lite interpreter and the TensorFlow Lite converter.
The TensorFlow Lite interpreter runs optimized models on many different hardware types, including mobile phones, embedded Linux devices, and microcontrollers.
The TensorFlow Lite converter converts TensorFlow models into an efficient form for use by the interpreter and can introduce optimizations to improve binary size and performance.
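As a sketch of the optimization hook mentioned above (assuming TensorFlow is installed; the model here is purely illustrative), the converter's default optimizations enable dynamic-range quantization, which stores weights as 8-bit integers and shrinks the resulting binary:

```python
import tensorflow as tf

# An illustrative Keras model, large enough for quantization to matter.
model = tf.keras.Sequential([tf.keras.layers.Dense(128, input_shape=(64,))])

# Plain float32 conversion.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_float = converter.convert()

# Same model converted with the default optimizations enabled:
# weights are quantized to 8-bit integers, reducing binary size.
converter_quant = tf.lite.TFLiteConverter.from_keras_model(model)
converter_quant.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_quant = converter_quant.convert()

print(len(tflite_float), len(tflite_quant))
```

The quantized flat buffer is noticeably smaller than the float one, at a small cost in numerical precision.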
All examples can be found in the GitHub repository.
This snippet from the documentation shows how simple development with TensorFlow Lite and Python is:
import tensorflow as tf

# saved_model_dir is the path to an existing TensorFlow SavedModel.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
# tflite_model is a flat buffer that can be written to a .tflite file.
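Once a model has been converted, it can be executed with the TensorFlow Lite interpreter. A minimal end-to-end sketch (assuming TensorFlow is installed; the tiny Keras model here is only a stand-in for a real saved model):

```python
import numpy as np
import tensorflow as tf

# A trivial single-layer model used only for illustration.
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(4,))])

# Convert the in-memory Keras model to the TFLite flat buffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Load the converted model into the interpreter and allocate buffers.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one sample through the model and read back the result.
sample = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)  # (1, 2)
```

The same `set_tensor`/`invoke`/`get_tensor` loop applies on-device, with the `.tflite` file loaded via `model_path` instead of `model_content`.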