This platform will allow developers to deploy AI models on mobile devices.
It enables on-device machine learning inference with low latency and a small binary size.
TensorFlow Lite also supports hardware acceleration with the Android Neural Networks API.
Burke notes that it is a crucial step toward enabling hardware-accelerated neural network processing across Android's diverse silicon ecosystem.
TensorFlow Mobile allows developers to run TensorFlow models built for desktop environments on mobile devices.
However, applications created using TensorFlow Lite will be lighter and faster than similar applications that use TensorFlow Mobile.
However, not all use cases are currently supported by TensorFlow Lite.
There are three models that are already trained and optimized for mobile devices.
Smart Reply: An on-device conversational model that provides one-touch replies to incoming chat messages.
First-party and third-party messaging apps use this feature on Android Wear.
We plan to prioritize future functional expansion based on the needs of our users.
Those interested in learning more about TensorFlow Lite can check the documentation here.
source: www.techworm.net