LightBlog

Tuesday, May 22, 2018

On-device machine learning coming to Chrome OS

Machine learning is one of those buzzwords that keeps gaining traction inside Google. It has seeped into every branch of the business, from the Google Assistant to Gmail. New code indicates that Google is now working to support machine learning on Chrome OS.

The implementation looks similar in concept to TensorFlow Lite on Android, where clients only perform inference with existing machine learning models rather than training or developing them. The current documentation is light, so we don't know what the full impact on the user experience (if any) will be once development is finished. Here's what the readme for the Chrome OS machine learning service has to say:
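To make the inference-only distinction concrete, here is a minimal sketch of the pattern the readme describes: a client loads pre-trained weights shipped with the system and evaluates them, but never trains or updates the model. Everything here is illustrative; none of these names are Chrome OS APIs, and the real service wraps the TensorFlow runtime rather than a hand-rolled model.

```python
def predict(weights, bias, features):
    """Evaluate a fixed linear model: score = w . x + b.

    Inference only: the weights are read, never updated.
    """
    return sum(w * x for w, x in zip(weights, features)) + bias

# In the Chrome OS design, model parameters would come from a model
# file installed on rootfs; the client treats them as read-only.
PRETRAINED_WEIGHTS = [0.4, -1.2, 0.7]
PRETRAINED_BIAS = 0.1

score = predict(PRETRAINED_WEIGHTS, PRETRAINED_BIAS, [1.0, 0.5, 2.0])
```

The key point is what is absent: there is no training loop and no gradient computation on the device, only evaluation of a model built elsewhere.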

"The machine learing [sic] service provides a common runtime for evaluating machine learning models on device. The service wraps the TensorFlow runtime which has been optimized to support the set of built-in machine learning models with [sic] are installed on rootfs."

While the functionality is still in development, a couple of commits show that the Chrome developers have already scoped out how it will be used. The machine learning service will allow applications to make use of machine learning models that are pre-loaded onto the device. So far two models have been identified, TAB_DISCARDER and POWER_MANAGER, suggesting that in practice machine learning will not introduce funky new features, at least not yet, but will enable smarter system resource management.
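As a purely hypothetical illustration of what a TAB_DISCARDER-style model might do, consider scoring open tabs by how safe they are to discard under memory pressure. The features, weights, and structure below are invented for the sake of the example; the real model's inputs and form are not documented in the commits seen so far.

```python
def discard_score(tab):
    """Higher score = safer to discard (hypothetical heuristic model)."""
    score = tab["seconds_since_active"] / 60.0  # staler tabs score higher
    if tab["is_playing_audio"]:
        score -= 100.0  # avoid discarding tabs playing media
    if tab["is_pinned"]:
        score -= 50.0   # pinned tabs are likely still wanted
    return score

tabs = [
    {"id": 1, "seconds_since_active": 3600, "is_playing_audio": False, "is_pinned": False},
    {"id": 2, "seconds_since_active": 30,   "is_playing_audio": True,  "is_pinned": False},
    {"id": 3, "seconds_since_active": 7200, "is_playing_audio": False, "is_pinned": True},
]

# Rank discard candidates, safest to discard first.
candidates = sorted(tabs, key=discard_score, reverse=True)
```

A learned model would replace the hand-tuned weights with parameters trained offline, which is exactly the division of labor the service's inference-only design implies.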

While the models identified so far don't seem noteworthy, it will be interesting to see how they will impact the already-stellar battery life on the Chrome OS platform. These models could also pave the way for flashier use-cases down the line.

On-device inference is available on Android already with TensorFlow Lite, so the addition of this service raises a few unanswered questions: Will the Android container on Chrome OS make use of the new machine learning runtime instead? Will an API be exposed for third-party developers to make use of it? Will the service be limited to system resource management?

It's early days yet, but model inference coming to Chrome OS is another step towards the feature breadth and maturity the OS isn't commonly known for. More on this feature as it develops.



from xda-developers https://ift.tt/2s3TL2i
via IFTTT
