Modern smartphones have become "smart" in the true sense, with machine learning and AI built into many apps. These capabilities power features such as text recognition, real-time translation, barcode scanning, precise tracking, and object detection. One prominent application is augmented reality apps that track facial features to overlay masks or beautification effects. Much of this is made possible by ML Kit, which Google introduced in May 2018 and which is now used in more than 25,000 apps.
Until now, many of these features have relied on cloud-based processing, because ML Kit was tied to the Firebase mobile and web development platform. That dependence raises questions about security and data privacy, and network latency can also be a concern.
To address this, Google has introduced a new ML Kit package that exposes the on-device APIs as a standalone SDK, with no Firebase project required. The existing cloud-based model will still be available, but the transition from cloud-based to on-device processing is expected to gradually become the norm.
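For developers, moving to the standalone SDK is largely a matter of swapping Gradle dependencies. The sketch below uses the on-device barcode scanning API as an example; the artifact names reflect the SDK at launch, and the version numbers shown are illustrative and may have changed:

```gradle
dependencies {
    // Old: Firebase ML Kit Vision APIs, which required a Firebase project
    // implementation 'com.google.firebase:firebase-ml-vision:24.0.3'

    // New: standalone on-device ML Kit barcode scanning,
    // no Firebase project needed
    implementation 'com.google.mlkit:barcode-scanning:16.0.0'
}
```

Because the model runs on the device, the app can scan barcodes without a network round trip, which addresses both the latency and the data-privacy concerns noted above.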