Google Lens already has plenty of fancy AR tricks up its sleeve, but it will soon get what might be its most useful feature yet, called Scene Exploration.
At Google I/O 2022, Google previewed the new feature, which it says acts like a “Ctrl+F” shortcut to find things in the world in front of you. Hold your phone’s camera up to a scene and Google Lens will soon be able to overlay helpful product information to help you make quick choices.
Google’s demo of the feature showed shelves of chocolate bars, which Lens overlaid with information including not only the type of chocolate (e.g., dark) but also customer ratings.
In theory, this Google Lens feature could be super powerful and save you a lot of time, especially when shopping. And Google says it runs on smart, real-time technology, including its Knowledge Graph, which brings together multiple streams of information to surface helpful local insights.
The downside? Scene Exploration doesn’t yet have a release date, with Google only saying it will arrive “in the future”, with no specific timeline. That means it could join earlier Google Lens promises that took a few years to fully materialize. But it doesn’t look like a huge leap beyond Lens’ existing shopping tools, so we’re hoping to see the first signs of it this year.
Analysis: One of the most useful AR tricks yet
There’s no doubt that Scene Exploration has huge potential for shopping, with old-school store browsing likely to increasingly take place behind a phone screen – or maybe, eventually, smart glasses.
But Google says it also has more benevolent applications. The feature could apparently help conservationists identify endangered plant species, or give volunteers a convenient way to sort through donations.
Either way, it certainly looks like a powerful and intuitive development of another Lens feature that Google announced at I/O 2022, called Multi-Search. This allows you to combine image search with a keyword to help you find obscure products or objects, without needing to know their name.
Multi-Search arrived in Google Search last month (you can find it in the Search app on Android or iOS), and soon you’ll be able to use a more specific version called “Near Me”. Google’s example involved taking a photo of a particular dish, then searching for local restaurants that serve it.
You could argue that these kinds of features make us helplessly dependent on the crutch of Google’s powerful Lens and Search technology. But Scene Exploration and Multi-Search look like some of the most useful examples of AR we’ve seen, and their versatility should prove a boon for all kinds of users.
Now we just have to wait and see how long they take to fully materialize on Google Lens.