Google Extends AR Search Lead Over Apple with Landmark Recognition for Google Lens


This week, Apple unveiled its own version of Google Lens in the form of Live Text.

In response, Google just hit back with a new feature for its visual search tool called Places, a new search category that can recognize landmarks and return information on them within the camera view.

Places for Google Lens, which is available now worldwide, uses image recognition and Google Earth's 3D map assets to identify locations. Once you activate the Places filter, Google Lens identifies the landmark and overlays a hotspot on the subject in the camera view. Tapping the hotspot pulls up relevant information about the landmark from Google Search.

Images via Google

“Google Lens is now used over three billion times per month by people around the world,” wrote Lou Wang, group product manager for Google Lens and AR, in a statement. “We hope Lens makes rediscovering and learning about your city even more enjoyable.”

Although Apple's Live Text can copy text from the camera view and photos, Google Lens can do that as well as identify dog breeds and plants, translate foreign languages, solve math problems, retrieve photos of menu items, and serve as a shopping assistant.

The Places launch is a convenient reminder from Google to the tech world that it has a considerable lead on Apple not only in visual search but also in visual positioning services. Among Apple's other new augmented reality features is an AR navigation mode for Apple Maps, which runs on the new Location Anchors capability in ARKit 5. Both arrive this fall in iOS 15, but will be limited to a handful of US cities (and London, more on that later). You'll also need at least an iPhone XS, XS Max, or XR to use these features.

Meanwhile, Live View, the AR navigation mode for Google Maps, works on iPhones or Android devices that support ARKit or ARCore and in any location with sufficient Street View coverage. Moreover, Google is expanding Live View’s capabilities with an AR x-ray viewer and indoor navigation.

Just to rub it in a little more, Google used examples of London landmarks to demonstrate Places. London also happens to be one of the handful of locations that will support Apple Maps AR navigation and Location Anchors at launch. The timing is particularly pointed, too: Google I/O wrapped just two weeks ago without any Google Lens news at all.

But being first isn't everything, and leads can be lost. Google had the Measure app for measuring items in AR before Apple debuted its own app with the same name, but now Google is killing off its app, while Apple's version has improved with the advent of LiDAR on iPhone Pro and iPad Pro models. All of this tells us the AR map wars are in full swing, and the ultimate winner will be the user.

Cover image via Google

