This week, Apple unveiled its own version of Google Lens in the form of Live Text. Google has now hit back with Places, a new search category for its visual search tool that recognizes landmarks and returns information about them within the camera view, a capability Apple also touted for Live Text during its WWDC keynote.

Places for Google Lens, which is available now worldwide, uses image recognition and Google Earth's 3D map assets to identify locations.