Mobile augmented reality on iOS usually arrives through ARKit-powered apps, but Apple is borrowing a few pages from Google's playbook and bringing AR directly into iOS 15.
During the keynote presentation at WWDC 2021, Craig Federighi, Apple's Senior Vice President of Software Engineering, revealed Live Text, a brand-new camera mode coming to iOS 15 that mirrors what Google Lens offers on Android smartphones and in Google Photos.
With Live Text, iPhone users can point their camera at text in their photos and interact with it. Besides copying text, users can search for selected text or call a phone number recognized in the image. The same functionality will be available through the Photos app on iOS and macOS Monterey. Live Text also replicates Google Lens's ability to identify pet breeds, plants, products, art, and landmarks.
In addition, iOS 15 brings Apple's own version of the AR walking navigation mode we've seen in Google Maps Live View.
Like Live View, Apple Maps' AR navigation overlays directions on the camera view while iPhone users walk from point A to point B. The AR mode will launch later this year in a limited set of cities: London, Los Angeles, New York, Philadelphia, San Diego, the San Francisco Bay Area, and Washington, DC.
While Apple is catching up with Google on AR walking navigation, Google is already on its way to adding further useful features to Live View.
Apple co-founder Steve Jobs famously quoted Picasso: "Good artists copy; great artists steal." Google Lens and Live View are easily two of the more useful AR features Google has introduced in recent years, so it's no surprise that Apple wants the same functionality in iOS. Moreover, by building these features into iOS itself, Apple makes it less likely that iPhone users will turn to third-party apps for them.
Cover image via Apple/YouTube