In order to succeed with an augmented reality device, Apple will have to come up with strong reasons for people to use it, and that comes down to useful software, just as apps like Maps, Mail, YouTube, and the mobile Safari browser helped spur adoption of the original iPhone. CEO Tim Cook has said AR is the "next big thing." Getting developers on board to build augmented reality software now increases the chance of one or more "killer apps" being available at launch.

Apple did not spend much time on augmented reality during its WWDC keynote on Monday, but the updates it announced in the conference's more technical sessions show that AR remains an important long-term initiative for the company.

"From a high level, this year, and maybe even next year's WWDC event, will amount to a calm before an Apple innovation storm," Loup Ventures founder and longtime Apple analyst Gene Munster wrote in an email this week. "Out of view today is Apple's intense ongoing development related to new product categories around augmented reality wearables and transportation."

What Apple announced

Apple has introduced application programming interfaces, or software tools, that will enable apps to create 3D models. 3D models are essential for AR because they're what the software places in the real world: if an app doesn't have an accurately detailed file for a shoe, then it can't use Apple's machine vision software to place it on a table.

Previously, precise and pricey camera setups were required for detailed object scanning. Apple's new Object Capture technology instead allows a camera, like the iPhone's, to take several photographs of an object and stitch them together into a 3D model that can be used inside software in minutes. Eventually, third-party developers like Unity, a top AR engine maker, will include it in their software. For now, it will likely be used heavily in e-commerce.

Object Capture is just one part of a significant update to RealityKit, Apple's set of software tools for making AR experiences. Aside from Object Capture, RealityKit 2 brings a lot of little improvements to make app makers' lives easier, including improved rendering options, a way to organize images and other assets, and new tools for making player-controlled characters inside augmented reality scenes.

ARKit 5

ARKit is another set of software tools for making AR experiences, but it is more closely focused on figuring out where to place digital objects in the real world. This is Apple's fifth major version of the software since it first came out in 2017. This year it includes something called "location anchors," which means software makers can program AR experiences pegged to map locations in London, New York, Los Angeles, San Francisco, and a few other U.S. cities. In a video session for developers, Apple said it is using the tool to create AR direction overlays in Apple Maps, a potentially useful scenario for a head-mounted AR device.

AI for understanding hands, people, and faces

While Apple's machine learning and artificial intelligence tools aren't directly tied to augmented reality, they represent abilities that will be important for a computer interface that works in 3D spaces. Apple's Vision framework can be called by apps to detect people, faces, and poses through the iPhone's camera. Combined with Apple's other tools, these AI capabilities can apply effects similar to Snap's filters. Apple's computer vision software can now identify objects inside images, including text on signs, and can search for things inside photos, like a dog or a friend. One session at this year's WWDC even goes into how Vision can identify how a hand is posed or moving, which lays the groundwork for advanced hand gestures, a big part of the interface in current AR headsets like Microsoft HoloLens.
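The Object Capture workflow described earlier in the article (photos in, 3D model out) is exposed through RealityKit's PhotogrammetrySession API on macOS. The sketch below shows the general shape of a request; the input folder, output path, and detail level are placeholder assumptions, not values from the article.

```swift
import Foundation
import RealityKit

// Minimal Object Capture sketch (macOS 12+): point a PhotogrammetrySession
// at a folder of photos and request a USDZ model. Paths are hypothetical.
let photosFolder = URL(fileURLWithPath: "/tmp/shoe-photos")   // placeholder input
let modelURL = URL(fileURLWithPath: "/tmp/shoe-model.usdz")   // placeholder output

var configuration = PhotogrammetrySession.Configuration()
configuration.featureSensitivity = .normal  // raise to .high for low-texture objects

let session = try PhotogrammetrySession(input: photosFolder,
                                        configuration: configuration)

// Ask for a reduced-detail model, the kind of asset an e-commerce
// AR preview (placing a shoe on a table) would use.
try session.process(requests: [.modelFile(url: modelURL, detail: .reduced)])

// Results arrive asynchronously as the photos are stitched into a mesh.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url.path)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break
        }
    }
}
```

This is the "minutes, not specialized rigs" pipeline the article describes: the heavy lifting (feature matching, meshing, texturing) happens inside the session, and the app only chooses input photos and an output detail level.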
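ARKit 5's "location anchors," mentioned earlier, let an app pin AR content to a real-world map coordinate in supported cities. A rough sketch of that flow, assuming an existing ARSession and a hypothetical San Francisco coordinate:

```swift
import ARKit
import CoreLocation

// Sketch of ARKit 5 location anchors (iOS 14+): pin AR content to a
// map coordinate. The session and coordinate are placeholder assumptions.
func placeGeoAnchor(in session: ARSession) {
    // Geo tracking is only available in supported cities
    // (London, New York, Los Angeles, San Francisco, and a few others).
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geo tracking unavailable here: \(String(describing: error))")
            return
        }
        session.run(ARGeoTrackingConfiguration())

        // Hypothetical coordinate near the Ferry Building, San Francisco.
        let coordinate = CLLocationCoordinate2D(latitude: 37.7955,
                                                longitude: -122.3937)
        let anchor = ARGeoAnchor(coordinate: coordinate)
        session.add(anchor: anchor)
        // A renderer such as RealityKit can now draw content at the anchor,
        // e.g. the AR direction overlays Apple described for Apple Maps.
    }
}
```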
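The hand-pose capability described earlier is exposed through the Vision framework's VNDetectHumanHandPoseRequest. A minimal sketch, assuming a camera frame is already available as a CGImage (the `frame` parameter and confidence threshold are illustrative choices, not values from the article):

```swift
import Vision

// Sketch of hand-pose detection with Vision (iOS 14+ / macOS 11+):
// locate fingertip joints in a camera frame, the raw material for
// the gesture interfaces the article mentions.
func detectThumbTips(in frame: CGImage) throws {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2

    let handler = VNImageRequestHandler(cgImage: frame, orientation: .up, options: [:])
    try handler.perform([request])

    for hand in request.results ?? [] {
        // Joint positions come back in normalized image coordinates (0...1).
        let thumbTip = try hand.recognizedPoint(.thumbTip)
        if thumbTip.confidence > 0.3 {  // illustrative threshold
            print("Thumb tip at \(thumbTip.location)")
        }
    }
}
```

Tracking joints like `.thumbTip` and `.indexTip` frame-to-frame is what makes pinch-style gestures, the kind used on headsets like Microsoft HoloLens, recognizable.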