At WWDC yesterday, Apple revealed several advanced tools that will enable developers to create compelling AR experiences. First, Apple announced the new RealityKit framework, which features photorealistic rendering, environment mapping, and camera effects such as noise and motion blur, to make virtual content nearly indistinguishable from reality. RealityKit also includes animation, physics, and spatial audio support.
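To give a flavor of the API, here is a minimal RealityKit sketch: it creates an `ARView`, builds a simple procedurally generated box with a physically based material, and anchors it to a horizontal plane. The view-controller setup and box dimensions are illustrative, not from Apple's sample code.

```swift
import RealityKit
import UIKit

// Minimal RealityKit sketch (illustrative): show an ARView and anchor
// a small metallic box to the first horizontal plane ARKit detects.
class ARViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // A 10 cm box entity with a simple physically based material.
        let box = ModelEntity(
            mesh: .generateBox(size: 0.1),
            materials: [SimpleMaterial(color: .gray, isMetallic: true)]
        )

        // Anchor the box to a horizontal plane in the real world.
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(box)
        arView.scene.addAnchor(anchor)
    }
}
```

Note how little rendering code is involved: lighting, shadows, and environment reflections come from RealityKit's renderer rather than from the developer.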
Second, the ARKit 3 update brings Motion Capture, People Occlusion, and support for multiple faces. With Motion Capture, developers can integrate people's movement into their apps. People Occlusion allows developers to place AR content naturally in front of or behind people. I'm sure we will see several green screen-like applications in the App Store in the coming months. ARKit 3 also enables the front camera to track up to three faces, as well as simultaneous front and back camera use.
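The three features above map onto distinct session configurations. The following sketch shows how each might be enabled; session and view setup are elided, and on-device availability checks are required because these features need recent hardware.

```swift
import ARKit

// Illustrative sketch of enabling the three ARKit 3 features above.
// An ARSession (here assumed as `session`) would run one configuration at a time.

// People Occlusion: ask ARKit to segment people (with depth estimates)
// so rendered content can appear to pass behind them.
let worldConfig = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    worldConfig.frameSemantics.insert(.personSegmentationWithDepth)
}

// Motion Capture: track a person's body as a skeleton of joints.
if ARBodyTrackingConfiguration.isSupported {
    let bodyConfig = ARBodyTrackingConfiguration()
    // session.run(bodyConfig)
}

// Multi-face tracking with the front camera (up to three faces on
// supported devices).
let faceConfig = ARFaceTrackingConfiguration()
faceConfig.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
// session.run(faceConfig)
```

Only one configuration runs at a time, so an app that wants both occlusion and body tracking has to pick the configuration that matches its primary use case.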
Finally, Apple announced the new Reality Composer app for iOS, iPadOS, and Mac. This new app, with a simple drag-and-drop UI, will allow developers to easily prototype AR experiences. Apple has included a library of high-quality 3D objects and animations. Developers can arrange AR objects in 3D, and their scenes can be integrated directly into an app in Xcode or exported to AR Quick Look.
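The Xcode integration is worth spelling out: when a Reality Composer project is added to an Xcode project, Xcode generates a typed Swift loader for each scene. The project and scene names below ("Experience", "Box") are hypothetical, and the sketch assumes an existing `ARView` named `arView`.

```swift
import RealityKit

// Illustrative: load a scene from a Reality Composer project named
// "Experience.rcproject" containing a scene called "Box".
// `Experience.loadBox()` is a loader Xcode generates from the project;
// both names here are placeholders for whatever the project defines.
do {
    let boxAnchor = try Experience.loadBox()
    arView.scene.addAnchor(boxAnchor) // assumes an existing ARView `arView`
} catch {
    print("Failed to load Reality Composer scene: \(error)")
}
```

This keeps the designer-to-developer round trip short: the scene authored by drag and drop arrives in code as a ready-made anchor entity.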
These tools clearly indicate that Apple is well ahead of Google and Microsoft in the consumer AR space. If Apple launches an AR headset in late 2020 or 2021, it will have several thousand consumer AR apps ready in the App Store. More importantly, developers will already be deeply familiar with Apple's AR tools, ready to create more advanced AR apps for headsets.