The new iPad Pro features a depth-sensing LiDAR Scanner, and Apple has now released ARKit 3.5 to help developers take advantage of the new technology.
ARKit 3.5 and the new LiDAR Scanner and depth-sensing system on iPad Pro allow developers to create AR experiences that are more realistic than ever before. The new Scene Geometry API lets developers capture a 3D representation of the world in real time, enabling object occlusion and real-world physics for virtual objects. All experiences built with ARKit automatically benefit from the new instant AR placement, along with improved Motion Capture and People Occlusion.
ARKit 3.5 brings the following new features:
Scene Geometry
Scene Geometry lets you create a topological map of your space with labels identifying floors, walls, ceilings, windows, doors, and seats. This deep understanding of the real world unlocks object occlusion and real-world physics for virtual objects, and also gives you more information to power your AR workflows.
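In code, opting into Scene Geometry is a matter of requesting mesh reconstruction with classification on the session configuration. A minimal sketch, assuming a LiDAR-equipped device and SceneKit-based rendering via `ARSCNView`:

```swift
import ARKit

// Sketch: enabling Scene Geometry (mesh reconstruction with per-face
// classification) on a LiDAR-equipped device such as the new iPad Pro.
func startSceneGeometrySession(on arView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction is only supported on devices with the LiDAR Scanner.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else {
        return
    }
    configuration.sceneReconstruction = .meshWithClassification
    arView.session.run(configuration)
}

// ARKit delivers the reconstructed mesh as ARMeshAnchor instances in the
// ARSessionDelegate callbacks; each mesh face carries a classification such
// as .wall, .floor, .ceiling, .window, .door, or .seat.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let meshAnchor as ARMeshAnchor in anchors {
        print("Mesh anchor with \(meshAnchor.geometry.faces.count) faces")
    }
}
```

The mesh anchors update continuously as the LiDAR Scanner refines its map of the room, so apps typically rebuild or patch their physics and occlusion geometry as these callbacks arrive.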
Instant AR
The LiDAR Scanner on iPad Pro enables incredibly quick plane detection, allowing for the instant placement of AR objects in the real world without scanning. Instant AR placement is automatically enabled on iPad Pro for all apps built with ARKit, without any code changes.
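No new API is required to benefit from instant placement, since existing raycast-based placement simply starts returning results immediately. As a rough sketch of that existing pattern, assuming an `ARSCNView` and a screen tap location:

```swift
import ARKit

// Sketch: placing content at a tapped point using ARKit's raycasting API.
// On iPad Pro, the LiDAR Scanner makes plane results available almost
// immediately, so this same code places objects without a scanning phase.
func placeAnchor(at point: CGPoint, in arView: ARSCNView) {
    guard let query = arView.raycastQuery(from: point,
                                          allowing: .estimatedPlane,
                                          alignment: .horizontal),
          let result = arView.session.raycast(query).first else {
        return
    }
    // Anchor virtual content at the hit location on the detected surface.
    arView.session.add(anchor: ARAnchor(transform: result.worldTransform))
}
```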
Improved Motion Capture and People Occlusion
With ARKit 3.5 on iPad Pro, depth estimation in People Occlusion and height estimation in Motion Capture are more accurate. Both features are improved on iPad Pro in all apps built with ARKit, without any code changes.
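Apps that already opt into these features get the accuracy improvements for free. For reference, a minimal sketch of how each is enabled, assuming an `ARSCNView`:

```swift
import ARKit

// Sketch: opting into People Occlusion with depth. On iPad Pro, ARKit 3.5
// uses the LiDAR Scanner to improve the depth estimate automatically.
func startPeopleOcclusionSession(on arView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
        return
    }
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
    arView.session.run(configuration)
}

// Motion Capture uses its own configuration; ARKit delivers a tracked
// skeleton via ARBodyAnchor, with height estimation handled by the system.
func startMotionCaptureSession(on arView: ARSCNView) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    arView.session.run(ARBodyTrackingConfiguration())
}
```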
ARKit already offers features such as People Occlusion, Motion Capture, simultaneous front and back camera use, multiple face tracking, collaborative sessions, and more. Read more about the new features on Apple's developer site.