Apple details new AR experiences with Location Anchors in ARKit 4

"Place AR experiences at specific places"

What you need to know

  • Apple has detailed some of the changes coming to ARKit 4.
  • It shows how developers can use the new framework to create new AR experiences for iPhone and iPad users.
  • One feature includes AR experiences anchored to a location.

Apple has detailed its changes to ARKit 4 in a new developer news release, including a new Location Anchors feature.

Apple announced ARKit 4 as one of its developer framework updates at WWDC 2020 this week. From the release:

ARKit 4 on iPadOS introduces a brand-new Depth API, creating a new way to access the detailed depth information gathered by the LiDAR Scanner on iPad Pro. Location Anchoring leverages the higher resolution data in Apple Maps to place AR experiences at a specific point in the world in your iPhone and iPad apps. And support for face tracking extends to all devices with the Apple Neural Engine and a front-facing camera, so even more users can experience the joy of AR in photos and videos.

The new Depth API gives apps access to the detailed per-pixel depth information captured by the LiDAR Scanner, letting them build a fast, accurate picture of the surroundings for features like taking measurements.
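As a rough sketch of how a developer might opt in to this data, the snippet below enables ARKit 4's scene-depth frame semantics on a LiDAR-equipped device; the helper function name is illustrative, not from Apple's release:

```swift
import ARKit

// Sketch: enabling ARKit 4 scene depth. Requires iOS/iPadOS 14+ and
// LiDAR hardware (e.g. the 2020 iPad Pro); returns nil elsewhere.
func makeDepthConfiguration() -> ARWorldTrackingConfiguration? {
    // Not every device can supply per-frame depth, so check first.
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
        return nil
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.frameSemantics.insert(.sceneDepth)
    return configuration
}
```

Once the session runs with this configuration, each `ARFrame` exposes a `sceneDepth` value whose `depthMap` and `confidenceMap` pixel buffers carry the LiDAR measurements.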

The coolest new feature is arguably Location Anchors:

Place AR experiences at specific places, such as throughout cities and alongside famous landmarks. Location Anchoring allows you to anchor your AR creations at specific latitude, longitude, and altitude coordinates. Users can move around virtual objects and see them from different perspectives, exactly as real objects are seen through a camera lens.

Location Anchoring relies on a localization map downloaded from Apple Maps and processed on-device (so no location data goes to Apple), giving users access to AR experiences tied to their geographical location. For example, developers could apply AR labels to buildings to help with navigation or sightseeing for tourists.
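In code, anchoring content to real-world coordinates looks roughly like the sketch below, which uses ARKit 4's `ARGeoAnchor`; the coordinates and function name are illustrative placeholders:

```swift
import ARKit
import CoreLocation

// Sketch: placing a geo-anchored AR experience with ARGeoAnchor.
// Geo tracking works only in supported regions, on devices with
// an A12 chip or later plus GPS.
func addLandmarkAnchor(to session: ARSession) {
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }
        // Illustrative coordinates for a city landmark.
        let coordinate = CLLocationCoordinate2D(latitude: 37.7955,
                                                longitude: -122.3937)
        // Omitting altitude lets ARKit estimate ground level.
        let anchor = ARGeoAnchor(coordinate: coordinate)
        session.run(ARGeoTrackingConfiguration())
        session.add(anchor: anchor)
    }
}
```

Content attached to the anchor then stays fixed at that latitude and longitude, so users can walk around it and view it from different angles, as the release describes.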

The final detailed feature is Expanded Face Tracking Support, which has been extended to any front-facing camera on a device with the A12 Bionic chip or later, including the iPhone SE. Developers can track up to three faces at once with the TrueDepth camera. You can read the full release, including a list of all of ARKit 4's new features here.
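A minimal sketch of the expanded face tracking support might look like the following; the helper name is an assumption, and availability checks mirror what ARKit exposes:

```swift
import ARKit

// Sketch: face tracking on any device with an A12 Bionic or later and
// a front-facing camera — TrueDepth hardware is no longer required.
func makeFaceTrackingConfiguration() -> ARFaceTrackingConfiguration? {
    // isSupported is false on devices without the required hardware.
    guard ARFaceTrackingConfiguration.isSupported else { return nil }
    let configuration = ARFaceTrackingConfiguration()
    // Ask for as many simultaneous faces as this device allows
    // (up to three with the TrueDepth camera).
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    return configuration
}
```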
