ARKit face tracking FUD and what you need to know

Worried that face tracking in ARKit will give developers access to your Face ID biometric data? Well, it can't, and here's why.

With iPhone X and the TrueDepth camera, Apple is introducing two very different systems: Face ID, which handles biometric authentication, and face tracking for ARKit, which lets augmented reality apps mimic your facial expressions. The two are, internally, completely separate. But since the TrueDepth camera powers both, there's been some confusion and concern over how Apple is handling biometric face data and what access, if any, developers might have to it. Let's clear that up.

What is Face ID and how does it work?

Face ID is similar to Touch ID. Both are biometric identity systems that let you more quickly and conveniently unlock your iPhone and authenticate transactions. Where Touch ID uses your fingerprint as captured by the sensor in the Home button, Face ID uses your face data as captured by the TrueDepth camera on iPhone X.

From my Face ID explainer:

Once you've registered [your face] with Face ID, and you go to unlock, here's what happens:

  1. Attention detection makes sure your eyes are open and you're actively and deliberately looking at your device. This is to help avoid unintentional unlock. (It can be disabled for accessibility if desired.)
  2. The flood illuminator makes sure there's enough infrared light to "see" your face, even in the dark.
  3. The dot projector creates a contrasting matrix of over 30,000 points.
  4. To counter both digital and physical spoofing attacks, a device-specific pattern is also projected.
  5. The TrueDepth camera reads the data and captures a randomized sequence of 2D images and depth maps, which are then digitally signed and sent to the Secure Enclave for comparison. (Randomized to again counter spoofing attacks.)
  6. The portion of the Neural Engine inside the Secure Enclave converts the captured data into math and the secure Face ID neural networks compare it with the math from the registered face.
  7. If the math matches, a "yes" token is released and you're on your way. If it doesn't, you need to try again, fall back to passcode, or stay locked out of the device.

For developers, it works like Touch ID:

Just like apps never got access to your fingerprints with Touch ID, they never get access to your face data with Face ID. Once the app asks for authentication, it hands off to the system, and all it ever gets back is that authentication (or rejection).
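
To make that concrete, here's a minimal sketch of what that handoff looks like using Apple's LocalAuthentication framework. The reason string and the success handling are placeholders; the point is that the app only ever receives a yes or no.

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Check whether biometric authentication (Face ID or Touch ID) is available on this device.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, authError in
        // The system runs the scan and the comparison. The app only gets a Bool back.
        if success {
            // Proceed with the protected action.
        } else {
            // Fall back to passcode entry or show an error.
        }
    }
}
```

Notice there's no face data anywhere in that flow: the app states why it wants authentication, the system does the rest, and an answer comes back.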

What developers can get isn't face data but face tracking — through ARKit.

What is face tracking in ARKit and how does it work?

ARKit is Apple's framework for augmented reality. It handles everything from plane detection to lighting and scaling. Developers have already gotten ARKit apps to do things like lipstick and makeup previewing, but with the TrueDepth camera on iPhone X, much more specific support is possible.

Here's how it works:

  1. The app asks permission to access the camera (if you're using it for the first time).
  2. The TrueDepth camera creates a coarse 3D mesh matching the size, shape, topology, position, and orientation of your face, along with your current facial expression.
  3. ARKit provides that information to the app.
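
As a rough sketch of what that looks like in code, here's a face tracking session set up with ARKit and SceneKit. The wireframe material is just there to visualize the mesh, and names like FaceTrackingViewController are mine, not Apple's.

```swift
import UIKit
import ARKit

class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking is only supported on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        // Step 1: running the session prompts for camera permission if it hasn't been granted yet.
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Steps 2 and 3: ARKit hands the app a coarse face mesh (geometry and expression, not identity).
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }
        let node = SCNNode(geometry: faceGeometry)
        // Render the mesh as a wireframe so you can see exactly what the app receives.
        node.geometry?.firstMaterial?.fillMode = .lines
        return node
    }
}
```

The app also has to declare why it wants the camera (the NSCameraUsageDescription key in its Info.plist), and you can revoke that camera access in Settings at any time.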

At no point does the app (or developers) communicate at all with the Secure Enclave or get any of the Face ID biometric data stored therein.

In other words, the app knows there's a face and what it's doing but it has no idea whose face it is and gets none of the precise details Face ID matches against.

What ARKit gets that Face ID doesn't is anchor points in 3D space. So, apps can attach funny eyebrows and keep them attached as you move around. That's it.
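
Here's one illustrative way an app might do that, continuing the SceneKit sketch above. The "eyebrow" box and its offsets are made up for the example.

```swift
// Inside the FaceTrackingViewController sketch from above.
// Called when ARKit first anchors the detected face in the scene.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARFaceAnchor else { return }

    // A stand-in "funny eyebrow": a thin box parented to the face node.
    let brow = SCNNode(geometry: SCNBox(width: 0.1, height: 0.01, length: 0.02, chamferRadius: 0))
    brow.position = SCNVector3(0, 0.06, 0.06) // meters, offset relative to the face anchor

    // Because it's a child of the anchored node, it follows your head as you move.
    node.addChildNode(brow)
}
```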

Just like an app can tell where, when, and how you're touching the display, but can't identify your fingerprints, ARKit can tell how you're looking at the TrueDepth camera, but only so far as to map your movements and expressions to a poop emoji.
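
The expression side comes through as blend shapes: plain coefficients between 0.0 and 1.0 for things like how open your jaw is or how much you're smiling. Here's a sketch of reading a couple of them as the face updates; which values an app actually cares about is up to the app.

```swift
// Also inside the FaceTrackingViewController sketch from above.
// Called whenever ARKit updates the face anchor with new expression data.
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    // Blend shapes are anonymous expression coefficients, not identity data.
    let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
    let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0

    // An app would use values like these to drive its character, poop emoji included.
    print("jaw open: \(jawOpen), smile: \(smile)")
}
```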

Making face tracking privacy even more granular

One thing I would like to see in future versions of iOS is a separate privacy setting for face tracking. Asking for camera access is fine for an app that only wants the camera for ARKit face tracking, since you can grant or revoke that access at any time and so precisely control the tracking.

For apps that might want camera access for more than just ARKit face tracking, though, it's an all-or-nothing proposition: either you get all the features or none. You can't pick and choose just the ones you're comfortable with.

A discrete setting for face tracking would be both more transparent and more flexible for everyone.

Any face-based questions?

New technology is always confusing, and it's good to be cautious. Some people still tape over the selfie cams on their phones and laptops as a matter of course. In the end, it's up to each individual to learn as much as possible and then strike the balance between security and convenience that works best for them.

I'm rather paranoid by nature but, based on everything I've seen to date, I'm confident there's no way for developers or anyone else to get at my biometric face data with Face ID or ARKit, just like they haven't been able to get to my biometric fingerprint data with Touch ID or multitouch.

But the more tests and the more questions, the better. So keep 'em coming!

Originally published September 27, 2017. Updated November 30, 2017, with a proposal for separate ARKit privacy settings.
