I’d love persistent, ambient authentication in iPhone XI

People don't want Touch ID or Face ID — they just want locked-down security without having to type a passcode.

OK. iPhone XI or iPhone 11 — or whatever Apple calls the 2018 iPhone flagship — might be overly aggressive. But the heart wants what the heart wants.

Apple has now shipped iPhone X, the company's first device with an edge-to-rounded-edge display. That drove Apple to replace Touch ID, its fingerprint identity sensor, with Face ID, a new facial identity scanner. But not without some losses and compromises.

Still, just as Touch ID was faster and more convenient than a passcode, and the second-generation Touch ID was so fast it barely felt like authentication, Face ID is almost transparent. Most of the time, your phone unlocks or your app authenticates and you're left staring at the fading animation, just beginning to realize you've been identified, when you're already in.

There are a few times, though, when your finger moisture has changed or you're wearing gloves, or when your face is at an odd angle or you're all bundled up, that "it just works" just stops working. It's not often and it's not a lot, but it's enough to shatter the illusion. It's enough to make me want something even better and more transparent.


The future of authentication

Imagine a future iPhone where authentication is ambient and perpetual, not requiring a specific fingerprint or biometric challenge/response but continuously grabbing snippets of biometric and other data. And imagine it using that data to maintain a state of "trust" where your iPhone simply stays unlocked for as long as it can be reasonably (or strictly, depending on settings) certain it's in your possession, challenging you only when that state becomes uncertain.

There are already rumors of Apple incorporating Touch ID into the capacitive display, rather than a discrete capacitive home button, in this year's iPhone. There are also patents for microLED technology that could further enhance screen-as-fingerprint reading. Is the idea of parts, areas, or even the entire iPhone display being able to read at least partial fingerprint data really that far off?

iPhone cameras are already doing face and object detection for photographic effects and tagging. Microsoft is already using cameras for facial recognition-based authentication with Windows Hello. We look at our screens often, and it's more than possible those cameras could not only look back but constantly authenticate us as well.
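On iOS, the building blocks for that kind of always-looking camera are at least partly public. Here's a minimal sketch using Apple's Vision framework to detect faces in a still image; note it only finds a face, it doesn't identify whose it is, and any ambient-authentication plumbing around it is purely hypothetical.

```swift
import Vision
import UIKit

/// Detect face bounding boxes in a still image using the Vision framework.
/// This only finds faces; it does not identify whose face it is.
func detectFaces(in image: UIImage, completion: @escaping ([VNFaceObservation]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = (request.results as? [VNFaceObservation]) ?? []
        completion(faces)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```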

The same is true for infrared-based iris scanning. Samsung has recently introduced both facial recognition and iris scanning into its Galaxy phones. Unfortunately, you can't yet use them together, with the system intelligently switching between facial recognition in daylight and iris scanning in low light, but that can't be far off either.

Siri began doing the basics of voice printing a couple years ago. Now, when you use setup buddy on a new iPhone, it has you say a few simple phrases so it can distinguish your voice — and your voice queries and commands — from those of others. I don't believe it's robust enough for authentication yet, but companies like Nuance have been offering just those kinds of "my voice is my passport, authorize me" services for a while. It's not tough to see Apple using the multiple, beam-forming mics on iPhones and AirPods to constantly check for your voice either.

It's the coprocessor in iPhone that allows for low-power "Hey, Siri," the same coprocessor that serves as the sensor fusion hub for accelerometer, magnetometer, barometer, and other data. Right now that's used for things like health and fitness apps and games. Taken further, though, could gait analysis be used to record and check your walking and motion patterns, so that, as you move around, your iPhone can know it's you doing the moving?
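There's no gait-recognition API, but the raw motion data is easy to get at. Below is a minimal sketch of sampling the accelerometer with Core Motion; the `GaitSampler` name and the idea of feeding a rolling window of samples into some owner-recognition model are assumptions for illustration, not anything Apple ships.

```swift
import CoreMotion

/// Hypothetical sampler that collects raw accelerometer data, the kind of
/// signal a (speculative) gait model could be trained on. Core Motion only
/// hands back motion data; any "is this the owner walking?" logic is imagined.
final class GaitSampler {
    private let motionManager = CMMotionManager()
    private var samples: [CMAccelerometerData] = []

    func start() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 1.0 / 50.0  // 50 Hz

        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, error in
            guard let self = self, let data = data, error == nil else { return }
            self.samples.append(data)

            // Keep a rolling window of roughly the last 10 seconds of motion.
            if self.samples.count > 500 {
                self.samples.removeFirst(self.samples.count - 500)
            }
        }
    }

    func stop() {
        motionManager.stopAccelerometerUpdates()
    }
}
```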

Beyond biometrics

Biometric data could also be supplemented by other factors, like trusted objects. Previously, trusted objects were dumb — grab someone's dongle and you got into their phone. With Apple Watch, though, trusted objects got smarter. Auto Unlock on macOS, which uses the proximity of your Apple Watch to authenticate you to your Mac, feels downright magical. You authenticate on the Watch via its passcode or via Touch ID on your iPhone, and that authentication is then projected from the Watch to the Mac.
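Apple doesn't expose Auto Unlock to third-party code, so treat the following as a generic illustration of the trusted-object idea rather than how Apple actually does it: a Core Bluetooth scanner that treats a strong signal from a known accessory as a proximity hint. The service UUID and the RSSI threshold are placeholders.

```swift
import CoreBluetooth

/// Illustrative proximity check for a "trusted object" using BLE signal
/// strength. This is NOT how Apple's Auto Unlock works; the service UUID
/// and RSSI threshold below are placeholders.
final class TrustedObjectScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private let trustedServiceUUID = CBUUID(string: "0000FFF0-0000-1000-8000-00805F9B34FB") // hypothetical
    var onTrustedObjectNearby: (() -> Void)?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [trustedServiceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // Stronger signal (closer to 0) roughly means the object is closer.
        if RSSI.intValue > -60 {
            onTrustedObjectNearby?()
        }
    }
}
```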

So could environmental data. For example, if you're in a certain place at a certain time that fits your existing patterns, that could add to the trust weighting.

Taken separately, each of these authentication methods either requires user action or doesn't provide enough security to be useful. Taken together, though, every touch of the display provides a partial print, every glance at the camera a partial face or iris scan, every word a partial voice print, every step a partial gait analysis. And if a paired Apple Watch is nearby and you're in a place, at a time, that fits your pattern, enough factors pass authentication that the moment your iPhone senses any engagement, it's already unlocked and ready to be of service.

Conversely, any time enough factors fail authentication, your phone goes into lockdown and challenges you for a proper fingerprint, iris scan, or passcode/password to make sure you're really you. And it could escalate for situations that warrant it. That's what happens today, for example, after a reboot, a timeout, or a software update. For secure enterprise or government use, it could do so more often and require multiple factors to resume a trusted state.
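None of this exists as a system API, but the underlying idea is just a weighted score. Here's a toy sketch of how those partial signals might be combined into a single trust value with unlock and lockdown thresholds; every weight, decay rate, and threshold here is invented purely for illustration.

```swift
import Foundation

/// Toy model of ambient trust: each partial signal adds weight, trust decays
/// over time, and the device stays unlocked only while the score is high
/// enough. Every number here is invented purely for illustration.
struct AmbientTrust {
    enum Signal: Double {
        case partialFingerprint   = 0.30
        case partialFaceOrIris    = 0.35
        case voiceMatch           = 0.20
        case gaitMatch            = 0.15
        case trustedObjectNearby  = 0.25
        case familiarPlaceAndTime = 0.10
    }

    private(set) var score: Double = 0.0
    private var lastUpdate = Date()

    let unlockThreshold = 0.6   // enough combined factors: stay unlocked
    let lockdownThreshold = 0.2 // too little confidence: demand an explicit challenge

    mutating func observe(_ signal: Signal) {
        decay()
        score = min(1.0, score + signal.rawValue)
    }

    /// Trust fades when no signals arrive, e.g. the phone is out of your hands.
    mutating func decay(ratePerSecond: Double = 0.05) {
        let elapsed = Date().timeIntervalSince(lastUpdate)
        score = max(0.0, score - elapsed * ratePerSecond)
        lastUpdate = Date()
    }

    var state: String {
        if score >= unlockThreshold { return "unlocked" }
        if score <= lockdownThreshold { return "lockdown: require passcode or full biometric" }
        return "uncertain: challenge on next engagement"
    }
}

// Example: a glance at the camera plus a nearby Apple Watch keeps the phone unlocked.
var trust = AmbientTrust()
trust.observe(.partialFaceOrIris)
trust.observe(.trustedObjectNearby)
print(trust.state)  // "unlocked"
```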

Not if, when

We'll need considerable advances in battery chemistry and strict adherence to privacy policies to enable this kind of technology, but Apple is uniquely positioned to deliver both. Just as with chipsets, Apple doesn't have to act like a battery vendor; it only has to build what its own products need. And unlike data-harvesting companies, it doesn't want or need any of the personal information this surfaces.

To me, arguing about whether Touch ID or Face ID is better misses the point. Touch ID isn't there for Touch ID's sake. Face ID isn't there for Face ID's sake. Both are solutions to the same problem, and in the future there will either be still better, faster, and easier ways to solve that problem, or ways to make it disappear entirely so it no longer needs solving.

Historically, that seems like the approach Apple takes.

Originally published on April 27, 2017. Updated November 24, 2017, to reflect the launch of iPhone X and Face ID.
