Thoughts on Apple and fingerprint-based biometrics

Yesterday Apple finally confirmed what its acquisition of AuthenTec was all about: integrating fingerprint-based biometrics into the iPhone. This was not exactly a surprise, but it’s one thing to know they were going to do it and another to see how they went about it.

Details on the implementation are still a little light and there is a lot of speculation about how they did things, but I have not seen anyone provide a reasonable write-up of how this technology works, what its limitations are, and what value it has.

Let’s start with what these things actually do. Plain and simple, a fingerprint sensor is a camera: it takes a picture of the structure of your finger. Some sensors look just at the surface, some look a little below the surface. Some use optical capture, while others use capacitance or other mechanisms, but regardless of the approach they all take a picture of your finger.

This picture is then processed looking for “minutiae”, the little details that make that image unique (ridges, valleys, swirls, etc.). These are then mapped into something commonly called a template, and it is this template that is stored.


The important thing to take away here is that when you enroll, the biometric sample itself isn’t actually stored; it’s simply not needed.
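To make that concrete, here is a toy Python sketch of the enrollment flow described above. The `Minutia` type, its fields, and the `enroll` function are illustrative inventions of mine, not any vendor’s real pipeline; the image-processing step that actually finds the minutiae is simply assumed to have run.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Minutia:
    x: int      # position of the feature in the captured image
    y: int
    angle: int  # local ridge direction, in degrees
    kind: str   # e.g. "ridge_ending" or "bifurcation"

def enroll(minutiae: list[Minutia]) -> frozenset[Minutia]:
    """Turn the features found in the captured image into the stored template.

    The raw fingerprint image is discarded; only this set of points survives.
    """
    return frozenset(minutiae)

# Example: pretend the (unshown) image-processing step found these features.
template = enroll([
    Minutia(12, 40, 80, "ridge_ending"),
    Minutia(33, 18, 145, "bifurcation"),
    Minutia(57, 62, 10, "ridge_ending"),
])
```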

Once you are enrolled, the same process happens (capture image, identify minutiae, create template), except instead of being stored the new template is compared to the stored one. Each time you log in you present your finger slightly differently, which means that not all of the same minutiae will be seen in every captured image.

As a result the matcher has to guess whether the person is you or not based on how many minutiae it sees in common with the stored template.
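A minimal sketch of that decision, again with made-up details: real matchers align the two point clouds and tolerate small shifts and rotations, but the thresholding idea is the same. The 0.6 threshold here is an arbitrary illustrative value.

```python
def match_score(template: frozenset, candidate: frozenset) -> float:
    """Fraction of the enrolled minutiae that were seen again in this capture."""
    if not template:
        return 0.0
    return len(template & candidate) / len(template)

def is_probably_you(template: frozenset, candidate: frozenset,
                    threshold: float = 0.6) -> bool:
    # The threshold is the knob that trades false accepts against false rejects.
    return match_score(template, candidate) >= threshold
```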

This works fairly well when doing what is referred to as verification, where the sample is compared to just one stored template, as is probably the case with a device like the iPhone. When doing identification (the one-to-many variant) there are a number of other problems to consider; I won’t discuss those here.

Now each of the image-capture approaches used by sensors has different security properties; for example, with optical sensors I have seen people lift the fingerprint from the sensor glass itself and re-use it.

According to the press conference, the sensor used by the iPhone looks sub-dermally. The primary thing this helps with is resilience to small cuts and scrapes, which could otherwise push the match score below the threshold and make it impossible to match you. It also provides some security value in that the characteristics it captures are not exactly the same ones you leave everywhere.

Now the good sensors also have logic in them to detect fake fingers; some of these are simple liveness tests while others look at the characteristics of the flesh itself. For example, a swipe sensor may look at the elasticity of the finger as it is dragged across the sensor.

One of the real problems here is that when you are buying a device with one of these sensors on it, you have zero clue how good the mechanisms (if any) it uses are. In my case I went and bought several fingerprint-protected handgun boxes, cracked them open to see who manufactured the sensors, and contacted the heads of engineering at the sensor companies (whom I happened to know through my work) to help me figure out which device had the best fake-finger detection so I knew which one to use.

In the case of the Apple sensor we again have no clue what kind of fake-finger or liveness tests they have implemented. I am sure that, thanks to security researchers, we will find out how effectively they deal with this in short order once the devices ship.

But what happens when the matcher decides there’s a good chance it’s you? It releases a “secret”. And what is that secret, you ask? Well, in most systems it’s actually the password the user would have entered had the sensor not been there.
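In other words, the matcher is a gate in front of a stored credential. Here is a minimal sketch of that pattern under my own assumptions (this is not Apple’s design; `STORED_SECRET` and `authenticate` are hypothetical names), using the same toy overlap scoring as above.

```python
STORED_SECRET = "correct horse battery staple"  # placeholder for the user's real password

def authenticate(template: frozenset, capture: frozenset,
                 threshold: float = 0.6) -> str | None:
    """Release the stored secret only when the capture matches well enough."""
    score = len(template & capture) / len(template) if template else 0.0
    return STORED_SECRET if score >= threshold else None
```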

That’s right: all of the above magic exists to make entry of the password easier. This isn’t actually a bad thing, but again, it depends on how it is implemented.

As a practical matter, people can remember 7-9 character passwords, and those passwords get re-used or trivially modified, which greatly reduces their effectiveness. By using biometrics to gate access to the secret, we can replace that short, junky password with a much longer key and greatly increase the security of the system in the process.
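A sketch of what that upgrade can look like, again as my own assumption about the general pattern rather than any specific product: at enrollment a long random key is generated once and kept in protected (ideally hardware-backed) storage, and it is that key, not a typed password, that the matcher releases.

```python
import os

def provision_long_key() -> bytes:
    # 32 random bytes = a 256-bit key, far beyond anything a person could memorize.
    return os.urandom(32)

# Generated once at enrollment and held in protected storage; the biometric
# match gates its release instead of gating a human-chosen password.
disk_encryption_key = provision_long_key()
```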

Here is the thing: it doesn’t sound like that’s what they have done here. It seems they have applied the biometric to the four-digit numeric PIN and made the application store password protected by that PIN. I say this because, according to the New York Times, you will still have to set the PIN for recovery purposes. If this is the case (and until we see the devices we will not know), the biometric is no more secure than the four-digit PIN it’s gating access to.
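The gap matters: a quick back-of-the-envelope comparison (my figures, purely illustrative) shows how small a four-digit PIN’s guess space is next to a long random key.

```python
pin_space = 10 ** 4   # a four-digit PIN: only 10,000 possibilities
key_space = 2 ** 128  # a 128-bit random key
print(f"{key_space / pin_space:.2e}x larger")  # ~3.40e+34 times larger
```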

It could still have some value though; for example, according to the press conference, only around 50% of iPhone users set a PIN. If this can meaningfully increase that number, then in aggregate we’re in a better place.

One more thing that troubles me as I think about the Apple integration is that they are one of the most secretive tech companies out there, and we’re not likely to hear the answers to how they have handled the above issues or any others.
