Financial interest in biometric security solutions has been high for the past few years. I’m very sceptical of biometric authentication solutions – they’re a privacy disaster waiting to happen.

For any authentication – the process where you identify yourself to a computer system in order to get access – you can use three things: something you know (like a password), something you have (like a physical key), or something you are (like a fingerprint). Today, most authentication relies on something you know or something you have. A few systems use both, which is what’s called two-factor authentication.
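The three factors above can be sketched in a few lines of code. This is a minimal illustration, not a real login system – the function names, the stored values, and the one-time code are all hypothetical, and a real implementation would use proper password hashing and hardware tokens:

```python
# Sketch of the authentication factors: "something you know" plus
# "something you have" combined into two-factor authentication.
# All names and values here are illustrative, not a real design.

import hashlib
import hmac

# Server-side: only a hash of the password is stored, never the password.
STORED_PASSWORD_HASH = hashlib.sha256(b"correct horse battery staple").hexdigest()

def knows_secret(password: str) -> bool:
    """Something you know: compare a hash of the entered password."""
    entered = hashlib.sha256(password.encode()).hexdigest()
    return hmac.compare_digest(entered, STORED_PASSWORD_HASH)

def has_token(code: str, expected: str) -> bool:
    """Something you have: e.g. a one-time code from a token you carry."""
    return hmac.compare_digest(code, expected)

def two_factor_login(password: str, token_code: str, expected_code: str) -> bool:
    # Two-factor: BOTH checks must pass, so stealing one factor is not enough.
    return knows_secret(password) and has_token(token_code, expected_code)

print(two_factor_login("correct horse battery staple", "123456", "123456"))  # True
print(two_factor_login("wrong password", "123456", "123456"))                # False
```

The point of requiring both factors is that an attacker who learns your password still lacks the physical token, and vice versa.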

On the sidelines, a number of companies are trying to make something good of the “what you are” concept – using something unique about your body to grant you access. A decade back, retina-based identification was the high-end rage in this segment. These days, fingerprint authentication is coming on strongly, especially with the fingerprint reader on the latest iPhone.

(By the way, having a fingerprint reader on a phone has to be one of the more dysfunctional security illusions sold in the past decade. After all, if your phone is stolen and the thief needs your fingerprint to unlock it, the thief doesn’t really need your actual finger: your fingerprints are literally all over the phone already. It’s your phone, you’ve been holding it and tapping every corner of its screen.)

But let’s disregard the dysfunctional solutions for a moment and focus on fingerprint readers and other biometric authenticators that work in theory. It’s important to understand how they function technically to see why they’re a privacy scandal waiting to happen: whatever is being scanned (fingerprint, retina, whatever) must first be converted to a set of numbers, and that set of numbers is then compared to a stored specimen that tells the computer what the set is supposed to look like if it’s really you trying to log in.
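That comparison step can be sketched as follows. This is a deliberately simplified assumption about how a matcher works – real biometric matchers are far more sophisticated – but it shows the essential point: because no two scans of the same finger are bit-identical, the decision is “close enough to the specimen”, not an exact equality check:

```python
# Sketch of a biometric match, assuming the scan has been reduced to a
# feature vector of numbers (the "signature"). Hypothetical simplification;
# real matchers use much richer models.

import math

def signature_distance(scan: list[float], specimen: list[float]) -> float:
    """Euclidean distance between a fresh scan and the enrolled specimen."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(scan, specimen)))

def matches(scan: list[float], specimen: list[float], threshold: float = 0.5) -> bool:
    # Biometric readings vary between scans, so the comparison accepts
    # anything within a tolerance threshold rather than requiring equality.
    return signature_distance(scan, specimen) < threshold

enrolled = [0.12, 0.87, 0.45, 0.33]    # stored at enrolment time
fresh_scan = [0.13, 0.85, 0.46, 0.31]  # today's slightly noisy reading

print(matches(fresh_scan, enrolled))  # True: close enough to the specimen
```

Notice that the verifier never sees your finger – only the numbers. Anyone who can produce the right numbers passes the check.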

Every computer system can be assumed to be vulnerable in some way. There simply is no such thing as an unhackable system. In security circles, it’s said that the only secure computer is one that’s powered off, completely unplugged, taken to an unknown location, locked in a safe, and buried beneath ten feet of hardened concrete – and even then, you can’t be entirely sure. In the case of biometric authentication, the interesting phase of the login process is when the fingerprint (or retina) has been read by the computer and converted to a set of numbers (a “signature”), but before it’s compared to your specimen. What happens if – no, when – an adversary gets hold of the set of numbers that represents your fingerprint?

In security, this is known as a “replay attack”. You can see it in some old movies: somebody enters a passcode on the kind of keypad that beeps as the numbers are pressed, while somebody else holds a tape recorder close enough to catch the passcode melody. Then the movie protagonist walks up to the door and replays the melody, opening it. Door codes don’t quite work like that, but that’s the idea of a replay attack.
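Applied to biometrics, the replay is trivial to sketch. In this toy model (all names hypothetical, and the verifier reduced to a simple comparison for clarity), an eavesdropper who taps the channel between reader and verifier records the signature once, and can then log in forever without touching the sensor:

```python
# Sketch of a replay attack on a biometric signature: the verifier only
# sees numbers, so recorded numbers work just as well as a live scan.
# Toy model with hypothetical names; the verifier is simplified to an
# exact comparison for clarity.

captured = None  # the eavesdropper's recording

def sensor_read(live_signature: list[float]) -> list[float]:
    """The channel from reader to verifier, tapped by an attacker."""
    global captured
    captured = list(live_signature)  # eavesdropper keeps a copy in transit
    return live_signature

def verifier(signature, specimen) -> bool:
    """Accepts whoever presents the right numbers."""
    return signature == specimen

specimen = [0.12, 0.87, 0.45]  # your enrolled fingerprint signature

# Legitimate login: you scan your finger, the attacker passively records.
assert verifier(sensor_read([0.12, 0.87, 0.45]), specimen)

# Replay: the attacker sends the recorded numbers. No finger required.
assert verifier(captured, specimen)
print("replay accepted")
```

A password system can recover from this by rotating the secret; the next paragraphs explain why a biometric system cannot.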

You know what happens when your password leaks, and you have to change it?

Imagine what happens when your fingerprint signature leaks, and adversaries are able to impersonate you using your own fingerprint, replaying it to a fingerprint reader. What are you going to do then? Use your other hand, use another finger? What are you going to do after the tenth leak?

On top of this, courts will be completely oblivious to the fact that fingerprint data can leak just like passwords do. Somebody will testify that the entered fingerprint data matches person X exactly, and the court will trust the technology and the experts.

This problem is something I haven’t even seen discussed seriously in the biometric industry, which is why I won’t invest in biometric authentication – from where I’m sitting, it looks like a privacy scandal just waiting to happen.

An even worse example is the biometric armband Nymi, which stores your entire bitcoin wealth and uses your individual heartbeat signature to give you access to your own money. This raises two questions:

1) What are you going to do if your heartbeat signature leaks, and you need to change the signature? Get a heart transplant?

2) What are you going to do if you have a heart attack (one that you survive) and your heartbeat signature changes involuntarily as a result, locking you out of your own bitcoin vault?

Our society is still far too blind to the privacy implications of new technology, and doesn’t take them seriously.

In the meantime, privacy remains your own responsibility.