Considerations for biometric use

Biometrics are a fine way to do authentication, but they need to be handled carefully. There are situations where biometrics are an excellent way to authenticate, and other situations where using them can cause embarrassment of various sorts.

Here are some considerations:

Biometrics are all probabilistic. Unlike a password or a mathematical system, a biometric is an image of real-world data that is compared to another image to see if they're close enough. When I type my password into Amazon.com, for example, it's either right or wrong. I can't type in some other password and have the server say, "Ummm, that's close enough, come on in." Biometrics always produce an answer of "close enough" or "not close enough" instead of "You got it!" or "Nope, that's not it."
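To make that contrast concrete, here is a toy Python sketch; the cosine-similarity test, the feature vectors and the 0.95 threshold are all invented for illustration, not how any particular product works.

```python
import hashlib
import hmac


def password_matches(stored_hash: bytes, attempt: str) -> bool:
    # A password check is all or nothing: the hashes are equal or they are not.
    return hmac.compare_digest(stored_hash,
                               hashlib.sha256(attempt.encode()).digest())


def biometric_matches(enrolled: list[float], sample: list[float],
                      threshold: float = 0.95) -> bool:
    # A biometric check only ever asks "is this close enough?" -- here a toy
    # cosine similarity between two feature vectors, compared to a threshold.
    dot = sum(a * b for a, b in zip(enrolled, sample))
    norm = (sum(a * a for a in enrolled) ** 0.5) * \
           (sum(b * b for b in sample) ** 0.5)
    return dot / norm >= threshold
```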

Intuitively, we know about these situations. There are aphorisms about close being good enough for horseshoes, hand grenades and so on. This means the system design has to take that into account. There can simply be "collisions" -- by that I mean people whose readings happen to be close enough to each other's.

Here's an anecdote: Several years ago, a colleague and I got a demo of a voiceprint system. The demo was on a laptop that asked users to say, "Open the door." If a user was authenticated, a picture of a door on the screen swung open; a red light flashed if the user got it wrong. I tried it a couple of times and got it wrong. Then my colleague gave it a try, and on his third attempt the door swung open. This is an example of a collision, and collisions are going to happen in any probabilistic system. In this particular case, both the vendor giving the demo and my colleague are Canadians, so we had a lot of jokes about all Canadians sounding alike.

Joking aside, how do you guard against collisions? Many systems use an additional PIN as a differentiator. For example, if the person trying to get in types a PIN of 1234, you only have to compare the biometric against the templates enrolled under that same PIN. If someone with a different PIN happens to produce a similar reading, you're never even going to compare the two. It's simple, and it works well. But it does raise another question: If you're going to require a PIN, why bother with the biometric at all? There are a number of good answers to this question, especially if price is not an issue. But if it is, a PIN is cheaper.
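Here's a minimal sketch of that idea, assuming the enrolled templates are simply filed under each user's PIN; all of the names, vectors and the threshold are made up for illustration.

```python
def close_enough(enrolled: list[float], sample: list[float],
                 threshold: float = 0.95) -> bool:
    # toy "is it close enough?" test, as in the earlier sketch
    dot = sum(a * b for a, b in zip(enrolled, sample))
    norm = (sum(a * a for a in enrolled) ** 0.5) * \
           (sum(b * b for b in sample) ** 0.5)
    return dot / norm >= threshold


enrolled_by_pin = {
    "1234": [0.31, 0.74, 0.58],   # the legitimate user's enrolled template
    "9876": [0.30, 0.75, 0.57],   # a near-collision, enrolled under another PIN
}


def authenticate(pin: str, live_sample: list[float]) -> bool:
    enrolled = enrolled_by_pin.get(pin)
    if enrolled is None:
        return False              # unknown PIN: no biometric comparison at all
    return close_enough(enrolled, live_sample)   # one 1:1 match, never 1:N
```

The PIN turns a one-to-many search, where any near-collision is a problem, into a single one-to-one comparison.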

Before you harrumph, go read this article about research done by the German magazine c't on how easy it is to fool a variety of biometric scanners. (It's relatively easy to fool a number of fingerprint scanners by putting Scotch tape on your finger; the scanner reads the warmth of your finger and the residual skin oils from the previous person.) Some of these attacks can be thwarted by something like a PIN, but the article concludes that serious biometric use must be supervised by a human being.

Not everyone has one. There are obvious aspects of this problem. An iris scanner is a great thing to use for letting a pilot onto an airplane, but it doesn't work as well if the door in question might be used by a blind person. Less obviously, a system like a face scanner doesn't work so well with men with facial hair, people wearing glasses, or people in unusual headdresses, whether a nun's habit, a Sikh's turban or a Muslim woman's garb (from scarf to veil).

Even less obvious, some people nominally have a biometric, but it isn't usable. For example, a small minority of people have hands that are too dry or too cold for fingerprint scanners to get a detailed reading. This means you either have to use a different system or just accept the vague reading, which somewhat defeats the purpose of the exercise.

Biometrics should never be used in a network. The huge, gaping danger of a networked biometric system is what we call a "replay attack." A replay attack is nothing more than sending old, captured data to the identifier. This is reasonably easy to guard against in a definite system, but in a probabilistic system an attacker can slightly modify the captured data and it will still be accepted, because it only has to be close enough.
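Here's a toy sketch of why, assuming a naive server that refuses bit-identical replays but otherwise accepts anything within its matching threshold; all of the numbers are invented.

```python
import random


def close_enough(enrolled, sample, threshold=0.95):
    # same toy "is it close enough?" test as in the earlier sketches
    dot = sum(a * b for a, b in zip(enrolled, sample))
    norm = (sum(a * a for a in enrolled) ** 0.5) * \
           (sum(b * b for b in sample) ** 0.5)
    return dot / norm >= threshold


enrolled = [0.31, 0.74, 0.58, 0.22]
captured = [0.30, 0.75, 0.57, 0.23]        # a reading sniffed off the network
seen_before = {tuple(captured)}            # the server's "no exact replays" list

# Jitter the captured reading so it is no longer bit-for-bit identical...
forged = [x + random.uniform(-0.002, 0.002) for x in captured]

print(tuple(forged) in seen_before)        # False: the replay filter misses it
print(close_enough(enrolled, forged))      # True: it is still "close enough"
```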

Let's suppose you have a Web commerce site that accepts credit card information verified with a biometric. How does a Web site differentiate between a hacker with stolen info and the real purchaser?

Loss of a biometric sample is a catastrophe. This is closely related to the previous problem. Let's suppose that someone has a database of credit card numbers and the fingerprints that verify them. (If such a thing does not exist, how are you doing biometric verification?) What happens when -- not if, when -- that database is stolen? If it were merely credit card numbers, we could issue people new ones. It's annoying to do so, but it's much simpler than issuing all our customers new fingerprints and telling them never to use the old ones again. That's going to look great on the front page of The New York Times. It's going to be bad when someone drags up cranky essays like this one on the risks of such a thing. But it's going to be even worse when the authors of such cranky essays are called as expert witnesses in the quite justifiable lawsuit that follows.

Now, there are innovative ideas on how to mix networks and biometrics. For example, the company Authentify has a voice authentication system that uses your telephone as a key. You give the authentication system a phone number to call you at; their computers call you and verify that you say the digits that they dialed. They can even save a recording for some suitable length of time in case there's a dispute. This is exceedingly clever, but it is far less a biometric system than an automated challenge-response system using voice recognition.
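The general pattern looks something like the sketch below. This is just the generic challenge-response idea, not Authentify's actual implementation, and call_and_transcribe is a hypothetical stand-in for the telephony and speech-recognition plumbing.

```python
import secrets


def make_challenge(n_digits: int = 6) -> str:
    # Fresh random digits for every attempt, so a recording of a previous
    # session buys an attacker nothing.
    return "".join(str(secrets.randbelow(10)) for _ in range(n_digits))


def verify_caller(phone_number: str, call_and_transcribe) -> bool:
    # call_and_transcribe(phone_number, challenge) is assumed to place the
    # call, prompt the user with the challenge and return what was spoken.
    challenge = make_challenge()
    spoken = call_and_transcribe(phone_number, challenge)
    return spoken == challenge
```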

I'm strident about how bad an idea biometrics in a network can be, but not because I think biometrics are a bad idea. I'm strident because I think that, properly used, they're a great idea, and that the risks of improper use could devastate the entire industry. The problem is that all the obvious, cool ideas are the ones with huge risks.

Using biometrics to identify a known person versus an unknown person. By "known" I mean a person who has been trained into the system. By "unknown" I mean someone picked out at random. A face scanner that lets employees in a door is a fine system: each attempt is one comparison against one enrolled face. A face scanner trying to pick bad guys out of a crowd is a whole different problem: every face in the crowd is compared against every face on the watch list, so even a small false-match rate adds up to a flood of false alarms.
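A quick bit of arithmetic shows why; the 0.1% false-match rate, the crowd size and the watch-list size below are assumptions chosen purely for illustration.

```python
# Verification: one employee, one door, one enrolled template per attempt.
false_match_rate = 0.001                 # assumed 0.1% per one-to-one comparison
print("door:", false_match_rate)         # one comparison, 0.1% chance of error

# Identification: 10,000 faces in a crowd checked against a 100-person watch list.
comparisons = 10_000 * 100
print("crowd:", comparisons * false_match_rate, "false alarms expected")  # 1000.0
```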

Biometrics don't mix well with stronger systems, like cryptographic systems. This is because those systems are definite, not probabilistic. Let me give you an example. I have a great little widget -- a USB flash memory disk drive with a fingerprint pad on it. You put your finger on the pad, and if it's you, the disk mounts. Even better, the manufacturer claims that the flash memory is encrypted. (No, they don't say how.) At first blush, this sounds like a great system to put sensitive files on. It does even at second blush, but there are subtleties in it. There have to be.

Let's just think about the system. It has three major components: the fingerprint reader, the flash drive and the encryption. The subtle interaction is that the fingerprint reader must have some way to tell the encryption system that it is okay to decrypt. Being a probabilistic system, it can't derive the encryption key from the fingerprint itself, so the encryption system has to store that key in some safe place.

Fair enough, but how would you attack it? Well, I'd figure out where the little "close enough" wire is. There must be some wire by which the fingerprint system tells the encryption system (and the disk system) that it's okay to proceed. If I can find that wire, I can bypass the whole system. I don't need to attack the crypto or the fingerprint scanner; I just need to find that wire.

I estimate that this would be a fun weekend project for an electrical engineering student or an honors high school student.
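In software terms, the weak point looks something like this sketch; the names are entirely hypothetical, and this is a caricature of the design, not the widget's real firmware.

```python
from typing import Optional

STORED_KEY = b"\x13" * 32     # the real key has to live somewhere on the device


def fingerprint_close_enough() -> bool:
    # stand-in for the probabilistic match against the enrolled fingerprint
    return False


def unlock_drive() -> Optional[bytes]:
    # This is the "close enough" wire: a single yes/no between the scanner and
    # the decryption. Force it high and you get the key without ever attacking
    # the fingerprint matching or the crypto itself.
    if fingerprint_close_enough():
        return STORED_KEY
    return None
```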

Does this mean that my widget is useless? No. It's a great bit of added security. But if you get one of those and you want to put sensitive business data on it, I recommend you do what I did -- put a PGP (Pretty Good Privacy) disk on the thing as well.

I hope my answer gives you some better insight into what I mean by a "host of problems." To sum up again, I don't think biometrics are a bad tool. I think they're good, but you just have to use them correctly. The problem is that the first things we think of are often the incorrect uses, not the correct ones.


This was first published in June 2003
