Airports & face recognition

By Signal 11 in MLP
Sun Sep 30, 2001 at 11:19:03 PM EST
Tags: Security

The Register is reporting on the flaws of face recognition technology. Their figure is one false positive in 250. The Minneapolis international airport has 35 million people going through it per year. That means roughly 383 people per day would be wrongly identified as terrorists under the system. Analysis of the ramifications for civil liberties is left as an exercise for the reader.
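
A quick check of that arithmetic, as a minimal Python sketch: the one-in-250 rate and the 35 million annual passengers are the story's figures; the rest is just unit conversion.

    # Back-of-the-envelope check of the story's numbers.
    false_positive_rate = 1 / 250       # The Register's figure, controlled conditions
    passengers_per_year = 35_000_000    # Minneapolis annual traffic, per the story

    passengers_per_day = passengers_per_year / 365
    false_alarms_per_day = passengers_per_day * false_positive_rate
    print(f"{false_alarms_per_day:.0f} false alarms per day")  # ~384; the story rounds to 383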


Airports & face recognition | 38 comments (38 topical, editorial, 0 hidden)
False Positives (4.33 / 6) (#1)
by ucblockhead on Sat Sep 29, 2001 at 04:28:07 PM EST

False Positives are only a problem if the procedures in place for dealing with positives are substandard.

They may well be, if the procedures are put in place too quickly, without enough thought, but false positives are in and of themselves bad only if you are not properly dealing with them.

There is, of course, an ancient form of facial recognition technology. It is called "giving the guys at the counter pictures of wanted criminals". That generates false positives, too. Unfortunately, people who will accept false positives from other people tend to assume machines are perfect. That's the real trouble.
-----------------------
This is k5. We're all tools - duxup

RE: False Positives (4.00 / 3) (#7)
by Signal 11 on Sat Sep 29, 2001 at 07:05:41 PM EST

...if the procedures are put in place too quickly, without enough thought...

*cough* Politicians.


--
Society needs therapy. It's having
trouble accepting itself.
[ Parent ]

Yes (4.33 / 3) (#8)
by ucblockhead on Sat Sep 29, 2001 at 07:33:56 PM EST

Exactly. The trouble is putting systems into place too quickly, not "false positives". The flip side of "false positives" is, of course, "false negatives". Both are bad, and you never entirely get rid of either. And often (though not always), decreasing one increases the other.

One interesting factoid: they've done studies on those airport "x-ray" (no longer really x-ray) machines, and they were able to improve things dramatically by intentionally introducing false positives. The idea being that if the human behind the screen sees one "positive" per hour, they are more likely to catch one than if they only see one a year. More importantly, if they see one positive an hour, and 99% are "false", they won't panic and overreact when they see one. It will be a normal daily occurrence.


-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Statistics and polygraphs. (3.66 / 3) (#2)
by claudius on Sat Sep 29, 2001 at 04:57:55 PM EST

Nice number crunching. Permit me to make a numerical, if somewhat tongue in cheek, counter argument:

Presumably these 1 in 250 false positives will be arrested, lynched, and/or they will learn not to attempt boarding an airplane in Minneapolis. Therefore, we can expect that after a reasonably short time the error rate of such a system will improve dramatically--it'll be 1 in 1000, then 1 in 10000, and eventually 1 in 1000000. The more we harass these sorry chaps with inopportune faces, the faster the system will improve. Who could argue against a test that provides such greatly enhanced security with only a 0.0001% chance of false positives--next you'll probably advocate that we abolish DNA testing?

There is no pseudoscience so shady that it cannot be promoted by suitably perverse statistics. Since one in three USians lack even a rudimentary understanding of probability, it will no doubt be easy to sell such a system to the public. (Curiously, this line of reasoning would suggest why organizations like the CIA have a much lower rate of false positives on polygraph examinations than would be expected from scientific studies of the polygraph test).

Civil liberties? What are those?

Statistics & Assumptions... (4.00 / 4) (#6)
by Signal 11 on Sat Sep 29, 2001 at 07:02:58 PM EST

... are about the most dangerous things in the world to mix.

Presumably these 1 in 250 false positives will be arrested, lynched, and/or they will learn not to attempt boarding an airplane in Minneapolis.

380 people a day are "arrested, lynched", etc. - wrongfully - and you view this as an autonomous element of your analysis?

Therefore, we can expect that after a reasonably short time the error rate of such a system will improve dramatically--it'll be 1 in 1000, then 1 in 10000, and eventually 1 in 1000000. The more we harass these sorry chaps with inopportune faces, the faster the system will improve.

If only theft deterrent systems in shopping centers and stores worked that way! For every person that is wrongfully accused of shoplifting, it costs a business an average of $100,000 - this is from Home Depot's internal accounting (hence, not publicly available). You may take it (the number) with a grain of salt, if you wish, as I cannot "prove" this online. So, unfortunately, the system of "accuse until they go away" doesn't work very well in practice. Even if it did, it is ethically questionable.

There is no pseudoscience so shady that it cannot be promoted by suitably perverse statistics.

I disagree. To date, I've been able to refute almost every statistic generated by the National Organization for Women, one of the leaders in "shady pseudoscience". Unfortunately, most people do not understand statistics, and hence fall victim to the aforementioned. But that is not a fault of science, only of human intelligence.

...this line of reasoning would suggest why organizations like the CIA have a much lower rate of false positives on polygraph examinations than would be expected from scientific studies of the polygraph test.

It might also suggest that would-be spies and wrong-doers have found a way to subvert the system, such as raising the baseline comparison by taking drugs, or causing themselves pain - which increases heart rate, etc. Passing a polygraph test is as easy as digging the tip of your toenail into your toe while sitting at the table.

As I said, statistics and assumptions are among the most dangerous things to mix. However, I believe Bruce Willis put it best in Die Hard - "Assumption is the mother of all fuckups."


--
Society needs therapy. It's having
trouble accepting itself.
[ Parent ]

Lynched!?!? (4.00 / 2) (#9)
by ucblockhead on Sat Sep 29, 2001 at 07:44:55 PM EST

Who said "arrested"? Who said "lynched"? You are making big assumptions about what the response to a positive is. There is no way it would result in an immediate arrest. Most likely it would result in a "may I see your ID" and a visual check by a human being with access to the "most wanted" database.

Note that the more false positives there are, the less likely that positive results will be met with overreactions. If an airport security guy sees ten "false" positives a day and one "real" positive a year, that airport security guy is going to very quickly start taking these positives in stride, and assume that they are "false" until other evidence presents itself. Which is, of course, exactly what you'd hope for in such a system.

I'd be much more worried if there were only one "false" positive a year, worried because the security guard is much more likely to go apeshit when he sees one.

FWIW: I have personally twice triggered a "false" positive on the current baggage scanning machines with my backpack. Both times, I was taken to the side, my bag was given a once-over with some chemical detector, and then I was sent on my way. Everything was handled well, I was treated politely, and it took about twenty seconds of my time.


-----------------------
This is k5. We're all tools - duxup
[ Parent ]

RE: Lynched!?!? (1.50 / 2) (#10)
by Signal 11 on Sat Sep 29, 2001 at 07:53:54 PM EST

Who said "arrested"? Who said "lynched"?

You.


--
Society needs therapy. It's having
trouble accepting itself.
[ Parent ]

Please try reading more closely... (4.00 / 1) (#11)
by ucblockhead on Sat Sep 29, 2001 at 08:00:25 PM EST

No, claudius did and you did, and you are both making silly wild-ass assumptions about what result a "false positive" would bring.

The whole concept has nothing to do with how people are treated. Only the reaction to a positive is important, and psychologically speaking, more false positives will produce fewer overreactions to positives.
-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Ahem - You idiot. (1.66 / 3) (#14)
by Signal 11 on Sun Sep 30, 2001 at 12:49:01 AM EST

Only the reaction to a positive is important, and psychologically speaking, more false positives will produce fewer overreactions to positives.

Uhh... right. And if they don't take every threat seriously, they'll be written up on it, because they need to be on alert for the one that does come up and is the real McCoy. Dig your head out of your ass - this is the 21st century, you have no rights.


--
Society needs therapy. It's having
trouble accepting itself.
[ Parent ]

Strategic intelligence (3.66 / 3) (#16)
by sigwinch on Sun Sep 30, 2001 at 02:32:39 AM EST

And if they don't take every threat seriously, they'll be written up on it, because they need to be on alert for the one that does come up and is the real McCoy.
Wrong, wrong, wrong. Strategic intelligence isn't about finding one single piece of perfect irrefutable evidence. It's about making guesses from correlations on large data sets. How about a couple of examples.

Example 1: Suppose the system finds a probable match for one person on a flight, and the match is to a CIA photo of a known member of a Saudi extremist group. So they look at his ticket info, and he has round trip tickets reserved six months ago for him, his wife, and his 12 year old daughter. A cursory examination of the luggage is consistent with a family vacation. There are no other probable matches on that plane. So they ignore it. They don't even hassle the guy over it.

Example 2: Suppose the system finds probable matches for two 'bad guys' on a single flight. They check the ticket info and find that their tickets were one-ways purchased two hours ago with cash. There are four other Arab-looking guys on the same flight who also purchased one-ways with cash that morning. None of them have checked baggage, and they have minimal carry-on. When questioned by security they appear extremely nervous, and resist searches of their bags. A cross-check with the police arrest mug-shot database shows a probable match to a man who was arrested four days ago in a fight started when he expressed anti-American sentiments. Another flight at the same airport has a similar pattern. Bingo.

It's just another part of a system for gathering and correlating information. Alone, most pieces of information are nearly meaningless. It's only in combination with other pieces that there is meaning. Frankly, it's nothing unusual. Airport security forces already routinely do correlations to target certain people for extra attention. Face recognition would be just another tool in their considerable toolchest.

I was surprised to see Bruce Schneier calling face recognition worthless. One of his mantras is 'defense in depth': you don't try to build one perfect access control point, but rather layers of controls, checks, and auditing. This is just another checking layer.

As for this being oppressive or a violation of your liberty, they're already checking your purchase records; correlating the current ticket with other ticket purchases if you used a credit card; considering your race, clothing, hair style, and demeanor; taking detailed x-ray photographs of your luggage; and perhaps physically searching your luggage. They already do all that without raising too much havoc, but we're supposed to believe that they're magically gonna start oppressing people if their computer looks at your face?

Dig your head out of your ass - this is the 21st century, you have no rights.
At least *try* to think through the strategic value of something before getting all rude about it.

--
I don't want the world, I just want your half.
[ Parent ]

Why false positives defeat security (5.00 / 1) (#29)
by dennis on Sun Sep 30, 2001 at 04:49:40 PM EST

In Secrets and Lies, Schneier describes how to defeat motion detectors at military bases. Prior to breaking in, you just throw a rabbit over the fence. The guys on the base drive out to investigate, don't find anything, go back in. Next night you throw another rabbit. Keep doing that until the base personnel quit driving out. Then cut the fence and drive your jeeps in.

This isn't hypothetical - guerilla fighters have used this tactic with great success (he mentions where, I don't remember).

Incidentally, given the way four attacks were coordinated, I'd be willing to bet that the terrorists bought their tickets well in advance. If they're as well-trained as everyone says, they'll buy round-trip tickets with credit cards and check some luggage, just like everyone else.

[ Parent ]

Defense (none / 0) (#34)
by ucblockhead on Sun Sep 30, 2001 at 06:38:12 PM EST

The way for the defenders to guard against this behavior is for the base commander to pick random nights, throw rabbits over the fence, and then bust the hell out of the guards if they don't properly investigate.


-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Re: Why false positives defeat security (4.00 / 1) (#36)
by sigwinch on Sun Sep 30, 2001 at 11:05:56 PM EST

Next night you throw another rabbit. Keep doing that until the base personnel quit driving out. Then cut the fence and drive your jeeps in.
This merely proves that an operation plan that assumes a low false positive rate can be defeated by increasing the false positive rate. As ucblockhead points out, if the operation plan is designed with the real false positive rate in mind, it can be successful.

Moreover, a defense system that relies exclusively on a single layer of motion sensors is pretty much doomed anyway: it violates the principle of defense-in-depth.

This isn't hypothetical - guerilla fighters have used this tactic with great success (he mentions where, I don't remember).
My copy of S&L is at work so I can't check, but I've heard similar stories about resistance efforts in eastern Europe (Poland?). The locals put upside-down pie pans on roads used by military traffic. At first, each pie pan was treated as a potential mine and carefully disposed of. After a while, though, they got tired of doing that and started ignoring the pie pans. Then the guerrilla resistance started putting real landmines under pie pans, and took out a lot of the enemy.

Again, this proves nothing about the impossibility of dealing with false positives. South Africa was faced with a similar challenge regarding mines. Their solution was to develop vehicles with armored undercarriages and wide steel wheels that could survive driving over mines.

Incidentally, given the way four attacks were coordinated, I'd be willing to bet that the terrorists bought their tickets well in advance. If they're as well-trained as everyone says, they'll buy round-trip tickets with credit cards and check some luggage, just like everyone else.
If I remember right, the recent attackers were not very smart about this (one way tickets, cash for some tickets, several tickets on the same credit card, little luggage).

I don't expect future attacks to be tremendously better. Suicide soldiers have to be narrowly educated, single-minded, and not very curious. An educated thinker will start to get ideas and realize the strategic futility of throwing their life away in a single attack. Kind of like bank robberies: the people who could do them well and profitably have the sense not to bother.

--
I don't want the world, I just want your half.
[ Parent ]

You know what? (3.00 / 1) (#37)
by i on Mon Oct 01, 2001 at 05:22:52 AM EST

I'm personally involved in driving around national borders and military installations, checking for intruders. Yes, 99.999999% of the time it's a rabbit (actually a pig or a porcupine; rabbits are too small to set off the detectors). We drive out each time nonetheless. I might add that all this happens in Israel. Other people's situations are (were?) different, and so are the attitudes towards such things.

and we have a contradiction according to our assumptions and the factor theorem

[ Parent ]
Sigh... (none / 0) (#22)
by ucblockhead on Sun Sep 30, 2001 at 11:48:48 AM EST

You know, if you thought a bit instead of letting your knee jerk and responding to contrary information with cries of "idiot"...

As I've said, there have been studies that show that artificially increasing the false positive rate resulted in better security at airline gates.


-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Survey says... (1.00 / 1) (#25)
by Signal 11 on Sun Sep 30, 2001 at 12:39:58 PM EST

Survey says, 9 out of 10 researchers prefer citations.


--
Society needs therapy. It's having
trouble accepting itself.
[ Parent ]
It was in Science News a while back. (2.25 / 4) (#26)
by ucblockhead on Sun Sep 30, 2001 at 02:31:55 PM EST

It was in Science News a few years ago. Their online archives don't go back far enough to post a link.

It is, of course, a patently obvious conclusion, but I suppose if you are too busy typing witty comebacks, you might not notice.


-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Thanks for the 0, siggy (none / 0) (#31)
by ucblockhead on Sun Sep 30, 2001 at 05:36:57 PM EST

It is always nice to see one's opinions about a person confirmed.
-----------------------
This is k5. We're all tools - duxup
[ Parent ]
Fine (5.00 / 1) (#32)
by ucblockhead on Sun Sep 30, 2001 at 06:25:59 PM EST

Here's an article from the Houston Chronicle that discusses the system I am talking about:

The FAA this year is deploying 1,380 new X-ray units to screen carry-on baggage. The new machines will randomly show false images of guns and explosives on the screen. The idea is that the images will help the screeners better recognize real dangers, and keep them alert.
Here the folks at Jane's talk about the system going into place.

The second feature is the Combined Threat Image (CTI) where images of whole bags, including the threat, are inserted in between actual bag images. This allows senior managers to test the alertness of their security personnel under operational conditions.
I can't link to the original article because I read about it in the dead-tree version of Science News, and they don't put all of their articles on the net.

I can give you one cite, though:

20000011523 Federal Aviation Administration, Office of Aviation Research, Atlantic City, NJ USA
Test and Evaluation Plan for Measuring Checkpoint Effectiveness and Efficiency
Klock, B. A.; Fobes, J. L.; Jun. 1999; 44p; In English
Report No.(s): PB99-163677; DOT/FAA/AR-99/51; No Copyright; Avail: CASI; A01, Microfiche; A03, Hardcopy

Is that a good enough "cite" for you, or are you just gonna hand out a "0 - inconvenient info"?
-----------------------
This is k5. We're all tools - duxup
[ Parent ]

One in 100k (4.66 / 3) (#17)
by Znork on Sun Sep 30, 2001 at 03:44:22 AM EST

One real terrorist in a hundred thousand false beeps. Yes, security probably won't make a fuss eventually. To the point where they take a glance at the real terrorist's false passport, pat him down a bit halfheartedly, and let him go along, while kicking the idiotic machine yet again.

The only thing these systems are good for in a setting like this is wasting money. Don't get me wrong, face recog can be useful for a lot of things, but comparing massive numbers of pictures in dubious conditions against a large database isn't one of them.

[ Parent ]
How does one prove his innocence? (4.50 / 2) (#21)
by claudius on Sun Sep 30, 2001 at 10:14:46 AM EST

You seem to believe that this system is nothing worse than when the chemical sniffers cause you to have to stop and open your bag for inspection. I believe it is different, however--it is rather more difficult and time consuming to open up and vet a person. If the system is to have any efficacy at all, I submit that it must be intrusive.

Let us say for the sake of argument that I have a guilty looking face, that I resemble a known IRA bomber. My face is a good enough match that it triggers face recognition systems essentially everywhere I go. My question now is how do I prove that I am not said bomber? Effectively, the burden is now upon me to prove my innocence, and not upon them to prove my guilt. (Technically, it is still the other way around, but we are not discussing courts of law here--we are talking "will I be allowed to make my connection"). This is no big deal with a suspicious bag--just turn on the laptop or let them rummage through the bag. You lose two minutes of your time, tops. The challenge is much greater proving, to their satisfaction, that a suspicious person such as I has honorable intentions and should be permitted to travel.

Essentially, the problem boils down to my proving my identity. Do I show my passport? Certainly most terrorist cells have the resources to obtain fake passports--the 9/11 hijackers did. Shall I show some other proof of identification--a driver's license perhaps? Surely these are easy to obtain as well, especially once I have a forged passport. Shall I submit myself to a detailed interview regarding who I am and what the purpose is for my travel? This will help, but it is hardly foolproof. More accurate, yet time consuming, would be to use biometrics in some fashion--perhaps I should allow myself to be fingerprinted and provide DNA samples so they can check them against their databases? I'm sure the lab work can be finished in a few days, during which time I may be obliged to wait in their custody. If I need to do this every time I travel, how long before I just stop bothering, or before I take countermeasures (grow a beard/get plastic surgery) to avoid the hassle? Either way the number of false positives registered by the system decreases and the apparent success rate of the system improves.

Lest you think that this is outrageous, that a democracy would never allow such an invasion of privacy, consider that we already do this routinely at our borders. Coming back from a joint meeting of the American Physical Society and the International Congress of Plasma Physics in Quebec City last Fall, my former officemate was detained for an afternoon at the Canada-U.S. border because (we have good reason to believe, based on the comments and questions of the interviewers) they suspected that he was a spy on the grounds that he held a Malaysian passport, was of Asian descent, and worked in the U.S. at an organization where spying is a concern. He was obliged to submit to numerous interviews, have his bags checked and rechecked, and then explain in detail every scientific paper and presentation in his possession before it was determined that he should be allowed passage into the U.S. He missed his connecting flight and his trip home was delayed an extra day and a half. His crime? He "fit the profile" of a spy, just as I may fit the profile of a hijacker/terrorist.

Of course you may be right--the face recognition software may merely be another meaningless travel ritual like the infamous "two questions." But then what's the point?

[ Parent ]
Bullshit (none / 0) (#23)
by ucblockhead on Sun Sep 30, 2001 at 12:08:51 PM EST

Effectively, the burden is now upon me to prove my innocence, and not upon them to prove my guilt.

That is complete bullshit.

You are making all kinds of unwarrented assumptions about how the system is going to be used.
-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Blind trust and "secret evidence" laws. (5.00 / 1) (#28)
by claudius on Sun Sep 30, 2001 at 02:58:55 PM EST

You are making all kinds of unwarrented [sic] assumptions about how the system is going to be used.

As are you. Which of us is right?

I am not wont to trust implicitly in the U.S. government even (perhaps especially) when it involves constitutionally granted rights. Case in point: ever heard of the "secret evidence" laws? As currently practiced, the accused may be held for months to years without ever learning or being able to answer to the charges brought against them. Their due process is suspended, and the burden of proof is undeniably upon the accused to prove their innocence. Unconstitutional? Yes, almost certainly, but the laws are still on the books, despite W's assurances to the Muslim-American communities during the 2000 campaign that he would work to repeal the laws. (Nearly everyone held on "secret evidence" is an ethnic Arab).

You seem to believe that no such abuses can occur in the much-vaunted American democracy, but in my experience almost every opportunity for scrutiny, clandestine and otherwise, has been exploited and abused by law enforcement. Polygraphs, wire taps, racial profiling, drug-war suspension of search-and-seizure restrictions, reclassifying evidence to deny the accused its access, whatever. The more scrutiny there is in society, the easier it is for law enforcement to protect it, but the easier it is to wind up with a society not worth defending.

I can love my country and still not trust it. This makes me no more of a "bullshit" spewer and no less of a patriot than the flag-waving zealots, and by participating in the national debate democracy is better served.

[ Parent ]

Sigh.. (3.00 / 1) (#30)
by ucblockhead on Sun Sep 30, 2001 at 05:35:14 PM EST

Lots of abuses are occurring, my dear Claudius, which is exactly why we should be talking about the real abuses like Carnivore rather than going off all half-cocked about something someone might do years from now, if we assume the worst.


-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Useless for terrorism, useful for ..... (5.00 / 5) (#3)
by Blarney on Sat Sep 29, 2001 at 05:00:22 PM EST

When you get a "positive ID" off this system, the odds of it actually being a terrorist are pretty low. If you're getting about 300 hits a day, but you'd actually be lucky to see half a terrorist a year in your airport (just made-up numbers for an example), that means that somebody who gets identified by this machine is NOT a terrorist, to a 99% certainty level.
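
A minimal sketch of that calculation in Python, using the made-up numbers above (300 hits a day, half a real terrorist a year); if anything, the 99% figure is very conservative.

    # P(actually a terrorist | the machine flags you), with the made-up numbers above.
    hits_per_year = 300 * 365        # positives the machine produces in a year
    real_hits_per_year = 0.5         # genuine terrorists you'd expect it to see

    p_real = real_hits_per_year / hits_per_year
    print(f"{p_real:.7f}")           # ~0.0000046, i.e. about 99.9995% of flagged people are innocent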

A system like this can't be used to accurately detect rare events. It can be used to pick out more common things, because the false positives are lost in the noise. The common "piss test" for drugs is only tolerated because such a large number of people actually do smoke marijuana - were marijuana smoking a rare, aberrant behavior of less than 1% of the population, the test would pick out nonsmokers most of the time and be considered useless. As it is, the errors are masked by the actual useful results.

Conclusion - this test is useful, but it won't be for terrorism. It will be used to pick up people who are in violation of various orders of the court. It will be used to find people who didn't pay their parking tickets, missed court dates, trials, or sentencing hearings, didn't report to prison when told to, wouldn't or couldn't pay their child support, or didn't return the child to their ex-wife on time. Innocent until proven guilty isn't required when dealing with wanted fugitives - and there are a great many fugitives out there.

Unfortunately... (none / 0) (#18)
by Znork on Sun Sep 30, 2001 at 04:09:45 AM EST

The false positive rate rises with every extra face you add to the database of people you want to catch. Add enough people to the database and eventually everyone who passes the face ID system will be flagged, because they're within a 70% margin of some facial biometric stored in the database.

Still not useful.
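
A sketch of that effect in Python. The per-comparison false-match rate here is hypothetical, not from the article; the point is the shape of the curve, assuming each face-to-watchlist comparison can false-match independently.

    # The chance a given passenger matches *someone* on the watchlist grows with its size.
    p = 1 / 250_000   # hypothetical per-comparison false-match rate
    for watchlist_size in (100, 1_000, 10_000, 50_000):
        p_false_alarm = 1 - (1 - p) ** watchlist_size
        print(f"{watchlist_size:>6} faces -> {p_false_alarm:.1%} false-alarm chance per passenger")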

Facial ID systems are good for attempting to make sure someone is who they say they are, because then the subject will point their face in the right direction, the system will be set to be skeptical about what grade of match it accepts, and they can try a few times until it gets close enough.

They are also good for situations where you have a photo of someone you wish to ID, and you can set the system to be fuzzy about matches and give you a few possible matches where you can pick the best matches yourself.

But they absolutely SUCK when you compare millions of faces against tens of thousands of faces in a database every day under dubious conditions. You will only get huge amounts of crap data out of it.

[ Parent ]
Waitaminute... (5.00 / 1) (#38)
by jabber on Mon Oct 01, 2001 at 12:49:24 PM EST

What you're saying is essentially that the test itself is useless, and that its 'accuracy' relies on the fact that, if its scope is carefully adjusted, there will be a high probability that anyone fingered by this system will have some sort of violation on record..

To say it another way, the test's effectiveness will be propped up by the fact that any randomly chosen individual has at some point done something wrong..

So why not save money and not develop this system? Instead, capitalize on the old urban legend and attach a colander to an old copier with a sheet reading "Guilty" in the machine.. Put the colander on the head of everyone going through the metal detector, and push the 'copy' button. When the "Guilty" verdict comes out, force the 'suspected terrorist' to confess to their overdue parking ticket, and that's it.. You can collect revenue on the spot, and all but the best trained 'real' terrorist will lose composure to the point of warranting detention.

In a nutshell, we're all guilty of something. Broadening the 'test' parameters to include all people will make the 'test' 100% effective at finding suspects.

[TINK5C] |"Is K5 my kapusta intellectual teddy bear?"| "Yes"
[ Parent ]

The flipside: false negatives (4.66 / 6) (#4)
by localroger on Sat Sep 29, 2001 at 05:33:02 PM EST

The more important quantity, which nobody in the nascent face recognition industry likes to talk about, is the frequency of failed identification. You can deal with false positives with procedures and so on but all it takes is one false negative to kill a lot of people.

As I reported awhile back here some friends of mine regularly subject themselves to the scrutiny of facial recognition software in casinos. They have, on rare occasions, been caught this way. They have more often played with impunity at places we *know* use the technology.

Upshot is, the shit just doesn't work worth a damn. Considering how hard it is for a human being (who is a lot better at it than a computer) to reliably recognize faces, it probably can't be made to work.

I can haz blog!

Failures at casinos (4.00 / 1) (#15)
by sigwinch on Sun Sep 30, 2001 at 01:48:00 AM EST

As I reported awhile back here some friends of mine regularly subject themselves to the scrutiny of facial recognition software in casinos. They have, on rare occasions, been caught this way. They have more often played with impunity at places we *know* use the technology. Upshot is, the shit just doesn't work worth a damn.
Casinos don't really prove anything: even if the technology were 99.99999999999999999999999999% accurate, they could turn it off and *still* rake in profit hand over fist. They have almost no incentive to buy good stuff, properly install it, properly manage the face databases, properly train the operators, or even pay attention to the results.

--
I don't want the world, I just want your half.
[ Parent ]

On the contrary (none / 0) (#19)
by localroger on Sun Sep 30, 2001 at 08:27:25 AM EST

They have almost no incentive to buy good stuff, properly install it, properly manage the face databases, properly train the operators, or even pay attention to the results.

There is a wide range of attentiveness in casino security land, ranging from out and out indifference to rabid all-stops-out surveillance. We have a pretty good idea which properties are doing what at any given time.

Make no mistake, the casinos that are trying facial recognition are using the very best state of the art technology, because they have a practically infinite amount of money to throw at schemes like this when they want to. (We have direct knowledge of this through turncoat security operatives. Among other things, the casino environment breeds disgruntlement.)

The reason the technology isn't working for them is that it doesn't work. Yeah, it can tell you're the boss as opposed to the janitor, but it can't tell for certain that you're the boss as opposed to all of the other 100,000 people who came to see the Super Bowl. In order to tell that you *might* be one of the 50,000 people in the Griffin database it needs a calibrated photograph from the right angle and with the right lighting (and in practice the software will still ask the human operator the sex and race of the subject). This involves a lot of human effort to get the picture into the system, and it still turns up multiple matches.

Security experts are passing off a fairy tale that you can scan everyone walking into the airport or casino and have a buzzer go off when Osama buys his ticket. There is no system that works that way, not in practice or even theoretically.

Your point about the casinos having no incentive would be true if casinos were rational about their place in the world. But they're not. A few properties are reasonable about it, but some (think "Hilton") are unbelievably paranoid about card counters and spend far, far more on schemes to stop them than they would lose if they just tightened up the games and let counters play. No, it doesn't make sense, but not much in casinoland does. We know people who have posed as potential customers and gotten the sales pitch on the exact same systems in use at certain casinos. This stuff is the state of the art.

And believe me, if they have as much trouble as they do scanning the limited number of black chip blackjack players, the tech will be entirely worthless in an airport. It will catch a fraction, false-positive a much, much larger fraction, and the fraction that gets through will hijack your plane.

I can haz blog!
[ Parent ]

The article (none / 0) (#33)
by ucblockhead on Sun Sep 30, 2001 at 06:35:03 PM EST

A key paragraph in the article, that no one here is discussing, is this one:

"This means that under controlled circumstances....you could expect one false positive out of 250 people when face recognition is used alone," FaceKey COO Annette Starkweather told The Register. "FaceKey has combined face recognition with fingerprint recognition to [achieve] a FAR of one in 2.5 million," she added.

In other words, the system this company is trying to sell (and these "security experts" are really just salespeople) is meant to include a first tier using facial recognition and a second tier using fingerprints, and the company is claiming that this combined system has an extremely low error rate. (Claiming. Who knows if it is true?)
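
For what it's worth, the quoted 1-in-2.5-million figure is exactly what you get by multiplying the two tiers' false accept rates as if they fail independently. The 1-in-10,000 fingerprint FAR below is inferred from the quoted combination, not stated in the article.

    # Reconstructing the vendor's combined-FAR arithmetic, assuming independent tiers.
    far_face = 1 / 250            # from the article
    far_fingerprint = 1 / 10_000  # inferred: the value that yields the quoted combined figure
    combined = far_face * far_fingerprint
    print(f"1 in {1 / combined:,.0f}")  # 1 in 2,500,000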

That's what a lot of people seem to be missing here. They seem to think that a triggered positive is going to cause sirens and gun-wielding federal marshals. Most likely, it will result in "please look directly in the camera" followed by "please put your fingers on this scanner".

Whether or not people will accept being fingerprinted at airports is an interesting question.

(It should be obvious that casinos aren't going to start fingerprinting customers, so what you say is completely true.)

Personally, I find the use of such systems by private organizations like casinos orders of magnitude more frightening than the use of such systems by law enforcement.
-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Depends on what the response to positives is... (4.80 / 5) (#5)
by khym on Sat Sep 29, 2001 at 05:34:25 PM EST

If, say, all positives were given a quick pat-down, scrutinized by a human with a list of pictures of known terrorists, given a quick rifle through their carry-on luggage, and had the bomb-sniffing dogs pointed at their checked-in luggage... Sure, it would slow things down, and make the tickets cost more, but an airline is a private business, and it's not like there's any sort of right to be un-searched when flying. Of course, you could refuse to be searched, in which case they won't let you on the plane.

On the other hand, if, for all positives, they aim a gun at you, yell "Freeze!", throw you to the ground, and handcuff you...



--
Give a man a match, and he'll be warm for a minute, but set him on fire, and he'll be warm for the rest of his life.
Got this at slashdot (3.50 / 2) (#12)
by wiredog on Sat Sep 29, 2001 at 09:17:21 PM EST

An article by Bruce Schneier

If there's a choice between performance and ease of use, Linux will go for performance every time. -- Jerry Pournelle
Still Deciding (3.00 / 2) (#13)
by DigitalRover on Sat Sep 29, 2001 at 11:00:25 PM EST

But I'm pretty certain of what my decision will be. Whenever the question of sacrificing freedom for safety comes up, we need to think long and hard on our answers.

On the one hand, I am definitely for rooting out those who seek to do harm to myself and my fellow citizens[0]. On the other hand, not only is the system far from infallible, but how can you have a correct positive on someone who has never been put into the system? If a would-be hijacker has come into the country and has never been identified, the fanciest, most foolproof system won't be able to flash a warning on the screen that says "Would-be terrorist." On the gripping hand, though, properly trained and outfitted security personnel within the airports can stop the flow of most weaponry onto the planes. Sure, someone might be able to smuggle in a glass or plastic weapon that could do serious harm, but we now have a public that is much more likely to fight back on the planes. Couple that with a reasonably secure cockpit and you can prevent another attack of the magnitude we have seen.

Better yet, why not allow the pilots to be armed? Frangible bullets can stop an attacker without tearing apart the plane.

There are ways we can secure our flights; facial recognition technology isn't one of them...

[0] Even you "crazy left wingers."

Read again - 1 in 3 in 'uncontrolled' environment (4.75 / 4) (#20)
by arheal on Sun Sep 30, 2001 at 08:54:56 AM EST

Go back and read that article again. The 1 in 250 is for perfect conditions. In an uncontrolled environment (a camera pointing into a crowd, for instance) the false positive rate is 1 in 3. Yes, Virginia, every third person is identified as a terrorist.
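
Plugging the 1-in-3 rate into the story's Minneapolis traffic figure makes the scale obvious; a one-line Python sketch:

    # The story's 35M passengers/year with the uncontrolled-environment rate.
    print(f"{35_000_000 / 365 / 3:,.0f} false alarms per day")  # ~32,000
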
There can be only one!
If so... (4.00 / 2) (#27)
by ucblockhead on Sun Sep 30, 2001 at 02:34:49 PM EST

Then the system is useless in practice, and therefore not a threat to civil liberties.
-----------------------
This is k5. We're all tools - duxup
[ Parent ]
Yes it is (5.00 / 1) (#35)
by loucura on Sun Sep 30, 2001 at 07:23:17 PM EST

This is a violation of civil liberties, though not in the manner that the story poster thinks.

Afterwards, they can still haul in a nice profit selling incremental 'upgrades' to victims who've invested millions and can't justify backing out; and for an added bonus, they will have become the 'DoubleClicks' of public biometric data, which is sure to be a gold mine in itself.

A violation of civil liberties, a la DoubleClick.

[ Parent ]
You know... (5.00 / 3) (#24)
by ucblockhead on Sun Sep 30, 2001 at 12:13:59 PM EST

You know, this submission makes it sound as if this system is going to be installed next week, whereas if you read the article in question, it is obvious that they aren't even past the basic testing phase.

Perhaps (as an exercise for the reader) we might ask ourselves which is more important to worry about: systems that might be installed someday, might not work, and might be used poorly, or systems that Congress is looking at right now.

Seems like a "boy who cried wolf" thing. The danger is that people will listen to us even less.
-----------------------
This is k5. We're all tools - duxup

