
Facial Recognition Still Struggles To Recognize Faces As More People Are Misidentified As Criminals

from the everything-is-fine-say-tech-purveyors-surrounded-by-flames dept

We’ve long been critics of facial recognition tech here at BestNetTech. Even though the steady march of technology inevitably means the tech will get faster and better, the problem is the first part: the tech is getting faster, and more widely deployed, well ahead of it getting better.

The tech has proven to be very fallible. And it has made things even worse for the sort of people most often targeted by cops: minorities. Pretty much every option offered by facial recognition tech purveyors performs at its worst when dealing with anyone who isn’t white and male.

So, the people who have spent their entire history being the target of biased policing efforts have seen nothing improve. Instead, tech advancements have, for the most part, simply automated bigotry and provided cop shops with plausible deniability for their own innate racism. “The machine made me do it.”

The UK — especially London and the area overseen by the Metro Police — was an early adopter of this tech. The government had already blanketed the city with cameras, putting it on par with China and India in terms of always-on surveillance of its residents.

The private sector was ahead of the curve on facial recognition tech adoption. Early concerns were raised about rights violations, but most of those issues simply didn’t apply to business owners and their cameras. The influx of cameras and add-on facial recognition AI has only increased the opportunity to falsely accuse people of crimes and/or violate their rights (if the government is involved).

And so it goes here in this recent report from the BBC, which details a few more instances where people have been converted to criminals by software screwups.

Sara needed some chocolate – she had had one of those days – so wandered into a Home Bargains store.

“Within less than a minute, I’m approached by a store worker who comes up to me and says, ‘You’re a thief, you need to leave the store’.”

Sara – who wants to remain anonymous – was wrongly accused after being flagged by a facial-recognition system called Facewatch.

She says after her bag was searched she was led out of the shop, and told she was banned from all stores using the technology.

While this may seem somewhat innocuous when compared to false arrests and bogus criminal charges, it’s far from harmless. While Sara may still have other shopping options, this false flagging may have prevented her from using her favorite — or most convenient — option.

That’s not nothing. That’s a private company making a decision based on flawed tech that can heavily alter the way a person lives and moves around. And since it’s often not immediately clear which multi-national conglomerate owns which retail store, people dealing with this sort of ban can unintentionally violate it just by heading to Option B. And repeat violations can likely bring law enforcement into play, even if the violations were entirely unintentional.

But the UK’s grand experiment is still harming people the old way, with additional harassment, duress, and invasive searches predicated on little more than who some tech product thought a person walking by a camera resembled.

Mr Thompson, who works for youth-advocacy group Streetfathers, didn’t think much of it when he walked by a white van near London Bridge in February.

Within a few seconds, though, he was approached by police and told he was a wanted man.

“That’s when I got a nudge on the shoulder, saying at that time I’m wanted”.

He was asked to give fingerprints and held for 20 minutes. He says he was let go only after handing over a copy of his passport.

But it was a case of mistaken identity.

Sure, these might be anomalies, given the sheer number of facial recognition tech options being deployed by the UK government and any number of private companies that call that country home. But, again, the sheer scale of deployment means these experiences are bound to be far more frequent than they would be in areas where deployments are more limited or better regulated.

Live facial recognition — the tech responsible for this blown call — remains relatively rare in London. The Metro Police used it only nine times between 2020 and 2022. But in 2024, it had already used it 67 times, which makes it clear the plan is to steadily increase use. And that number only covers deployments. It says nothing about how long people were subjected to live facial recognition, nor how many faces were scanned by the tech.

The Metro Police claim any concern about false positives is misplaced. By its own accounting, the false positive rate is one per 33,000 people who come within range of its cameras.

But that’s not a good excuse for subjecting people to flawed tech. First, it says nothing about false negatives, which would be every time the tech fails to flag someone who should be flagged as a suspected criminal.

Furthermore, the false positive rate skyrockets when you look only at the people actually flagged by the live AI system:

One in 40 alerts so far this year has been a false positive.

That’s an insanely terrible error rate. These are the “hits” that matter — the ones that can result in detainment, arrest, questioning, searches, and other applications of government force against someone a computer said was someone officers should subject to any number of indignities.
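The two figures aren't contradictory; they measure different things. The 1-in-33,000 rate is per face scanned, while the 1-in-40 rate is per alert, and at scale a "tiny" per-scan rate still produces a steady stream of wrongly flagged people. Here's a minimal back-of-the-envelope sketch in Python: the two rates come from the article, but the scan volumes are hypothetical.

```python
# How a "tiny" per-scan false-positive rate turns into a steady stream
# of wrongly flagged people as scanning scales up. Rates are from the
# article; scan volumes below are hypothetical illustrations.

PER_SCAN_FP_RATE = 1 / 33_000   # false positives per face scanned (Met's figure)
PER_ALERT_FP_RATE = 1 / 40      # share of alerts that are false (BBC's figure)

def expected_false_positives(faces_scanned: int) -> float:
    """Expected number of innocent people wrongly flagged."""
    return faces_scanned * PER_SCAN_FP_RATE

def implied_total_alerts(faces_scanned: int) -> float:
    """If 1 in 40 alerts is false, how many total alerts does that imply?"""
    return expected_false_positives(faces_scanned) / PER_ALERT_FP_RATE

for scanned in (100_000, 1_000_000, 10_000_000):
    fp = expected_false_positives(scanned)
    print(f"{scanned:>10,} faces scanned -> ~{fp:.0f} false positives "
          f"(~{implied_total_alerts(scanned):.0f} total alerts)")
```

The takeaway: because false positives scale linearly with faces scanned, even an unchanged error rate means far more innocent people flagged once the tech moves from occasional van deployments to always-on camera networks.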

For now, a lot of live facial recognition is being deployed by easily identified mobile police units, usually via “unmarked” white vans. Criminals who fear being spotted may simply choose to avoid areas where these vans are found or steer clear of camera range. If that’s how it’s being handled, it’s highly unlikely the public safety gains outweigh the collateral damage of a 1-in-40 error rate.

Worse, the Metro Police may realize its surveillance tech is no longer useful when it’s being carried around in easily recognizable vehicles. At that point, it may start angling to add this tech to the thousands of cameras the government has installed all over London and other areas of the UK. And when it becomes standard operating procedure for thousands of cameras, the error rate may remain the same, but the number of false positives will increase exponentially. And once that happens, the anomalies will be so numerous, it will be difficult for the government to pretend it isn’t a problem. But by that point, the tech will already be in place and that much more difficult to curtail, much less root out entirely, if the systemic failures prove to be too much for the public to accept.


Comments on “Facial Recognition Still Struggles To Recognize Faces As More People Are Misidentified As Criminals”

26 Comments
Anonymous Coward says:

Re:

So does private enterprise, though, as shown by the first example.

These false positive rates could be reduced through proper representation for the victims. Pain in a pocketbook makes for a great incentive.

Unfortunately, the cause and the pain are too disconnected in the usual government cases.

That Anonymous Coward (profile) says:

If we don’t do this the terrorist will win!
If we don’t accept our rights being removed, the terrorists will win!
A “few” false positives are nothing to be concerned about, just because you were required to produce your papers to prove your identity b/c the often wrong AI is treated as a perfect god.

Obligatory – https://www.reddit.com/r/comics/comments/d1sm26/behold_the_ultimate_life_form/

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

And it has made things even worse for the sort of people most often targeted by cops: minorities.

Minorities are “targeted by cops” because they’re scientifically-proven to have low impulse control and a predilection for criminality.

Why do you think White people are willing to suffer high property taxes?…Because they understand it’s effectively a tax that usually keeps negroes and Hispanics out!

Anonymous Coward says:

Re:

“because they’re scientifically-proven”

This phrase is used so much in advertising that it has become meaningless to many.

Another bullshit phrase that is over/mis used is: clinically proven.

If one wants to look really smart, they use the term: in vitro.

Then there are the graphs shown to prove their claims; the graph has no units of measure and simply shows lines going up and down – lol.

Anyways .. advertising is bullshit and so is your post.

Anonymous Coward says:

[Sara] says after her bag was searched, she was led out of the shop and told she was banned from all stores using the technology.

That’s pretty bad. I recently stole a magazine costing £4.75 from Tesco,* but what’s the betting I’m not banned from shopping there or anywhere else.

*Totally accidentally, and as soon as I realised, I got the taxi I’d ordered to wait for me whilst I went back to apologise and pay for the item.

TasMot (profile) says:

I Need a New Hat

Hats, umbrellas and parasols need to make a comeback. Some sort of wide-brimmed hat for men, and parasols for women, as a passive protest against the always-on cameras.

Or, maybe some new facial jewelry that can be changed every day. Stars, moons, and other facial lucky charms would probably send facial recognition AIs into a tailspin. They depend on a face looking the same every time. If moles or other facial features appear to change every day, then the AI will fail badly.
