Should Facial Recognition Technology Be Stopped?

In 2014, Ronald Carnes applied for a driver's license in Waterloo, Iowa, and that routine errand put him behind bars.

Carnes had been convicted in North Carolina in November 1970 of robbery with a dangerous weapon and was sentenced to 20 years in prison. He escaped in 1973, and authorities failed to determine his whereabouts, according to the WcfCourier.

At the Department of Motor Vehicles, a clerk told Carnes not to smile, or at least not to show his teeth, because Iowa was among the first states (now numbering at least 39) to use facial recognition software that alerts law enforcement to criminals and to fraud, including identity theft.
The software converts a driver's license photo into a numerical template of distinctive facial measurements, such as the distance from the tip of the nose to the chin, the distance between the pupils and the width of the cheekbones.
If two or more identities are associated with the same face, law enforcement will try to determine why. Carnes had used two other identities to collect Social Security benefits and apply for Iowa driver's licenses, making him the first criminal captured through DMV face recognition.
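In rough terms, the matching step reduces each photo to a short vector of measurements and flags any face that turns up under more than one name. The sketch below is a deliberately simplified illustration of that idea, not any state's actual system; the landmark names, the tolerance value and the toy numbers are all assumptions made for the example.

```python
# Illustrative sketch of a DMV-style matcher: each photo is reduced to a
# vector of facial measurements, and new applications are compared against
# enrolled records. All names, thresholds and data here are hypothetical.
import math

def template(landmarks):
    """Reduce named landmark coordinates to a vector of pairwise distances."""
    def dist(a, b):
        return math.dist(landmarks[a], landmarks[b])
    return (
        dist("nose_tip", "chin"),                   # nose-to-chin length
        dist("left_pupil", "right_pupil"),          # inter-pupillary distance
        dist("left_cheekbone", "right_cheekbone"),  # cheekbone width
    )

def similar(t1, t2, tolerance=3.0):
    """Treat two templates as the same face if every measurement is close."""
    return all(abs(a - b) <= tolerance for a, b in zip(t1, t2))

def identities_matching(new_template, enrolled):
    """Return every enrolled name whose template matches the new photo."""
    return [name for name, t in enrolled.items() if similar(new_template, t)]

# Toy enrollment database: two prior licenses under different names that
# happen to share nearly identical measurements (made-up values).
enrolled = {
    "Alias One": (112.0, 63.0, 131.0),
    "Alias Two": (111.5, 63.4, 130.6),
    "Unrelated Person": (98.0, 58.0, 120.0),
}

# Landmarks extracted from the new application photo (also made up).
new_photo_landmarks = {
    "nose_tip": (0.0, 0.0), "chin": (0.0, -112.2),
    "left_pupil": (-31.6, 55.0), "right_pupil": (31.6, 55.0),
    "left_cheekbone": (-65.4, 30.0), "right_cheekbone": (65.4, 30.0),
}

matches = identities_matching(template(new_photo_landmarks), enrolled)
if len(matches) > 1:
    # More than one identity tied to one face: flag the record for review.
    print("Flag for investigation:", matches)
```

Real systems compute learned embeddings with hundreds of dimensions rather than three hand-picked distances, but the multiple-identity flag is the mechanism that caught Carnes.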
Congress never passed legislation giving the Federal Bureau of Investigation or any other agency the authority to build face recognition databases, but post-9/11 anti-terrorism bills, including the 2005 Real ID Act, spurred their development. Nor have state legislatures given the practice their formal blessing.
Face recognition proceeded on the basis of presumed public good, and there was much to recommend it as identity theft grew.
Neil Stammer, 49, a New Mexico musician and juggler, was facing child sex abuse and kidnapping charges in 1999 when he went missing. When the FBI tested face recognition on passport photos to detect fraud in 2015, the software flagged a "Kevin Hodges" teaching English in Nepal; Hodges turned out to be Stammer, and he was arrested.
However, it's an imperfect system that requires training to use well. A Massachusetts Institute of Technology Media Lab study also found the technology is far more accurate at identifying light-skinned people.
The study claimed an Amazon system misidentified the gender of darker-skinned women in 30% of tests (Microsoft and IBM systems fared better), even failing to determine the gender of Michelle Obama and Oprah Winfrey.
Robert Julian-Borchak Williams, an auto supply company executive, was handcuffed by Detroit police officers in front of his wife and two young daughters on suspicion of shoplifting five watches valued at $3,800.
One surveillance photo showed a heavyset man in black, wearing a red St. Louis Cardinals cap, standing at a watch display. A second photo looked nothing like Williams. "No, this is not me," Williams said. "You think all black men look alike?"
The American Civil Liberties Union last month helped exonerate him.
Detroit Police Chief James Craig admitted, “If we were just to use the technology by itself to identify someone, I would say 96 percent of the time it would misidentify.”
The FBI maintains its system can identify the actual offender among its top 50 candidate profiles 85% of the time, provided the perpetrator is in the group at all. If not, the system is susceptible to false positives, which could leave an innocent man like Williams facing the cost of bail and legal fees.
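Put differently, a candidate search is a ranking, not a verdict: it returns the 50 most similar enrolled faces even when the correct answer is "none of the above." The toy sketch below, which uses random stand-in embeddings rather than real face data, shows how a top-50 query still produces a full slate of candidates from a gallery that does not contain the perpetrator at all.

```python
# Hedged illustration of why a "top 50 candidates" search invites false
# positives: a nearest-neighbor query always returns the closest enrolled
# faces, whether or not the real perpetrator is in the database.
import random

random.seed(0)
DIM = 16            # toy embedding size; real systems use far more dimensions
GALLERY_SIZE = 10_000
TOP_K = 50

def random_face():
    return [random.gauss(0.0, 1.0) for _ in range(DIM)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# A gallery of enrolled license photos, none of which is the perpetrator.
gallery = {f"person_{i}": random_face() for i in range(GALLERY_SIZE)}
probe = random_face()   # surveillance image of someone who is NOT enrolled

# The search still happily produces 50 "candidates", ranked by similarity.
candidates = sorted(gallery, key=lambda name: distance(probe, gallery[name]))[:TOP_K]
print(f"Returned {len(candidates)} candidates; every one is a potential false positive.")
```

Every name on that list belongs to an innocent person, which is why such output is supposed to be treated as an investigative lead to verify, not an identification.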
Face recognition can also be problematic when manipulated by overzealous law enforcement agencies that try to match driver's license photos to an artist's sketch or to surveillance footage capturing individuals from odd angles.
The Chinese police state provides some early warnings.
With cameras nearly everywhere (though actually fewer per capita than in the U.S.), the government documents behavior to an unnerving degree, including matching IDs with photos taken in public restrooms of individuals who use too much toilet paper.
China calls its ubiquitous program "Sharp Eyes." It combines face recognition, artificial intelligence, criminal and medical records, travel, online purchases and social media to identify suspects, assess suspicious behavior and supposedly even predict crime.
Citizens get “social credit” scores based on trustworthiness.
Yet China's example also makes a case against using face recognition to gather information on individuals at peaceful protests, such as those that have occurred in the U.S.
Among the "criminal" class in China, according to a document found by Human Rights Watch, are people who complain about perceived injustices, "undermine stability" or have "extreme thoughts."
Congress has taken notice of the fledgling industry.
Rep. Jim Jordan, R-Ohio, was among those complaining about states giving driver's license photos to federal authorities.
“They’ve just given access to that to the FBI,” he said. “No individual signed off on that when they renewed their driver’s license, got their driver’s licenses. They didn’t sign any waiver saying, ‘Oh, it’s OK to turn my information, my photo, over to the FBI.’ No elected officials voted for that to happen.”

Jordan got that right, but the rhetoric must lead to regulation. Amazon, Microsoft and IBM recently put their systems on hold, awaiting direction.
