Facial Recognition Aims To Get Better By Using Dark Skin
Google is under fire for its role in the ongoing facial recognition debate. It was recently reported that the tech giant targeted homeless people in an effort to improve how this technology recognizes dark skin. The implications for Black people could be devastating, given how facial recognition has long worked against the interests of people with darker skin.
Subcontracted workers were sent out to collect data for Google. The Guardian reported that they were “instructed to target people with ‘darker skin tones’ and those who would be more likely to be enticed by the $5 gift card, including homeless people and college students.”
One contractor who worked on the project stated, “They said to target homeless people because they’re the least likely to say anything to the media.” This raised concerns about the ethics of the project and about what would be done with the data itself.
The subcontractors used deceptive tactics, such as telling people they were playing a “selfie game” or taking part in a survey. People were pressured to sign consent forms without reading them while the phones they were handed recorded footage of their faces. The deception used to collect the data raised further questions about facial recognition itself, which has long had issues around race.
For example, Amazon’s facial recognition technology has also been reported to misidentify Black people. A test performed by the ACLU “falsely identified 28 members of Congress as people who have been arrested for crimes,” and the majority of those misidentified were Black. Ironically, Google’s project was allegedly intended to help reduce racial bias in facial recognition.
Yet how the company went about it points to an underlying distrust among Black Americans of “research” like this, as well as of data collection that is likely to aid law enforcement.
Black activists in cities like Detroit have been fighting against law enforcement’s growing interest in using facial recognition technology, largely because of research and experiments that highlight the technology’s failures at correctly identifying Black people. Since Black people already have long-standing issues with misidentification by law enforcement, what’s happening now only deepens that distrust in a modern context.