
"Are you kidding?"
When Detroit police told her she was under arrest for carjacking and robbery, 32-year-old Porcha Woodruff pointed to her stomach. At eight months pregnant, she obviously wasn't their suspect. But instead of getting her daughters ready for school, she was led away to jail as her children cried.
This was 2023, when Woodruff became the first woman known to have been wrongfully accused based on facial recognition technology. Despite eyewitness accounts and surveillance images indicating otherwise, anti-Blackness and automation bias intertwined to criminalize her.
Automation bias is our overconfidence in the decisions made by automated systems, despite risks, limitations, and non-automated information to the contrary. Authority bias works the same way, only with authority figures instead of machines. Police weaponize both kinds of bias to facilitate anti-Black violence. When using AI facial recognition, police have skipped basic investigative steps, from checking alibis and contradictory evidence to verifying witness statements and noting visible tattoos and advanced pregnancies.
Woodruff suffered contractions throughout the 11 hours she was jailed, and the stress of the court case weighed on her for the rest of her pregnancy. Last December, the ACLU filed a lawsuit on her behalf.
Automation and authority bias can shape our beliefs and decision-making more than we realize. Both make it easy to trust AI and authority figures that insist a Black person is "criminal," whatever the facts say. We can resist oppressive systems by exercising our right to question just how right their tools and the actors behind them really are.