
Facial recognition technology can help police officers identify—and ultimately charge—criminals caught on camera.
But many argue that the technology, though useful, is discriminatory. Research shows that facial recognition software misidentifies people of color more frequently than it does white individuals, an issue that calls the technology's reliability into question.

In 2019, Michael Oliver was arrested during a routine traffic stop and charged with larceny for stealing a cell phone, a crime he didn't commit. Oliver was misidentified when Detroit police ran a facial recognition search of the state of Michigan's photo database using technology created by an outside company. Oliver's case is one of two false arrests over which the city of Detroit, Michigan, faces lawsuits stemming from misidentification by facial recognition technology.

Why does this technology struggle to recognize people of color? And do legal, privacy, and human rights concerns outweigh the benefits of its use? Watch NOVA's "Algorithmic Injustice? Racial bias and facial recognition" to learn more.