The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.
The systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found…
But ensuring that these systems are fair is only part of the task, said Maria De-Arteaga, a researcher at Carnegie Mellon University who specializes in algorithmic systems. As facial recognition becomes more powerful, she said, companies and governments must be careful about when, where, and how it is deployed.
“We have to think about whether we really want these technologies in our society,” she said.