
With a blind police incident report displayed, San Francisco District Attorney George Gascon, left, and Alex Chohlas-Wood, Deputy Director, Stanford Computational Policy Lab, talk about the implementation of an artificial intelligence tool to remove potential for bias in charging decisions, during a news conference Wednesday, June 12, 2019, in San Francisco. In a first-of-its-kind experiment, San Francisco prosecutors are turning to artificial intelligence to reduce bias in the criminal courts. (AP Photo/Eric Risberg)

SAN FRANCISCO — In a first, San Francisco prosecutors are turning to artificial intelligence to reduce racial bias in the courts, adopting a system that strips certain identifying details from police reports and leaves only key facts to govern charging decisions.

District Attorney George Gascon announced Wednesday that his office will begin using the technology in July to “take race out of the equation” when deciding whether to accuse suspects of a crime.

Criminal-justice experts say they’ve never heard of any project like it and applauded the idea as a bold effort to make charging practices more colorblind.

Gascon’s office worked with data scientists and engineers at the Stanford Computational Policy Lab to develop a system that takes electronic police reports and automatically removes a suspect’s name, race and hair and eye colors. The names of witnesses and police officers will also be removed, along with specific neighborhoods or districts that could indicate the race of those involved.
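The article does not describe how the Stanford tool works internally, but the categories it strips suggest a rules-based redaction pass over the report text. The sketch below is a minimal illustration of that idea in Python; the field lists, placeholder tokens, and function names are assumptions for illustration, not the lab's actual implementation.

```python
import re

# Hypothetical descriptor lists; the real tool's vocabulary is not described
# in the article beyond the categories it redacts.
RACE_TERMS = ["white", "black", "hispanic", "asian", "african-american"]
HAIR_EYE_TERMS = ["blond hair", "brown hair", "black hair", "blue eyes", "brown eyes"]

def redact_report(text, names, neighborhoods):
    """Return a copy of `text` with names, race descriptors, hair/eye colors,
    and neighborhood references replaced by neutral placeholders."""
    redacted = text
    for name in names:  # suspect, witness and officer names
        redacted = re.sub(re.escape(name), "[PERSON]", redacted, flags=re.IGNORECASE)
    for place in neighborhoods:  # districts that could indicate race
        redacted = re.sub(re.escape(place), "[LOCATION]", redacted, flags=re.IGNORECASE)
    for term in RACE_TERMS + HAIR_EYE_TERMS:  # explicit physical descriptors
        redacted = re.sub(r"\b" + re.escape(term) + r"\b", "[REDACTED]",
                          redacted, flags=re.IGNORECASE)
    return redacted

# Example with made-up report text
report = ("Officer Lee detained John Doe, a Black male with brown eyes, "
          "near the Bayview district.")
print(redact_report(report, names=["Officer Lee", "John Doe"],
                    neighborhoods=["Bayview"]))
```

On this made-up input, the sketch would print a report reading "[PERSON] detained [PERSON], a [REDACTED] male with [REDACTED], near the [LOCATION] district," leaving the factual narrative intact while removing the details the article says the system hides from prosecutors.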

“The criminal-justice system has had a horrible impact on people of color in this country, especially African-Americans, for generations,” Gascon said. “If all prosecutors took race out of the picture when making charging decisions, we’d probably be in a better place as a nation.”

Gascon said his goal was to develop a model that could be used elsewhere, and the technology will be offered free to other prosecutors across the country.

“I commend them,” said Lucy Lang, a former New York City prosecutor and executive director of the Institute for Innovation in Prosecution at John Jay College of Criminal Justice. “Any efforts to minimize disparate outcomes are laudable.”

The technology relies on humans to collect the initial facts, which can still be influenced by racial bias. Prosecutors will make an initial charging decision based on the redacted police report. Then they will look at the entire report, with details restored, to see if there are any extenuating reasons to reconsider the first decision, Gascon said.

Lang and other experts said they look forward to seeing the results and that they expect the system to be a work in progress.

“Hats off for trying new stuff,” said Phillip Atiba Goff, president of the Center for Policing Equity. “There are so many contextual factors that might indicate race and ethnicity that it’s hard to imagine how even a human could take that all out.”

Studies have long shown that bias exists nationwide at all levels of the criminal justice system, from police making arrests and prosecutors deciding whether to charge suspects to court convictions and sentencing.

A 2017 study commissioned by the San Francisco DA found “substantial racial and ethnic disparities in criminal justice outcomes.” African Americans represented only 6% of the county’s population but accounted for 41% of arrests between 2008 and 2014.
