
Washington, DC is home to the most powerful government on earth. It’s also home to 690,000 people — and 29 obscure algorithms that affect their lives. City agencies use automation to screen housing applicants, predict recidivism, identify food aid fraud, determine whether high school students are at risk of dropping out, inform sentencing decisions for young people, and more.
This snapshot of semi-automated urban life comes from a new report by the Electronic Privacy Information Center (EPIC). The nonprofit spent 14 months investigating the city’s use of algorithms and found they are deployed by 20 agencies, with more than a third in police or criminal justice departments. For many systems, city agencies would not provide full details of how their technology works or is used, and the project team concluded that the city is likely using still more algorithms they could not detect.
The findings are noteworthy beyond DC, adding to evidence that many cities have quietly embedded bureaucratic algorithms across their departments, where they contribute to decisions that affect citizens’ lives.
Government agencies often turn to automation in hopes of making bureaucratic processes more efficient or objective, but citizens often have a hard time knowing these systems are at work, and some have proved discriminatory, leading to decisions that upend people’s lives. In Michigan, an unemployment fraud detection algorithm with a 93 percent error rate generated 40,000 false fraud accusations. A 2020 analysis by Stanford and New York University found that nearly half of federal agencies use some form of automated decision-making system.
EPIC dug into one city’s use of algorithms to understand the many ways they can affect citizens’ lives, and to encourage people elsewhere to undertake similar exercises. Ben Winters, who leads the nonprofit’s work on artificial intelligence and human rights, said Washington was chosen in part because roughly half of the city’s residents identify as Black.
“Typically, automated decision-making systems have a disproportionate impact on Black communities,” Winters said. The project found evidence that automated traffic enforcement cameras are disproportionately placed in neighborhoods with more Black residents.
Cities with large Black populations have recently played a central role in the movement against municipal algorithms, especially in policing. Detroit became the center of a debate over facial recognition after the false arrests of Robert Williams and Michael Oliver in 2019, when algorithms misidentified them. In Baltimore, the use of facial recognition technology following the 2015 death of Freddie Gray in police custody sparked the first congressional investigation into law enforcement’s use of the technology.
EPIC searched for algorithms by combing through public disclosures from city agencies and by submitting public records requests for contracts, data-sharing agreements, privacy impact assessments, and other information. Six of 12 city agencies responded, sharing documents such as a $295,000 contract with Thomson Reuters-owned Pondera Systems, maker of fraud detection software called FraudCaster that is used to screen food aid applicants. Earlier this year, California officials found that more than half of the 1.1 million claims by the state’s residents that Pondera’s software had flagged as suspicious were in fact legitimate.