Skynet, But For Welfare: Automating Social Services Is Killing People
from the our-hands-are-clean-said-the-gov't-since-it's-just-robots-doing-the-work dept
We've talked before about the over-reliance on tech to do jobs that can't be reduced to the sum of their mathematical parts. The criminal justice system is starting to turn sentencing over to algorithms -- something that seems smart on its face but removes judicial and prosecutorial discretion from the mix, leaving defendants with the unpalatable option of challenging software they're never allowed to examine.
Police departments are also moving towards predictive policing. Relying on historical data, cops hope to head off future crimes by allocating resources to areas where crime appears most likely to occur. It sounds good on paper, but in reality, all it does is reinforce biases and push law enforcement to treat everyone in targeted areas as criminals. If the data being fed in reflects biased policing, crunching the numbers even harder isn't going to erase that bias. It's only going to reinforce it. And, again, suspected criminals aren't able to access the data or software that puts them in law enforcement's crosshairs.
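To make that feedback loop concrete, here's a minimal toy simulation -- not any real department's or vendor's system; the neighborhood names, patrol counts, and rates are all invented for illustration -- showing how skewed historical records can steer patrols, and how those patrols then generate even more skewed records:

```python
# Toy sketch of the predictive-policing feedback loop. Two neighborhoods have
# identical *true* crime rates, but "south" starts with more *recorded* crime
# because it was policed more heavily in the past. Each period, patrols go to
# whichever area the records flag as the hotspot, and only crime that patrols
# are present to see gets recorded. All numbers are invented.

TRUE_INCIDENTS = {"north": 100, "south": 100}  # actual incidents per period (equal)
recorded = {"north": 30, "south": 70}          # skewed historical records
TOTAL_PATROLS = 50
DETECTION_PER_PATROL = 0.01                    # fraction of incidents seen per patrol

for period in range(1, 6):
    # "Predictive" step: send the patrols to the area with more recorded crime.
    hotspot = max(recorded, key=recorded.get)
    patrols = {area: TOTAL_PATROLS if area == hotspot else 0 for area in recorded}

    # Recording step: only the crime someone was there to see gets logged.
    for area in recorded:
        seen = TRUE_INCIDENTS[area] * min(1.0, DETECTION_PER_PATROL * patrols[area])
        recorded[area] += seen

    share = recorded["south"] / sum(recorded.values())
    print(f"period {period}: south's share of recorded crime = {share:.2f}")

# South's share of recorded crime keeps climbing toward 1.0 even though both
# areas have exactly the same underlying crime rate: the skewed records send
# the patrols, and the patrols produce the records.
```

Run it and the "hotspot" gets hotter every period, not because crime moved, but because the data did.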
A certain amount of automation is expected as government agencies seek to streamline public services. The problem isn't necessarily the tech. It's the removal of human interaction. As has been stated here frequently, moderation at scale is impossible. So is automated governing. Automated processes are as prone to failure as the people overseeing them. But when you decide software is going to do almost all of the work, the people who most need help from another human being are cut out of the loop.
Citizens looking for government assistance have grown accustomed to jumping through red tape hoops. Now, the hoops are inaccessible, but still must be jumped through. The most marginalized members of society are given URLs instead of contact names and numbers when many of them have no reliable access to the internet or a computer. A new series by The Guardian shows the human cost of going paperless. It's happening all over the world, and it's literally killing people.
The most disturbing story comes from Dumka in India. Here, we learn of the horrifying human impact that has befallen families as a result of Aadhaar, a 12-digit unique identification number that the Indian government has issued to all residents in the world’s largest biometric experiment.
Motka Manjhi paid the ultimate price when the computer glitched and his thumbprint – his key into Aadhaar – went unrecognised. His subsistence rations were stopped, he was forced to skip meals and he grew thin. On 22 May, he collapsed outside his home and died. His family is convinced it was starvation.
That's the worst-case scenario. But there's plenty of ugliness in between. Governments aren't looking to defense budgets or law enforcement agencies to make cuts. Instead, they're padding their bottom lines by pursuing citizens they believe have screwed the government. Another process being automated is the collection of alleged overpayments of social services funds. There's a statute of limitations on most debt, but software is being used to resurrect ancient debts governments feel they're owed. The result is the destruction of people's lives and finances as they find themselves unable to challenge these automated determinations.
In Illinois, the Guardian has found that state and federal governments have joined forces to demand that welfare recipients repay “overpayments” stretching back in some cases 30 years. This system of “zombie debt”, weaponized through technology, is invoking fear and hardship among society’s most vulnerable.
As one recipient described it: “You owe what you have eaten.”
It's not just "zombie debt." It's also "robodebt." The determinations of owed debt are made by automated processes. The collection process is also automated, separating those suddenly facing possibly undeserved clawbacks from the human assistance they need to determine whether or not the claim is valid. Bureaucracies have always been faceless. With the addition of cold calculations, they've weaponized this facelessness to deter citizens from pushing back against a decision-making process composed of 1s and 0s. This, too, is linked to a rising human cost.
Compare and contrast the public statements with reality. Here's how the UK government is pitching its transition to automated governance:
“We are striking the right balance between having a compassionate safety net on which we spend £95bn, and creating a digital service that suits the way most people use technology,” said a DWP [Department for Work and Pensions] spokesperson. “Automation means we are improving accuracy, speeding up our service and freeing up colleagues’ time so they can support the people who need it most.”
If this sounds like a positive development, it's only because you haven't seen it in action.
[C]laimants have warned the existing automation in UC’s “digital by default” system has already driven some to hunger, breakdown and even attempted suicide. One described the online process as a “Kafka-like carousel”, another as “hostile” and yet another as a “form of torture”. Several said civil servants already appeared to be ruled by computer algorithms, unable to contradict their verdicts.
The same thing is going on in Australia. The government's embrace of "austerity," combined with its fondness for automating social services and debt collection processes, has resulted in a world of hurt for the many Australians unable to challenge automated determinations or connect with actual humans willing to help them through the process. Asher Wolf's Twitter feed is an invaluable resource on these issues, detailing the fallout of automated social assistance programs all around the world.
It's not that all automation is bad. It's that automation without strict oversight or human control isn't making anything better for the most vulnerable members of society. When governments pay millions to let machines make decisions affecting humans, humans almost always seem to come out on the losing end. Legislators may proudly display charts showing incremental gains in efficiency, but few are willing to discuss the constituents they've sacrificed on the altar of automation. No system is perfect, but one that relies more on math than human discretion isn't an improvement.
Filed Under: aadhaar, algorithms, automation, criminal justice, india, law enforcement, police, social services