How do social aid programs decide who gets their resources? The decision is difficult, and front-line social workers have been tasked with this difficulty for decades. But with increasing frequency, these decisions are being handled by computers.
As we move further into the information age, the fear grows that our data will be used against us. Virginia Eubanks’ research seems to indicate that for the working class, this is already the case. She works with Our Data Bodies, an academic research project that lets people in various communities speak about their data experiences. In her new book “Automating Inequality,” Eubanks unpacks eight years of research on the ways that algorithmic models are changing the landscape of poverty across America. During those eight years, she spoke to front-line aid workers, caseworkers, and most importantly, the recipients of government aid.
Eubanks gave a talk on her book Thursday night at Scuppernong Books in downtown Greensboro, and discussed the failings of computer models as aid-giving tools. Much cutting-edge technology, she claimed, is used to monitor the poor long before it reaches the middle class. During her talk, she gave several notable examples and quoted people she interviewed who had relied on the system for aid.
“The folks I talked to to write this book really took huge risks telling their stories,” she said. Having the government track and scrutinize one’s every move may still seem like science fiction, but in her research, Eubanks found that it is the reality of life for many Americans.
People within the system risk losing food, housing and the custody of their children for all sorts of reasons, and the increased technological surveillance has exacerbated their problems. “You feel like a prisoner,” said one interviewee in a segment Eubanks read aloud from her book. “You feel trapped.”
She gave the example of EBT cards. When they were implemented around the turn of the century, they took away the shame of having to use food stamps in a grocery store. But Eubanks was surprised to find that their digital records also allowed caseworkers to scrutinize their clients’ spending habits. “We made this decision as a country to use public service programs as ‘moral thermometers,’” she said Thursday. And technology, she argued, is only worsening the problem.
“These systems don’t actually remove discretion,” Eubanks said, referring to the many prejudices ingrained in the system. These prejudices can act as pitfalls for those seeking aid. “They move it.” In other words, decisions being made by caseworkers are now made by programmers who have never been face-to-face with poverty.
Eubanks also addressed the underlying facts of America’s poverty situation: in Los Angeles, for example, there is fundamentally not enough housing. Authorities are then left to employ algorithms as “regrettable but necessary mechanisms for performing triage,” said Eubanks. “We are, rather than really struggling with the problems at the root… moving these decisions to machines.”
Eubanks believes that these findings have surprising relevance for the broader American political landscape. She spoke at length about Indiana, where politician Mitch Daniels—a mentor to current Vice President Mike Pence—spearheaded an enormous effort to automate the state’s welfare programs. “The Indiana case,” she said, “is the most blatant attempt to block people from getting the resources they are entitled to.” The language used in that program, she noted, “showed up almost word for word in the current administration’s budget proposal.” She added that the Indiana effort was not successful, and ultimately ended in costly failure.
Eubanks raised the possibility that the language used to obstruct people from welfare will soon be turned against the middle class nationwide, cutting into Social Security and Medicare. “It’s not about tweaking the tools,” she said. “These are some deep, underlying issues we really have to come to grips with.” She stressed that biases and prejudices are more than just individual opinions. “They view biases as the individual decision making of the people, not as systemic,” she said. “A neutral system will produce inequalities.”
Copies of “Automating Inequality” are available for purchase at Scuppernong Books in Greensboro.