The oft-touted power of big data is a double-edged sword, with the potential to both uphold and dismantle entrenched systemic inequalities, according to panelists at yesterday’s “Algorithmic Justice: Race, Bias and Big Data” event.
The University’s Center for the Study of Race and Ethnicity in America and the Data Science Initiative hosted five experts, including journalism and political science professors, two co-founders of Data for Black Lives and a co-founder of Mapping Police Violence and Campaign Zero. The speakers addressed how data, algorithms and machine learning can perpetuate existing social inequalities, as well as how data can be used to advance social justice.
The event was introduced by Tricia Rose, director of CSREA, and Bjorn Sandstede, professor of applied mathematics and director of the DSI. Sandstede introduced the panelists and stressed the importance of looking beyond the technical impacts of data science to its societal impacts.
Data challenges structural inequalities
Panelist Yeshimabeit Milner ’12 is founder and executive director of Data for Black Lives, an organization that harnesses the power of data to effect change in areas like mass incarceration that disproportionately affect black people. She detailed how she used data to show the impact of the school-to-prison pipeline and the racial disparities it perpetuated.
“I was sold on the power of data and its possibilities,” Milner said, adding that she was motivated by people in her own community to study issues that affect their lives. Personal stories and lived experiences are “one of the most potent forms of data in our arsenal.”
After studying ways to address societal inequalities through data science, Milner went on to found Data for Black Lives, aiming to connect scientists with activists to build a movement that can leverage data’s power for social change. The organization has now built a network of over 4,000 activists and scientists.
“We are calling for a new era: An era when data will be recognized as a tool for profound social change,” she said. “We believe that this is one of the most important civil rights issues of our time.”
This message was echoed by Milner’s colleague Max Clermont ’11 MPH ’12, co-founder and head of policy of Data for Black Lives. Clermont stressed the importance of finding those absent from conversations about the effects of big data, urging audience members to think less conventionally about what makes a person an expert.
“There’s potential and opportunity for data to be used for good if it’s in the hands of the right people,” Clermont said.
According to Milner, Data for Black Lives aims “to create a new narrative, to transform the role that data plays in public life on a local and national level.”
“One of the problems with these systems is that they only understand bias as individual irrational thinking. … But of course, bias is systemic and structural,” said Virginia Eubanks, an associate professor of political science at the University at Albany, SUNY, and panelist at the event.
Technology can reinforce existing biases and inequalities
Panelist Meredith Broussard, assistant professor of journalism at New York University, spoke about technochauvinism, which she described as “the perspective that technology solutions are better than human solutions, and in fact that technology is better than people.”
Broussard explained that there are some situations in which mathematical algorithms can yield a solution, and others that call for human judgment. Developing technological solutions requires a diverse team, she said, adding that “we all have unconscious biases, and that’s what we embed in the technology that we create. … When we have diverse groups of people creating technology, we can create technology that is better for a greater number of people.”
Eubanks also spoke about her book, “Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor.” She shared an example from the book that critiques a tool that screens families in Allegheny County to identify children who are at elevated risk for neglect and abuse.
According to Eubanks, this tool introduced bias into decision-making concerning which families were investigated, because it only had information about families who had accessed public resources — often, families from low-income communities. Eubanks explained that the screening tool moved the discretion in making decisions from frontline human workers to machines that relied on a limited and biased dataset. She said that this is a form of “bias laundering” and poverty profiling.
Data can effect powerful social change
Despite the dangers of data being used to reinforce systemic injustice, Samuel Sinyangwe, co-founder of Mapping Police Violence and Campaign Zero, spoke about data’s power to advance racial justice.
After the 2014 police killing of Michael Brown, there was no data available on the number of black people killed by police in the United States, Sinyangwe said. Without data, concerns about racial bias in police brutality were dismissed by researchers, academics and police chiefs.
“They were dismissed because they did not have the data to validate their own lives,” Sinyangwe said.
Sinyangwe and his colleagues built a comprehensive database mapping police violence in the United States to address this deficit. They collected information from crowdsourced databases, media reports and public records requests, and discovered that police killed over 300 black people in 2014. They also learned that black individuals are three times more likely than their white counterparts to be killed by police and that black victims are more likely to be unarmed when killed by police.
“This data would not be possible to collect if we relied on (government) institutions to collect it for us,” Sinyangwe said.
Technology can be used to scale up activism, and data can debunk myths and bolster policy proposals, he said.
“Data can help us understand the scale of an issue and how it impacts communities,” he said.
Data is only powerful when communicated effectively to a broad audience, Sinyangwe said. In an interview after the panel, he explained that this can be very difficult in a world where people consume news and information “while they scroll up their timeline, and they only have three seconds or four seconds to actually intake that information before they scroll past it.”
Sinyangwe added that getting across the bottom line in fewer than 140 characters or through a single data visualization is “what’s needed to actually reach a broader number of people.”
Data’s complex ability to uphold and dismantle systemic injustice
A large audience attended yesterday’s panel discussion, including many students from the University and the University of Rhode Island.
Dominique Engome, a PhD student at URI, said that her biggest takeaway from the event was the importance of looking beyond fine-tuning an algorithm and focusing instead on the larger structural problems that lie at the root of social issues.
The panel made her consider “at what point do you … bring other people to the table and … stop ignoring that big underlying problem?” she said.
But the panelists did stress that data still has an important role to play in social change.
“In the age of big data, unless we are aware of this history, unless we know this history, we risk repeating it,” Milner said.
“We hope that people came away with a clear sense of how powerfully influential algorithmic bias is, especially for vulnerable populations,” Rose wrote in an email to The Herald.
Broussard closed her portion of the panel discussion with an expression of hope for the future. “We have far to go, but we are moving along the path, I hope,” she said. “Because if we build AI systems that are based on the data in the world right now as it is, we’re never going to get to the world as it should be. And I want to see the world as it should be.”