
Virtual panel discusses technology in policing

Panelists say technology can reinforce inequality or serve as a tool for accountability

“When we analyze technologies of any sort, but especially those of policing, we should really ask whose imagination and utopia is being manifested,” said Cierra Robson, associate director of the Ida B. Wells JUST Data Lab and doctoral student at Harvard, during a panel titled “Policing and Technology” Wednesday. “Someone’s utopia is likely someone else’s dystopia,” she said.

The Center for the Study of Race and Ethnicity in America hosted the webinar in collaboration with the Department of Computer Science as part of the Technology and Structural Inequality Series, a collection of discussions around the intersection of technology and inequity. The panel, which was moderated by Associate Professor of Sociology Nicole Gonzalez Van Cleve, discussed how technology has exacerbated racial disparities in policing but can also be used to monitor police conduct and encourage accountability.

Cynthia Khoo, a technology and human rights lawyer and research fellow at the Citizen Lab at the University of Toronto, explained her research on technologies used by Canadian police, such as algorithmic surveillance and predictive policing. Algorithmic surveillance includes facial recognition, social media monitoring and license plate readers, while predictive policing uses historical data to forecast where crimes are statistically likely to take place.
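For readers unfamiliar with the technique, place-based predictive policing at its simplest amounts to scoring locations by their history of reported incidents and directing patrols toward the highest-scoring ones. Below is a minimal sketch of that counting logic; the incident data and cell names are entirely hypothetical and do not represent any specific system the panel discussed:

```python
from collections import Counter

# Hypothetical incident records: (map grid cell, offense) pairs of the
# kind a place-based hotspot model would be trained on.
incidents = [
    ("cell_12", "theft"), ("cell_12", "assault"), ("cell_07", "theft"),
    ("cell_12", "theft"), ("cell_33", "vandalism"), ("cell_07", "theft"),
]

# A minimal "hotspot" score: count past reported incidents per cell and
# flag the busiest cells for additional patrol attention.
counts = Counter(cell for cell, _offense in incidents)
hotspots = [cell for cell, _n in counts.most_common(3)]
print(hotspots)  # ['cell_12', 'cell_07', 'cell_33']
```

Real systems layer time decay, offense weighting and spatial smoothing on top, but the core input is the same: records of where police have previously reported crime.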

“The police have unprecedented access to volumes of data, whether it’s through mass surveillance or through connections with non-law enforcement government agencies or private-sector companies,” Khoo said.

Khoo explained that because algorithms are created by people, they can reinforce systemic biases despite an appearance of objectivity. 
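Her point can be made concrete with a toy feedback loop: a model trained on recorded incidents inherits past enforcement patterns, because places that are patrolled more heavily generate more records. In the sketch below (all numbers hypothetical), two neighborhoods have identical underlying crime rates, yet the initial patrol disparity never corrects itself:

```python
# Toy feedback loop: two areas with the same true crime rate, but area A
# starts with ten times the patrols. Recorded incidents scale with patrol
# presence, and each round's patrols are reallocated to follow the records.
patrols = {"A": 10.0, "B": 1.0}  # initial allocation (hypothetical)
TRUE_RATE = 0.5                  # identical underlying rate in both areas

for _round in range(5):
    recorded = {area: p * TRUE_RATE for area, p in patrols.items()}
    total = sum(recorded.values())
    patrols = {area: 11.0 * r / total for area, r in recorded.items()}

print(patrols)  # {'A': 10.0, 'B': 1.0} -- the disparity persists
```

The model never sees the true rates, only the records, so the lopsided allocation looks objectively justified at every step.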

“Social media surveillance has been used to target social movements, such as Indigenous rights movements and Black Lives Matter,” Khoo said.

During her presentation, Robson discussed one notable instance of police trying to implement new technologies. In 2013, Oakland, California, residents protested against the proposed Domain Awareness Center, a planned surveillance hub for police that would have significantly ramped up video surveillance in public spaces throughout the city.

She described how the DAC plan reflected Oakland’s history of redlining and segregation. “The (map of) DAC surveillance cameras (is) an exact replica of the redlining map of the 1930s, meaning that those who are being watched are mostly Black and brown residents of Oakland,” Robson said.

Samuel Sinyangwe, panelist and co-founder of Campaign Zero, said that technology can also be used to work against these systemic inequalities. For example, Campaign Zero is lobbying for police reform by “using technology to actually collect data on the police that the police weren’t willing to report themselves,” he said.

Sinyangwe led research for Mapping Police Violence, which visualizes the frequency and location of police misconduct in the U.S., and the Police Scorecard, which compiles data to evaluate police departments.

He said that though “predictive policing is incredibly problematic,” there is a data science field “running in parallel, which suggests that we could actually flip predictive policing on the police (to) use it to better remove officers that are at the highest risk of using force in the future.”
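As a rough illustration of what that flipped approach could look like, the sketch below ranks officers by a weighted score over their own records. The fields, weights and names here are hypothetical; real early-intervention systems use far richer features, but the ranking idea is the same:

```python
from dataclasses import dataclass

@dataclass
class Officer:
    name: str
    complaints: int     # complaints on record (hypothetical field)
    force_reports: int  # prior use-of-force reports (hypothetical field)

def risk_score(officer: Officer) -> float:
    # Weight use-of-force history more heavily than complaints alone;
    # the 1.0 and 2.0 weights are illustrative, not calibrated.
    return 1.0 * officer.complaints + 2.0 * officer.force_reports

roster = [Officer("A", 1, 0), Officer("B", 6, 4), Officer("C", 2, 1)]
flagged = sorted(roster, key=risk_score, reverse=True)[:1]
print([o.name for o in flagged])  # ['B'] -- highest-risk officer for review
```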

Sinyangwe pointed out that Campaign Zero is not the only organization using data to reform the police: The Citizens Police Data Project uses data compiled and visualized by the Invisible Institute and the University of Chicago to map incidents of police violence in Chicago and identify the specific officers who have received the most complaints. He explained that the research showed that officers who worked closely with colleagues who had high rates of misconduct began to exhibit similar tendencies.

“Essentially, what that means is that we can start to contact trace police violence like you would contact trace coronavirus,” Sinyangwe said.
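The analogy maps cleanly onto a network: officers are nodes, shared assignments are edges, and "exposure" is working alongside an officer with a heavy misconduct record. A minimal sketch with a hypothetical co-assignment graph:

```python
from collections import defaultdict

# Hypothetical co-assignment graph: which officers have worked together.
partners = defaultdict(set)
for a, b in [("A", "B"), ("B", "C"), ("C", "D")]:
    partners[a].add(b)
    partners[b].add(a)

high_misconduct = {"B"}  # officers with many complaints (hypothetical)

# "Contact tracing": flag anyone who worked directly alongside a
# high-misconduct officer, since the Chicago research found those
# officers tend to develop similar complaint patterns.
exposed = set()
for officer in high_misconduct:
    exposed |= partners[officer]
exposed -= high_misconduct
print(exposed)  # {'A', 'C'} (set order may vary)
```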

In response, an attendee asked whether using algorithms and predictive technology to reform policing validates a method that has been shown to be biased when used in other contexts.

“This (debate) often gets theoretical and philosophical, and I am very practical,” Sinyangwe responded. “If we can use this (technology) to hold the officers accountable, I think we should.”

Khoo said that turning predictive technologies back on law enforcement was ethically different because “you cannot ignore the fact that the power dynamics only go in one direction.”

She added that at times, focusing on specific policing technologies can be “a red herring” that distracts from systemic problems in policing. Khoo said that though current discussions of police technology are focused on facial recognition and predictive policing, “20 years from now, there will be another technology and we’ll all be talking about that, but the underlying issues will still be exactly the same.”

“Even though we’re talking about the technology, it’s not about the technology,” Khoo said. “It is about why there is so much policing in the first place.”


Katy Pickens

Katy Pickens is the managing editor of newsroom and vice president of The Brown Daily Herald's 133rd Editorial Board. She previously served as a Metro section editor covering College Hill, Fox Point and the Jewelry District, housing & campus footprint and activism, all while maintaining a passion for knitting tiny hats.


