In a little more than a year, the internet will face a major wave of regulation aimed largely at protecting data and fighting misinformation.
With the Jan. 1, 2024, implementation of the European Union’s Digital Services Act and the United Kingdom advancing its Online Safety Bill, social networks and search engines will be subject to new requirements.
The largest internet companies will face regulations requiring them to publish annual transparency reports explaining how they recommend content to users, allow researchers to access their data to better understand risks and implement measures to counter illegal content.
Still, regulation alone cannot tackle misinformation and privacy problems, said panelists at a Thursday afternoon event in the Watson Institute for International and Public Affairs hosted by the School of Public Health’s Information Futures Lab. Regulation represents only one piece of the puzzle while adding its own challenges and unintended consequences, said Claire Wardle, the panel’s moderator, a professor of practice at the School of Public Health and co-director of the Information Futures Lab.
“People tell you, ‘we just need more regulation,’” Wardle said. “My biggest concern is that we regulate now with a knee-jerk reaction, and historians look back and say, ‘that was a stupid move — you broke the internet because you were so scared.’”
The United States likely will not implement its own regulations on misinformation, said panelist Suresh Venkatasubramanian, professor of computer science and data science. The European Union, he said, has “more of a willingness to do these things top-down,” but similar conversations are “very hard to start in the U.S.” because of its commitment to free speech.
“There’s limited hope on this side of the Atlantic,” said panelist Rebekah Tromble, director of The George Washington University’s Institute for Data, Democracy & Politics.
So for the foreseeable future, Americans will rely on the impacts of soon-to-be-implemented European regulations. Those impacts will prove significant, said panelist Mark Scott, a visiting fellow at the Information Futures Lab and chief technology correspondent at POLITICO, noting that platforms will likely not make different products for different countries.
“You can’t cordon off the internet,” Tromble said.
New laws abroad do not mean creating a “ministry of truth,” transforming government into the sole arbiter of what information the public can access, Scott said.
“Nobody wants free speech to die,” Scott said. “People should have free speech, but there are limits to a degree. You can’t just go around saying things that can cause harm — or, you can, but there are consequences.”
Beneficial internet regulation should not seek to eliminate misinformation, said panelist Anna-Sophie Harling, an online safety principal at Ofcom, the U.K.’s communications regulator. Instead, regulation should prevent the amplification of misinformation, she said.
Government should not target “content and spaces dedicated to misinformation,” Harling said. But it should make sure that the “average mom of four in Arizona who’s going on Facebook” does not get anti-vaccine groups recommended, she said.
Regulation of data brokers — who collect data from different apps and sources, often selling it to law enforcement — would represent a step forward for protecting privacy, Tromble said. But all regulation raises a host of ethical and logistical problems, she added.
While sharing data with researchers can improve risk assessments, it also creates a privacy problem, Tromble added. “Data always ties back to individual users,” she said. “There is a fundamental tradeoff between transparency on one hand and privacy on the other.”
And introducing new metrics for accountability can foster unintended consequences, Harling said. If Ofcom asks social networks to report how many pieces of misinformation they remove, those networks are incentivized to remove more content, both true and false. And if the law calls for timely removal of illegal content, social networks might rush and fail to properly identify that content, she said.
Regulation also moves slowly, while internet products move fast, Harling said.
“If regulators finally get their act together, platforms have to publish metrics about the news feed,” she said. “That news feed might not exist anymore.” And newer apps, like TikTok, can present a challenge for regulators who are unfamiliar with them.
“The difference between regulating TikTok and Snapchat is like the difference between regulating a bicycle and a Hummer,” she said. “The point is that we can go (to developers) and ask intelligent questions, not just take for granted what they say is in place.”
Agencies charged with keeping big tech in check also tend to lack the size and resources to keep up, Tromble said.
And some misinformation targets small groups, Venkatasubramanian said, noting that “if you want to change something in a Congressional district, you don’t need to reach many people to do it.”
Regardless of new laws, the panelists agreed that governments and the public alike have become more aware of privacy, misinformation and the way platforms show content to their users.
“Gen Z’s obsession with the algorithm on TikTok has helped really increase basic understanding of what’s happening on platforms, and how data is being collected and processed,” Tromble said. “And it’s generating a great deal of concern.”
Will Kubzansky is the 133rd editor-in-chief and president of the Brown Daily Herald. Previously, he served as a University News editor overseeing the admission & financial aid and staff & student labor beats. In his free time, he plays the guitar and soccer — both poorly.