Science & Research

NYU professor questions big data collection

Nissenbaum’s lecture argues that private commercial motives harm public interest, liberty

By Senior Staff Writer
Wednesday, March 16, 2016

The phrase “big data” — and the vast, daunting numbers it implies — is thrown around a lot these days. To illustrate the concept, Helen Nissenbaum, professor of media, culture and communication and computer science at New York University, turned to something much more familiar: the emoji.

In a lecture titled “Must Privacy Give Way to Use Regulation?” Nissenbaum discussed the incongruity between consumers’ expectations and companies’ practices when it comes to data collection. To illustrate cases where business interests align with the public’s interests, Nissenbaum used two smiley emojis. Where consumers saw no benefit, she selected a neutral face; where data use was contrary to the public interest, she used a frown.

“We are given the story that we should not resist (big data) collection because we benefit so much from data machinations,” Nissenbaum said. “But the reality is that business imperatives often don’t match the public interest.”

This reality becomes problematic in light of what Nissenbaum termed “big data supremacy,” the notion that data collection — internet companies recording clicks, hospitals testing medical samples and credit card companies tracking purchases — simply cannot be stopped.

According to the “big data supremacy” argument, because data collection is hard to regulate, only data use should be subject to regulation. But if entities are permitted to collect as much data as they wish, the data will not always be used in ways that benefit the public, she said.

To prove this point, she offered Google’s Street View image capture as an example. “In 2010, it was discovered that Google Street View vans were not only photographing exteriors, but were also picking up all unencrypted Wi-Fi signals,” she said. If one accepts big data supremacy, then Google’s data collection is not an issue, which is “why this thesis is something we need to worry and care about.”

Regulators have yet to find success controlling companies’ use of data, Nissenbaum added. “The European Union, which is incredibly motivated to curtail some of these activities, does not seem to have had a big impact,” she said. “It seems like Facebook and Google just allow it to win a few times to make it feel okay.”

Nissenbaum believes that too often, regulators concede that Facebook and Google simply cannot be stopped. “I would like to push back on ‘cannot,’” she said. “Difficult does not mean impossible. Ending the narcotics trade and financial fraud is difficult, but it doesn’t mean we throw up our hands and say we can’t do it.”

And even if companies or governments never use data to exploit the public, simply possessing data confers an unethical advantage, she said.

“The problem at hand is one of domination,” Nissenbaum said. “Not the fact that a government is curtailing your freedom but that the government has the power to curtail your freedom. Simply having the data gives actors the power, even if they don’t exercise it.”

Nissenbaum, who received her PhD in philosophy from Stanford University, believes basic notions of liberty are at stake. “To be free, we need to free ourselves of others having the power to act in relation to us in an arbitrary way,” she said.

She also recognized that conversations held in academia need to be transferred to technical and regulatory settings to have the greatest impact. “As a philosopher, you can wave your hands and have big ideas,” she told The Herald. “But you have to deal with the contingencies of material reality. I like doing that. I like being hands-on and practical when you have great partners to work with.”

Lucy Van Kleunen ’17 found the private-public dynamic especially important. “I study computer science and public policy so I really like to look at technology from a social science perspective,” she said. “It’s important to think about ways that technology has both become a tool and also has become something that might be dictating our lives.”

Nissenbaum had answers for most questions — offering biblical examples of privacy to counter the notion that privacy is a new phenomenon, for example — but not all, recognizing that philosophical and legal frameworks have not completely kept pace with technology. When asked what can be classified as “big data,” Nissenbaum laughed. “We’re going to be here until seven (o’clock) because my head is breaking thinking about when data comes into existence.”