Transparency Is Key to Curbing the Power of Big Tech

As regulators, activists and researchers wrestle with the issues raised by the social media giants, visibility is a central challenge: only people inside the companies have access to good information.

August 2, 2021
REUTERS/Dado Ruvic

On Tuesday, July 27, Google’s parent company Alphabet reported revenue of $61.9 billion for the quarter and profits of $18.5 billion. The next day, Facebook reported revenue of nearly $29.1 billion and profits of $10.4 billion.

Also on Tuesday, the World Health Organization (WHO) announced 3.8 million new COVID-19 cases, an eight percent increase over the previous week.

The global pandemic that has killed 4.2 million people has not stopped Facebook and Google from cementing their online dominance, which may explain why they aren’t doing more to curb the misinformation that is making it harder for public health authorities to stop COVID-19.

WHO Director-General Tedros Adhanom Ghebreyesus put it this way: “We’re not just fighting a pandemic; we’re fighting an infodemic.”

There is reason to believe that the scale of the problem is not knowable outside the executive suites of Alphabet and Facebook, because only the tech titans possess the data that could quantify it.

Regulators and lawmakers do not have the information they need to understand what is happening behind the walled gardens of the platforms. And the social media companies refuse to share that information, apparently because it might hurt their profits.

In 2018, Gary King was in Menlo Park, California, trying to convince Facebook executives to share their data with researchers.

King, the director of Harvard University’s Institute for Quantitative Social Science, has been trying for years to get access to more of the information held by social media companies.

“Although we have more data than ever before, we have a smaller fraction of data in the world about people and groups than ever before, because it used to be that we created all the data, so we had it all, or governments created it, who gave it to us,” he said in an interview. “But nowadays, most of the data about people in groups in society are actually tied up inside private corporations. We have to figure out how to work with them. We have no choice.”

King made his pitch to Facebook executives, had some meetings, made some connections, but didn’t convince anyone to open the company’s data vaults. 

“I didn’t really make any more progress. I went back to my hotel room and I was packing to go home. And I got an email from them that said, ‘Hey, what do we do about this?’ And this was Cambridge Analytica. That was the biggest violation of privacy up until that point. And I thought, OK, this was the worst-timed lobby event in history.”

King thought the trip was a waste, but a few days later he got a phone call from Facebook CEO Mark Zuckerberg, who asked whether he would look into the 2016 US presidential election for Facebook.

Zuckerberg was under pressure at the time. The Cambridge Analytica scandal had knocked $118 billion off Facebook’s market capitalization within days. It provided an opportunity for King.

“I said I would love to do a study of the 2016 election, but I need two things and you’re only going to give me one. He said, ‘What are those two things?’ I said, ‘I need complete access to the data,’ of course…but then I said, ‘I need…freedom of speech. I need to be able to publish without your prior approval.’”

Zuckerberg couldn’t accept that. “He said, ‘Yeah, we can’t give you both of those things.’ I said, ‘Well, OK then. Unfortunately, I’m not going to do the study.’ And he said, ‘Oh no. We still want you to do the study.’”

The two men went back and forth for a while and eventually settled on a complicated arrangement in which some researchers signed non-disclosure agreements and were given access to sensitive data so they could clear data sets for other researchers. The result was Social Science One, a data portal for academic researchers.

“There was just an update a few days ago,” said King. “Actually, it has 40 trillion differentially private numbers, which is the way the Census Bureau implements data.”
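
The “differentially private numbers” King describes are aggregate figures released with carefully calibrated random noise, so that no individual user’s behavior can be inferred from them. A minimal sketch of the underlying idea, the Laplace mechanism, follows; it is illustrative only, not the mechanism Facebook or the Census Bureau actually uses, and the share count and epsilon value are invented.

```python
# Illustrative sketch of the Laplace mechanism, the basic building block of
# differential privacy. Not Facebook's or the Census Bureau's actual code;
# the counts and epsilon below are made up.
import numpy as np

def private_count(true_count: int, epsilon: float) -> float:
    # A counting query changes by at most 1 when any one person's data is
    # added or removed (sensitivity = 1), so Laplace noise with scale
    # 1/epsilon makes this single release epsilon-differentially private.
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Example: release how many users shared a hypothetical URL, with noise added.
true_shares = 12_345  # invented figure for illustration
print(private_count(true_shares, epsilon=0.5))
```

A smaller epsilon means more noise and stronger privacy; releasing trillions of such numbers, as King describes, requires accounting for the total privacy budget spent across all of them.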

The project was supposed to take a few months to organize but ended up taking two years.

“I have battle scars from this fighting with all the different armies inside Facebook,” said King. “It’s just very complicated. They’re a very insular group that focuses on their own issues, which is no surprise.”

He said he thinks the process was harder than it needed to be, adding that Facebook could have been much more helpful.

“Could they go much faster in sharing data with academics? Definitely. [There are] lots and lots of other ways, other data sets, other methods of cooperation with academics that they could adopt that would be productive, that we could learn a lot about misinformation and many other things that would improve public good and wouldn’t hurt the company.”

Former Facebook executive Brian Boland recently made similar comments after leaving the company. Boland, then a vice president, worked on CrowdTangle, a data analytics tool owned by Facebook. He had been pushing for the company to proactively release more information, whether it made Facebook look good or not.

“People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he told The New York Times. “Then, the tone at the executive level changed.”

Boland left Facebook after 11 years, he said, because the company “doesn’t want to make the data available for others to do the hard work and hold them accountable.”

Boland described a struggle within Facebook between people who want to routinely open up its processes and others with a more traditional, defensive corporate view. 

King saw the same thing.

“What basically happened is they came up with these novel businesses, and novel ways of organizing the human population, that has never been done before in the history of the world,” said King. “The odds that they will get it massively wrong sometimes is not remotely surprising. I think they should be open about it and work with everybody. And if everybody knew that there was a mistake, like, why should that be surprising? It would be OK. That they would learn from it and they fix it. They would like to fix it. But their real problem is that they lose money every time they have some big controversy.” 

As regulators, activists and academic researchers around the world wrestle with the issues raised by the social media giants, transparency is a central challenge because only people inside the companies know what is going on.

In previous election campaigns, in the days before social media, campaign advertisements were printed and publicly displayed or broadcast, so they could be debated and fact-checked in the public square. In the 2016 US presidential election, by contrast, Russian government-funded ads went directly to Facebook users — targeting Black Americans — a secret, divisive, foreign-funded distortion inside the world’s most powerful democracy, which Facebook downplayed until forced to reveal the extent of it.

The company ultimately corrected course and implemented measures — including an ad library — intended to prevent similar attacks in the future. But the drama eroded its credibility, making it difficult for journalists, researchers and regulators to take executives at their word. 

When Facebook now says that it is taking action to study anti-vaccination propaganda, should we believe it? It is not clear the company has acted quickly enough to stop viral hoaxes, and there is reason to believe that vaccine resistance is strongest among people who get their news from Facebook. A dozen disinformation superspreaders have a huge reach because of the platforms, feeding the vaccine resistance that is making it impossible to end the pandemic.  

The Biden administration is skeptical of Facebook’s claims that it is doing enough. The tech giants often announce policies that they fail to implement, according to Imran Ahmed, CEO of the Center for Countering Digital Hate, which investigates disinformation.

During the COVID-19 pandemic, much vaccine disinformation is shared in private groups, which are impossible for journalists or regulators to penetrate. Users sometimes use code words to defeat even Facebook’s controls.

There are reasons to think the tech giants don’t even want to know what is going on inside their platforms. 

Heidi Tworek, a CIGI senior fellow and associate professor at the University of British Columbia, suggests we need to ponder “agnotology” — the study of willful ignorance — to understand this.  

She has written about a YouTube employee who discovered in 2018 that potentially radicalizing alt-right content generated as much engagement as music, sports and gaming. Executives were reluctant to acknowledge the finding, because recognizing the source of that engagement might have brought responsibilities the company did not want.

French civil service researchers who worked with Facebook in 2019 to devise a new regulatory model pointed to the difficulty of understanding what happens in private spaces.

“Unlike traditional media, the ordering of content on social network services is usually personalised (except in forums) and everyone sees the result of the personalisation when accessing the service,” they wrote. “However, the overall effects of this ordering on all users are not observable.”
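
A small sketch may make the researchers’ point concrete: because ranking is personalized, each user sees their own ordering of the same posts, and only someone holding every user’s scores, that is, the platform, can measure the ordering’s overall effect. The posts, scores and function below are invented for illustration.

```python
# Illustrative sketch of personalized feed ranking. The posts, scores and
# function are invented; they stand in for the per-user engagement
# predictions a real platform computes at vastly larger scale.
from typing import Dict, List

posts = ["post_a", "post_b", "post_c"]

# Hypothetical per-user affinity scores the platform has learned.
affinity: Dict[str, Dict[str, float]] = {
    "alice": {"post_a": 0.9, "post_b": 0.2, "post_c": 0.5},
    "bob":   {"post_a": 0.1, "post_b": 0.8, "post_c": 0.4},
}

def personalized_feed(user: str) -> List[str]:
    # Each user's feed is ordered by that user's own predicted engagement.
    return sorted(posts, key=lambda p: affinity[user][p], reverse=True)

for user in affinity:
    print(user, personalized_feed(user))
# alice sees post_a, post_c, post_b; bob sees post_b, post_c, post_a. Two
# users, two orderings: an outside observer who sees only one feed cannot
# reconstruct the aggregate effect of the ranking across all users.
```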

Tworek thinks regulators may not understand what’s happening in the walled gardens well enough to regulate them.

“You could imagine if you have transparency regulation that actually enables you to ask companies to really give you a sense of what is going on,” she said. “So, if you are going to do any kind of regulation, you’d actually know what you’d like to make that on the basis of. Because even the companies themselves may not be investigating certain things. So, transparency is not the silver bullet, but it is a really important part of helping researchers and others actually get a sense of what’s really happening under the hood of these companies.”

Lawmakers in democracies around the world are struggling to determine how to regulate the social media space, a challenge given constitutional protections on free speech, the rise of authoritarian techno-nationalism, and a jumble of political priorities: countering hate, harassment and misinformation; addressing algorithmic discrimination; brokering funding deals for struggling newspaper companies; and improving content moderation for minority languages, for example.

King believes the first step might be better research, which we can obtain only if researchers are allowed inside the walls of the tech giants’ citadels.

“What we could do is keep everybody’s posts private as they wish,” he said, “however, we share them with legitimate academic researchers, and we let legitimate academic researchers write whatever they want to write without prior approval from the company. And what the consensus of the academic community is will be public. And then we’ll learn a lot about how much misinformation there is on Facebook and the influence of the misinformation and whether it actually changes people’s opinions or it’s just a bunch of junk that appears in the newsfeed. That’s a policy intervention that could work. I think we could make a positive contribution.”

The social media companies won’t do that voluntarily. 

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Stephen Maher is a Harvard Nieman Fellow and a contributing editor at Maclean’s.