Congress Has the Power to Compel Transparency from Facebook: Will It?

The pattern seems clear. The social media platform is engaged in an energetic struggle to stop its role in spreading misinformation from being scrutinized.

September 3, 2021
Pro-Trump supporters storm the U.S. Capitol on January 6, 2021. (REUTERS/Shannon Stapleton)

On January 6, when Trump supporters stormed the U.S. Capitol, California Republican Kevin McCarthy, then the minority leader in the House of Representatives, called President Donald Trump to ask him to appeal for calm.

According to testimony from a colleague, Trump first told McCarthy that it was antifa, not his supporters, storming the Capitol. When McCarthy contradicted him, Trump said, “Well, Kevin, I guess these people are more upset about the election than you are.”

McCarthy, who remains a Trump supporter in spite of that presumably difficult conversation, has been opposing efforts by Congressional Democrats to investigate the events of that day ever since, playing his part in a grim partisan struggle.

His most recent move, threatening the tech giants, puts him at the heart of the transparency struggle between lawmakers and the social media companies.

McCarthy put out a statement on August 31 urging 35 telecommunications and social media companies to ignore a request from a Democratic-controlled committee asking them to retain phone and message records of people involved in the rally. Democrats no doubt want to find a smoking gun linking Republican lawmakers to the violence, which, naturally enough, McCarthy would not like them to find.

McCarthy claimed, without evidence, that the records request is illegal, and threatened the firms in his tweet: “If companies still choose to violate federal law, a Republican majority will not forget and will stand with Americans to hold them fully accountable under the law.”

This means the companies may face Republican wrath if they comply with what appears to be a lawful request.

It will be particularly interesting to see how Facebook handles the matter, because it is also facing scrutiny and criticism from Democrats for its role in spreading misinformation, both about the election and about COVID-19.

On August 26, the committee looking into the Capitol riot sent a three-page letter to Mark Zuckerberg demanding records — by September 9 — about misinformation spread on the platform leading up to the riot, including data about Facebook’s algorithms and what actions the company took to shut down misinformation.

It looks like a headache for the tech giant: the kind of request that could pierce the wall of secrecy preventing lawmakers and researchers from evaluating Facebook’s role in spreading fake news that fuels riots or increases vaccine resistance, scrutiny the company appears to take pains to avoid.

If Facebook complies with the request, the resulting data could fill in huge gaps that have made it difficult for journalists, researchers and lawmakers to understand the extent of the misinformation spread within Facebook’s walled garden. There, harmful and false communication takes place in private groups, shifting public opinion in ways that were impossible in earlier media ecosystems and leaving public health officials, for instance, struggling to respond.

Consider ivermectin. Despite the lack of evidence, many Americans have become convinced that the drug is an effective treatment for COVID-19. It is not available without a prescription, unless you go to a feed store, where it is sold in large quantities as a treatment for parasites in horses. Enough people have been ingesting the veterinary formulation that the Mississippi Poison Control Center had to issue a warning bulletin. The FDA tweeted, “You are not a horse. You are not a cow. Seriously, y’all. Stop it.” Even Health Canada issued a safety alert.

Ivermectin has been promoted out in the open by Fox News personalities and by podcaster Joe Rogan, who used it to “treat” his own case of COVID-19, but it has also been heavily promoted on social media in ways that nobody outside the platforms can observe.

Moderation, the process of vetting posts for policy violations, is a cost centre for the platforms. Every dollar spent on it is a dollar that does not end up on the right side of the balance sheet. So there is reason to wonder whether Facebook even wants to know how much misinformation is being spread on its services.

It is curious, for instance, that thousands of posts related to the January 6 riots recently disappeared from Facebook’s transparency system.

It is also noteworthy that Facebook recently shut down the accounts of researchers looking into political ads, reined in an internal transparency exercise, and buried a transparency report showing large-scale misinformation until it was leaked.

The pattern seems clear. Facebook is engaged in an energetic struggle to stop its role in spreading misinformation from being scrutinized.

The US Congress has the power to stop Facebook in its tracks and force it to reveal what role it has played in spreading misinformation about the 2020 US election and the ongoing pandemic, but it is not yet clear whether lawmakers will actually manage to do it.

If the Democrats fail to pierce Facebook’s wall of secrecy, it will continue to be impossible to reach any conclusions about the scope of the problem, which seems like an unacceptable outcome given the centrality of the platform in crucial struggles over public health and public safety.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Stephen Maher is a Harvard Nieman Fellow and a contributing editor at Maclean’s.