The Trump Deplatforming Distraction

January 21, 2021

It’s only been two weeks since Donald Trump’s followers attacked the US Capitol, but for Trump, it must have felt like a lifetime. Many of his supporters in the Grand Old Party have turned against him. He’s been impeached for a second time. And, one after another, some to greater degrees than others, the “big four” tech companies — Facebook, Twitter, Google and Amazon — blocked the President of the United States from posting on their platforms.

At best, the deplatforming could be viewed as American corporations taking swift and prudent action to protect their country from further insurrection. It could be that these companies are finally taking what Mark Zuckerberg calls a “broader view of their responsibility,” evolving from their youthful pretense of neutrality into a more mature role as responsible gatekeepers.

More jaded onlookers might see this moment as a deference to the incoming administration during the final hours of a president whose presence the companies had hosted on their platforms for four years. Banning Trump at the moment his successor is confirmed and the three branches of government change hands could, without much of a stretch, be seen as political opportunism by the same companies that have long embraced government power around the world.

Among conservatives, there was a predictable First Amendment backlash to the bans. But there were other people, too, expressing their discomfort with these moves, people who rarely agree with the commentators on Fox News — Angela Merkel, for example.

Now, Angela Merkel is no fan of Donald Trump. But the German chancellor expressed concern with how these decisions were being made: unilaterally, by a handful of big technology companies, and without any clear rubric to explain how they made them. Or, more importantly, how they’ll be making them in the future. Merkel is not calling for unbridled free speech; she is calling for democratic governance of speech.

And she is not alone. Indeed, the focus on the banning of Trump’s accounts distracts from the systemic failures the bans represent, and may even hasten the very governance agenda they were intended to quell.

These events demonstrate the failure of the content moderation policies of the major platforms. These policies and systems have the veneer of technocratic due process, but in reality they operate, in all their variation, in a largely ad hoc manner, applying policies inconsistently across their billions of users around the world. As legal scholar evelyn douek observed, it is far from obvious how Trump’s posts this week are worse than posts he’s made previously, and entirely unclear why he should be banned while others such as the Taliban’s official spokesperson, India’s Prime Minister Narendra Modi and the Philippines’ President Rodrigo Duterte all have accounts in good standing.

A focus on the banning of Trump’s accounts also distracts from the full-stack nature of the deplatforming actions taken since January 6 and the concentration of power they represent. Parler recently emerged as a more free-speech-leaning Twitter clone for a largely US conservative audience and played a major role in organizing Stop the Steal and the violent events of January 6. With 12 million users, it could have grown exponentially had Trump and company moved to it after their expulsion from Twitter. Google and Apple removed the Parler app from their app stores, and Amazon Web Services (AWS) kicked Parler off its servers.

The deplatforming of a platform also demonstrates the consequences of market concentration. It illustrates how the companies that own the infrastructure can make unilateral decisions that affect the other companies dependent on it. And if you don’t care about Parler being banned, recall that in 2010, as digital activist and author Jillian York pointed out, AWS banned WikiLeaks after the US State Department expressed “no more than concern” about the site.

Cherry-picking the individual posts or pages to censure is a “solution” that confuses the symptoms for the disease. QAnon is not a movement that arose organically; it grew as a function of the design of the platforms themselves. For example, an internal Facebook study found that 64 percent of people joining extremist group pages were pointed there by the platform’s recommendation tools. And, as a New York Times report detailed, in the days following the November 2020 election, another 100 people joined the Stop the Steal Facebook group every 10 seconds, with Facebook’s recommendation algorithm driving the group to 320,000 members in just 20 hours.

At the root of this structural problem is the business model of platforms, which use vast amounts of data about users to determine how to target content designed to change our behaviour (to sell ads), and how to hold our attention (so we see more ads). Because each transaction is so small, this business model must run at a mind-boggling scale to be profitable; Facebook alone handles more than 100 billion transactions a day. At this scale, responsible and effective content moderation is simply unmanageable, so platforms rely on highly imperfect AI to filter their content and to decide what is seen and by whom. It is this act of algorithmic determination that has created the communities that have too often seeded division and hate. And because these companies have become so large, we can no longer rely on the free market to correct for the harms they might be causing. The result of the business model, scale and market concentration is a systemic failure.

If the market can’t solve this problem, then the Trump bans ultimately distract from the real solution: democratic governance. In the interests of political expediency and a hope that the market could self-regulate, governments have abdicated the governance of the digital domain to a handful of private companies, whose interests may or may not align with the public good. This approach has failed. And so democratic governments must now lead the difficult conversation about what speech should be allowed online. They must develop accountability and transparency mechanisms for the data-driven economy. And they must reinvigorate competition in what has become a monopolistic market.

If you don’t like platforms wielding such tremendous power, then the solution is democratic governance, not more self-governance. It is only by doing the tough work of governance, not just banning Trump’s tweets, that we will begin to address the harms so clearly on display at the Capitol.

A different version of this article first appeared in the Globe and Mail on January 14, 2021.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Taylor Owen is a CIGI senior fellow and the host of the Big Tech podcast. He is an expert on the governance of emerging technologies, journalism and media studies, and on the international relations of digital technology.