The growing list of companies boycotting advertising on Facebook, now including major corporations such as Unilever, Ford, Hewlett-Packard and Adidas, is an unambiguously positive development. Last year, Facebook made 98.5 percent of its US$70 billion in revenue from ads, so there is no doubt the company is paying attention. In response, Facebook and a range of other social media companies are beginning to shift their policies on how they treat hate speech, misinformation and radicalization.

But we should be cautious about outsourcing the governance of speech in our society to either a global platform company that sits largely outside of our jurisdiction, or to the whims of global brands who momentarily see their interests aligned with the public good.

That a boycott was even necessary is not a success of self-governance, but rather a failure of democratic governance. And the minor policy changes that are likely to result from this boycott will not address the core structural problems in our digital infrastructure. There are no market solutions to problems caused by a market failure. Instead, the most meaningful response to this problem would be for governments to develop new competition, data and content policy suited to the digital world.

First, without better competition policy, this moment won’t last. Time and again, after a scandal hits Facebook, the market returns a clear signal: users and advertisers will always come back. What choice do they have? Wait until Unilever executives see what happens on YouTube.

So, every time a scandal breaks, each more egregious than the one before, the company is momentarily hit with consumer or ad boycotts and minor stock dips, before revenue growth roars back. The reason both the market and consumers are so bullish on Facebook is that we have a global digital ad duopoly. If you want to place an ad on the internet anywhere in the world, you have, effectively, only two options: Facebook and Google.

As David Skok and I discussed on this week’s Big Tech podcast with Matt Stoller, the author of Goliath: The 100-Year War Between Monopoly Power and Democracy, monopolies are market failures, and it is up to governments to correct them through a wide range of competition policies, including antitrust, merger and acquisition restrictions, and enforcement of consumer harm protections. If we want the market to have any chance at correcting these problems, we need to make sure that advertisers and users have more ethical options to choose among. These options will only become available when real competition exists in the market. Publicly traded private monopolies do not self-regulate. This is why we have governments.

Second, without data privacy policy, we are not addressing the root cause of this problem: the largely automated systems, fed by data, that determine the character of our public sphere. On platforms such as Facebook and YouTube, what we see and hear, and whether we are seen and heard, is determined by algorithms. These algorithms decide what speech is amplified, and what is not. They decide what videos we are presented with, and what private groups we should join. They decide what ads we see, and how our past behaviour can be nudged by subtle messaging.

Furthermore, the primary motive of the business model of social platforms is to keep us on their sites, and so these algorithms are calibrated for engagement. And it turns out that what engages us is not always aligned with what informs us or brings us together. Instead, too often it’s false, divisive and hateful content that drives us to click, comment and share — the very activities that make platforms money.

These algorithms are the engine of the platform ecosystem, and they are driven by the data that is harvested about our lives in a largely unregulated environment. The collection of this data, and the algorithms that use it, are opaque, hidden from view and shielded from public scrutiny. Governments must adapt their data privacy laws to the digital world and force transparency upon the systems that determine the character of our digital public sphere.

Third, without content policy, we are delegating the governance of speech in our society to a single company, making it responsible for moderating the world’s communication. No matter how Facebook, Google, Reddit and TikTok respond to this boycott, we should be wary of their decisions shaping what we can and can’t say. What speech we allow in our society is a core function of democratic governance. It is a function that is riddled with trade-offs and tensions, historical and cultural context, and the nuances of languages. How do we balance the value of free speech with the need to protect citizens from harms caused by speech? How does our past, our culture and our language affect what we allow to be said in our society? These decisions have to be made by democratic societies, not by a single company seeking a universal standard for everyone in the world and accountable only to shareholders.

Finally, this is a global problem that demands new forms of global governance. This current moment of market correction demonstrates the centrality of the United States in determining the way speech is regulated globally. Where were the boycott and the Facebook policy adjustments when Philippine president Rodrigo Duterte was using the platform to harass journalists and radicalize citizens? Where were they when genocide was being incited on Facebook in Myanmar? Where were they when Brazil’s president, Jair Bolsonaro, was spreading dehumanizing speech? We are seeing action now because of the unique political moment the United States finds itself in — with a president mobilizing hate speech and inciting violence against his own citizens. When this has happened in other countries, the boycotts and policy changes never came. We also need a global conversation about what human rights look like on digital platforms.

Of course, Facebook has done tremendous good. It enables speech, empowers civil society, allows businesses to reach audiences and connects societies in many meaningful ways. But in no other domain do we allow a company’s positive attributes to negate its accountability for the actions it takes that harm society.

When a market failure leads to collective social harm, we expect governments to step in. This boycott will only be successful if those pausing their ad campaigns move beyond urging self-governance and demand democratic governance.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
Taylor Owen is a CIGI senior fellow and the editor of Models for Platform Governance. He is an expert on the governance of emerging technologies, journalism and media studies, and on the international relations of digital technology.