Western-Centric Content Moderators Miss Regional Contexts

Speaker: Ivar Hartmann

June 13, 2022

This video is part of The Four Domains of Global Platform Governance, an essay series that examines platform governance from four distinct policy angles: content, data, competition and infrastructure.

Regulators are principally concerned with the content found on social media platforms and how hate speech, harassment, violent extremism, and mis- and disinformation are being moderated and removed. This content has real-world impacts on society, safety and democracy.

To date, much of the work in content moderation has been undertaken by the platform companies. Their Western-centric approaches to community guidelines and terms of service lack the nuance of regional contexts and social and cultural norms. No single set of content rules can apply to the entire planet.

In his essay, Ivar Hartmann looks at the state of content moderation in Latin America and argues that domestic courts are better suited to define the rules of online speech in the region.

Ivar Hartmann: The greatest issues of concern when it comes to social media platforms are how to deal with hate speech, harassment, violent extremism, and mis- and disinformation. Are the platforms doing enough? What are the governments doing? And how do we strike a balance that respects free speech, creates a safe space online and doesn’t lead to a state censorship regime?

Of the four domains of platform governance covered in this series, I want to discuss the domain of content policy, because how we choose to govern this space will have implications for how we interact online, how we get information and the future of democracy.

Content policy regulation is a difficult task to tackle because each country and region has unique laws, social norms and contexts, which must be considered.

In my essay, for example, I look at the state of online speech in Latin America. There is a real gap in oversight of this area of the world. Simply hiring more moderators or using artificial intelligence to filter [content] is not the answer. I argue that content moderation rules should not be managed by platforms alone. Rather, regional courts and civil society organizations should have a say. It's not perfect, as I outline in my essay, because some courts have not updated their precedents and case law to address online speech, but it's certainly better than leaving content moderation to CEOs 10,000 km away.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

The Four Domains of Global Platform Governance

In the span of 15 years, the online public sphere has been largely privatized and is now dominated by a small number of platform companies. This has allowed the interests of publicly traded companies to determine the quality of our civic discourse, the character of our digital economy and, ultimately, the integrity of our democracies. This essay series brings together a global group of scholars working in four distinct domains of the platform governance policy discourse: content, data, competition and infrastructure.