The Social Media Council: Bringing Human Rights Standards to Content Moderation on Social Media

October 28, 2019

Increasingly, we turn to social media platforms to access the information and ideas that structure the agenda and content of public debates (Newman et al. 2019). Giant social media companies have elevated themselves to a position of market dominance where they hold considerable control over what their users see or hear on a daily basis. We know that content moderation and distribution — in other words, the composition of users’ feeds and the accessibility and visibility of content on social media — happen through a combination of human and algorithmic decision-making processes, but overall, current practices offer very little transparency and virtually no remedy to individual users when their content is taken down or demoted (ARTICLE 19 2018, 15).

This situation has become a major issue for democratic societies. The responsibilities of the largest social media companies are currently being debated in legislative, policy and academic circles around the globe, but many of the initiatives put forward do not sufficiently account for the protection of freedom of expression and other fundamental rights. There is a strong consensus among international experts on freedom of expression that the mere regulation of speech by contract (that is, a company controlling its own platform on the basis of terms of service and community standards) fails to provide adequate transparency and protection for freedom of expression and other human rights (ibid.). Creating content moderation duties in legislation — as exemplified by Germany’s Network Enforcement Act — tends to produce systems where private actors are tasked with applying criminal law and other national legal provisions under short deadlines and the threat of very heavy fines (ARTICLE 19 2017). Such systems fragment the legal obligations of social media companies, leave individual users with little or no remedy against hasty content removal, and provide no guarantee for the protection of individual freedoms.

Media landscapes and the diversity of roles fulfilled by tech companies have been evolving at a rapid pace and will continue to do so. Democracy now requires that we engage in a collective learning process to organize online content moderation in a manner compatible with international standards on freedom of expression. From this perspective, the need for a mechanism capable of ensuring effective public supervision of content moderation on social media platforms is increasingly recognized on all sides.

ARTICLE 19, a leading global free speech organization, has proposed the creation of the “Social Media Council” (SMC) — a model for a multi-stakeholder accountability mechanism that would provide an open, transparent, independent and accountable forum to address content moderation issues on social media platforms on the basis of international standards on human rights. The SMC model puts forward a voluntary approach to the oversight of content moderation: participants (social media platforms and all stakeholders) sign up to a mechanism that does not create legal obligations. Its strength and efficiency rely on voluntary compliance: by signing up, platforms commit to respecting and executing the SMC’s decisions (or recommendations) in good faith. This proposal was endorsed by UN Special Rapporteur David Kaye, who recommended in April 2018 that “all segments of the ICT sector that moderate content or act as gatekeepers should make the development of industry-wide accountability mechanisms (such as a social media council) a top priority” (UN General Assembly 2018, para. 72).

ARTICLE 19 initially envisioned the SMC as having an ambitious scope: a network of national or regional SMCs entrusted with providing general guidance to social media platforms and deciding complaints brought by individual users, operating on the basis of international standards on human rights and coordinating through the mediation of an international SMC. Such multi-stakeholder, transparent, accountable and independent fora could weave freedom of expression into all aspects of online content moderation and distribution across all social media platforms, from integrating international standards into decisions to delete or demote content, to ensuring exposure to the broadest possible diversity of information and ideas through a form of human-rights-optimized algorithmic distribution.

ARTICLE 19, together with the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression and Stanford University’s Global Digital Policy Incubator, submitted this proposal to a working meeting of academics, civil society organizations and social media companies; the proposal generated intense discussion, as the resulting conference report records (Global Digital Policy Incubator, ARTICLE 19 and Kaye 2019). This conference and subsequent meetings have helped shed light on the questions, big and small, raised by the project of creating an SMC — see, for instance, the comments from the Electronic Frontier Foundation (McSherry 2019). There are different visions of what the exact roles and functions of this new mechanism should be, where it should be set up and how it would interact with other initiatives, such as the creation of an oversight board by Facebook (ARTICLE 19 2019).

A first point of discussion is the choice of rules that should govern the oversight of content moderation. While there is a growing consensus that international standards on human rights provide the appropriate universal legal framework, there may be different ways to apply this body of rules. The SMC could simply refer to these rules directly; the authoritative interpretations of international and regional courts and special mechanisms would then provide the necessary guidance to inform the SMC’s decisions. Another option would be to adopt a code of human rights principles for content moderation: adapting international standards to online content moderation through such a code would give the SMC stricter guidance than a broad reference to international standards. In both cases, as is generally the situation with the application of international standards, a certain “margin of appreciation” — or margin of flexibility — would be part of the SMC mechanism. This flexibility would allow the application of international standards to be differentiated between companies and their respective products (for example, Facebook is different from Twitter). It would also make room for companies to adopt their own views on the speech that is allowed on their platforms, although market dominance would result in a narrower margin of manoeuvre in this respect.

Another point of divergence is whether the SMC should have an adjudicatory or an advisory role. In an advisory capacity, it would provide general guidance to social media companies on the compatibility of terms of service or community standards with international standards on human rights. In this configuration, the SMC would be an open forum where stakeholders could formulate recommendations or observations. The alternative would be to give the SMC the power to review individual decisions: the council would then have to decide whether, in the particular circumstances of a case, the decision made by the social media platform conformed to the requirements of international human rights standards. Such a mechanism should be accessible to all, and there should be clear and precise rules of procedure on questions such as admissibility conditions, time limits, admissibility of evidence, elements covered by confidentiality, exchange of arguments and views, elements of publicity, and the adoption and publication of decisions.

ARTICLE 19 discussed the various possible orientations of an SMC, as well as some more technical issues such as the rules of procedure and the funding mechanism, in a background paper supporting a current online consultation.1 ARTICLE 19 considers that the different visions for the SMC are not mutually exclusive: they could be designed to be complementary. From that perspective, the question is not so much whether SMCs should be set up at the global level or the national level — there are strong arguments for each — but how they could all work together. Local SMCs, anchored in the local context, with members very familiar with the complexities of the linguistic, social, cultural, economic and political circumstances of the country, would bring increased credibility to the whole system by producing a nuanced understanding that a distant, international forum cannot reach, and they could develop solutions adapted to the local context. A global SMC, in turn, would bring a sense of universality to the system: it would elaborate a universal code of human-rights-based principles on content moderation, and it would provide a framework for national SMCs to resolve divergences. A local SMC could also bring valuable local expertise to the oversight board that Facebook is building, should that particular experiment prove compatible with international standards on human rights; a memorandum of understanding between the oversight board and a local SMC could provide a framework within which the board could seek specific insights from the local body.

At the moment, various legislative initiatives would rely on self-regulatory mechanisms within a legal framework of co-regulation, under the guise of bringing a swift end to the dissemination of often vaguely defined harmful content. The SMC offers a model that can deliver a form of co-regulation that fully ensures the protection of the fundamental right to freedom of expression. Moreover, the SMC model offers a buffer between the state body charged with overseeing the self-regulatory mechanism and the social media companies; without it, companies are likely to apply mechanisms and execute decisions that do not comply with international human rights standards.

The SMC is not the only idea that seeks to deal with content moderation as a matter of urgent democratic importance — see, for instance, the proposal for a moderation standards council (McKelvey, Tworek and Tenove 2019) and the model from Global Partners Digital (Bradley and Wingfield 2018). Not only is this question very much of the moment, it is also emerging at the exact point of convergence between the goals and interests of human rights groups and those of social media platforms: avoiding the pitfalls of harsh legislative approaches that often come with disproportionate sanctions; contributing to restoring users’ trust through transparency and accountability; providing an effective yet adaptable form of regulation that can easily accommodate the constant evolution of tech platforms; and ensuring that moderation of speech is done on the universal grounds of international law. ARTICLE 19 urges interested members of the public to get involved by exploring its presentation of the SMC and sharing their thoughts in a public survey.2 Now is the time to help us shape the future of social media regulation.

Author's Note

Views in this article do not necessarily reflect the positions of ARTICLE 19.

  1. Readers are invited to view the consultation paper and complete the consultation survey at www.article19.org/resources/social-media-councils-consultation/; the survey closes November 30, 2019.
  2. Please visit www.article19.org/resources/social-media-councils-consultation/ for more information and to complete the survey before November 30, 2019.

Works Cited

ARTICLE 19. 2017. “Germany: The Act to Improve Enforcement of the Law in Social Networks.” Legal analysis. London, UK: ARTICLE 19. www.article19.org/wp-content/uploads/2017/09/170901-Legal-Analysis-German-NetzDG-Act.pdf.

———. 2018. “Side-stepping rights: Regulating speech by contract.” Policy brief. London, UK: ARTICLE 19. www.article19.org/wp-content/uploads/2018/06/Regulating-speech-by-contract-WEB-v2.pdf.

———. 2019. “Facebook oversight board: Recommendations for human rights-focused oversight.” March 27. www.article19.org/resources/facebook-oversight-board-recommendations-for-human-rights-focused-oversight/.

Bradley, Charles and Richard Wingfield. 2018. “A Rights-Respecting Model of Online Content Regulation by Platforms.” May. London, UK: Global Partners Digital. www.gp-digital.org/content-regulation-laws-threaten-our-freedom-of-expression-we-need-a-new-approach/.

Global Digital Policy Incubator, ARTICLE 19 and David Kaye. 2019. Social Media Councils: From Concept to Reality. Conference report, February. Stanford, CA: Global Digital Policy Incubator. https://cyber.fsi.stanford.edu/gdpi/content/social-media-councils-concept-reality-conference-report.

McKelvey, Fenwick, Heidi Tworek and Chris Tenove. 2019. “How a standards council could help curb harmful online content.” Policy Options, February 11. https://policyoptions.irpp.org/magazines/february-2019/standards-council-help-curb-harmful-online-content/.

McSherry, Corynne. 2019. “Social Media Councils: A Better Way Forward, Window Dressing, or Global Speech Police?” Electronic Frontier Foundation, May 10. www.eff.org/fr/deeplinks/2019/05/social-media-councils-better-way-forward-lipstick-pig-or-global-speech-police.

Newman, Nic, Richard Fletcher, Antonis Kalogeropoulos and Rasmus Kleis Nielsen. 2019. Reuters Institute Digital News Report 2019. Oxford, UK: Reuters Institute for the Study of Journalism. www.digitalnewsreport.org/.

UN General Assembly. 2018. Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. A/HRC/38/35, April 6. https://undocs.org/pdf?symbol=en/A/HRC/38/35.

The opinions expressed in this article are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Pierre François Docquir is a researcher and expert in human rights law and in internet and media law and regulation. He is the head of the Media Freedom Programme at ARTICLE 19, having joined the organization in 2015 as its senior legal officer.