Transparency Is the Best First Step towards Better Digital Governance

Transparency rules are a traditional way to put pressure on companies to act in the public interest and to protect consumers.

May 5, 2022
Meta CEO Mark Zuckerberg speaks at the company’s annual developers’ conference in San Jose, California, May 1, 2018. (Stephen Lam/REUTERS)

After years of letting digital companies manage their own systems and content moderation practices with little or no public supervision, governments around the world are throwing a regulatory net over them. Online regulation measures have been adopted or are pending in Australia, Canada, the European Union, Germany, Ireland, the United Kingdom and the United States. Last year’s revelations by Facebook whistle-blower Frances Haugen have prompted these governments to redouble their efforts to reduce social media addiction and the spread of harmful content. The war in Ukraine has simultaneously accelerated and greatly complicated this push.

The time has come for governments to act, and transparency measures are the best first step. Governments should require social media companies to disclose more information about how they operate and how they amplify, restrict and remove content. Transparency rules are a traditional way to put pressure on companies to act in the public interest and to protect consumers without burdensome mandates setting out exactly how they should conduct themselves. Elon Musk’s Twitter takeover bid is a case in point: Greater transparency could assuage some concerns about his intentions for the platform. Sunlight, said renowned US Supreme Court Justice Louis Brandeis, is the best disinfectant.

A report released in March 2022 by CIGI is a compendium of insights and recommendations about how governments can construct transparency rules for social media companies and other services that allow users to exchange ideas and information, such as app stores or podcasts. My task as rapporteur in preparing this report was to synthesize discussions held in 2021 among government officials participating in the Transparency Working Group of the Global Platform Governance Network (GPGN). Created by CIGI in partnership with Reset, the GPGN is a global community of civil servants, legislative staff and regulators seeking a collaborative and harmonized approach to dealing with the pressing challenges of digital platform governance.

One of the Transparency Working Group’s major, and perhaps surprising, findings from these meetings was that transparency rules are not self-enforcing. Legislatures cannot simply pass a transparency law without also creating a regulatory structure to interpret and enforce the new requirements. New laws and proposals would require companies to disclose their content rules, their enforcement procedures, and their complaint and redress processes. They would require companies to report on how well they have controlled harmful content and to describe their mitigation efforts. They would also mandate access to internal company data for qualified researchers. Such extensive disclosures have to be coordinated and supervised by an agency to ensure that accurate and complete information is available to users, the public and other key stakeholders.

Some jurisdictions are already doing this. Ireland has proposed expanding the role of the Broadcasting Authority of Ireland to include a new Online Safety Commissioner that would supervise transparency rules, among other regulations. In the United States, the proposed Platform Accountability and Consumer Transparency Act would vest the Federal Trade Commission with authority to supervise transparency requirements for social media companies. Australia has upgraded the powers of its eSafety Commissioner to oversee social media companies and app stores, including increased transparency mandates.

The United Kingdom has chosen to house its social media regulator, responsible for transparency and other online safety measures, in its traditional media and telecommunications regulator, the Office of Communications (Ofcom). The European Union’s proposed Digital Services Act would regulate online harms throughout Europe, require member countries to designate a Digital Services Coordinator to apply the regulation, including its transparency and reporting rules, and create an enforcement role for the European Commission itself.

The consensus among the regulators in our working group was that each jurisdiction should designate a sector-specific independent digital regulator with strong supervisory powers to head the transparency enforcement effort. We thought that effective government enforcement might best be achieved through co-regulatory efforts such as industry codes of conduct, crafted with input from civil society groups, and ultimately approved and supervised by a national regulator.

A second insight arising from our meetings and included in the report was that disclosures should be targeted to specific audiences. We identified three areas for disclosure: content moderation, advertising and the operation of the platform’s service. The audiences for these disclosures include the public, users, auditors, researchers and regulators. Some internal social media information should not, however, be made public, such as the proprietary code of operational and content moderation algorithms, since such disclosures could compromise intellectual property rights and the integrity of platform systems. Users’ personal information should also be protected; for instance, the identity of users who complain about harmful material online should be available to the company, and perhaps to the regulatory agency, but not necessarily to other users or to the public at large.

Regulators have the greatest need to know and should have the strongest power to access internal company data, our report found. The public and social media users should have broad access to information about the operation of social media systems and their content moderation programs, including mandated public reports and redacted audits. But the public should not be able to access confidential and personal information. Regulators should approve auditors and independent researchers and grant them the greater access to internal company information they need to conduct regular assessments and ongoing investigations into company practices.

Further, we found that international cooperation will be essential in establishing workable disclosure frameworks. Companies operate on a global scale, even though they localize their content to specific national and regional audiences. It does them, their users and the public no good for regulators to operate at cross purposes in different jurisdictions. One of the key messages of our report is the urgency of achieving international consensus on a standardized way for global companies to report key information on content moderation.

Policy makers in many jurisdictions have concluded that social media companies have too much unchecked power and are failing to protect the public and their users from online harms. In Australia, Canada, the European Union, Germany, Ireland, the United Kingdom and the United States, among others, they are prepared to move forward with an ambitious reform agenda that includes focusing competition policy specifically on tech companies and addressing online safety issues.

Transparency measures are relatively low-hanging fruit in this new digital regulatory scheme, in contrast with more controversial areas such as the mandated removal of harmful but legal material. They provide due-process protections for users, motivate companies to do a better job of content moderation, enable the sharing of knowledge about effective techniques for countering harmful or illegal online material, help gauge compliance with regulatory requirements, and give regulators the feedback they need to improve their programs.

No innovative regulatory regime can be perfect from the start; inevitably, a new institutional structure will be a work in progress, adjusted as experience is gathered. Regulators, industry and civil society groups will need to engage in an ongoing conversation about how to set up a flexible, agile regulatory regime that can both learn from experience and respond to the fast-changing business and technological realities of the digital landscape.

The era of self-regulation for social media companies is over. Policy makers are determined to establish a regulatory regime for these vital platforms for self-expression and commerce. Transparency, disclosures and openness are essential elements in this new regulatory structure.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Mark MacCarthy is an adjunct faculty member at Georgetown University. He is the author of Regulating Digital Industries: How Public Oversight Can Encourage Competition, Protect Privacy and Ensure Free Speech (Brookings, 2023).