Who Should Be Responsible for Decentralized Social Media Standards?

In a tweet, Twitter CEO Jack Dorsey suggested that a team of independent (though Twitter-funded) engineers should lead the development of new social media standards. CIGI's Robert Fay argues that this process is akin to letting the fox guard the henhouse.

December 26, 2019
Twitter CEO Jack Dorsey testifies before the House Energy and Commerce Committee on Capitol Hill, on September 5, 2018, in Washington. (AP Photo/Jose Luis Magana)

It’s hard to get excited about standards. But when Twitter CEO Jack Dorsey tweeted that the company would “develop an open and decentralized standard for social media,” it led to hundreds of fiery tweets, some supporting the initiative, some dismissing it and some downright hostile toward it. Clearly, Dorsey hit a nerve — and rightly so. Social media platforms’ business models — some of which support dangerous trends such as the dissemination of misinformation — desperately need standards. Dorsey’s tweet, and the debate that followed, largely characterized platform governance as a technical engineering problem that requires an engineered solution. But any solution to this complicated issue requires a much broader perspective.

I commend Dorsey’s stated goal. He’s right to point out that people “can’t choose or build alternatives” to the algorithms that dominate social media platforms. Further, he correctly asserts that the incentive structure of these platforms often fosters “controversy and outrage” instead of “conversation which informs and promotes health.” I also agree that a standard could help to address these, and other, digital challenges.

Dorsey doesn’t provide specifics, but his thread suggests that the standard would essentially separate Twitter’s content from Twitter’s algorithms. In effect, there would be competing social media platforms that could use Twitter’s technology, each with potentially different rules over content. Rather than have content managed and moderated by a few giants, it would be managed by many platforms that could distinguish themselves with different content management rules. Users could self-select which platform best meets their needs and views, creating competition that could lead to a form of content moderation: “good” platforms would drive out the “bad.”

But I am skeptical that this strategy would tackle fundamental issues such as misinformation: it simply shifts responsibility, at least in part, from a few large, well-resourced social media platforms that don’t really want to manage content anyway onto many smaller platforms. This strategy could incorporate bodies such as social media councils, which would set norms that reflect a community of views, but there is no guarantee that this would occur. And, as they do now, citizens would rely on these platforms to govern themselves through “terms of service” and other jargon-filled agreements, which very few users read or understand.

Dorsey’s tweet suggested that a team of five independent (although still funded by Twitter) engineers would lead the development of a new standard. This process is akin to letting the fox guard the henhouse and, arguably, it’s not much of a shift from the existing system of self-regulation. This governance chasm needs to be addressed, and there is a way to do it.  

The 2008 financial crisis presented one of the greatest regulatory challenges — and successes — of the past decade: global financial powerhouses operating under lax standards needed to be reined in. In response, the Group of Twenty created the Financial Stability Board to coordinate “national financial authorities and international standard-setting bodies.”

Much like the financial giants of the early 2000s, social media platforms hold tremendous global power; the establishment of a Digital Stability Board (DSB) could mitigate the tremendous risk associated with that power. The next decade should be devoted to the design of a sound global regulatory framework for social media platforms — equivalent to the regulatory efforts undertaken for global financial institutions. Everyone, including the platforms, will benefit.

The DSB would be a multi-stakeholder, inclusive body, reflecting the diversity of views that need to be brought to the table in platform governance. Standards would be a significant part of the DSB’s mandate. As CIGI Senior Fellow Michel Girard argues, standards, combined with third-party certification, should be considered an essential element of sound governance for digital platforms. Platform policies on content, algorithms and data consent, usage, storage and portability — among other areas — would benefit greatly from standards. Traditional standard-setting bodies such as the International Organization for Standardization and the Institute of Electrical and Electronics Engineers are already involved in this effort in some areas. There are a number of open-source efforts under way as well.

However, platform governance is a much broader undertaking than standard setting. The DSB’s overarching role would be to coordinate, monitor and assess policies related to the digital sphere — standardization is just one small, albeit important, part.

At the end of the day, Dorsey’s announcement ignores the fact that addressing the shortcomings of his platform is more than a simple engineering problem. Solving the problems presented by digital platforms is a complicated undertaking that requires a cohesive, global effort. Big tech alone will not fix big tech. 

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Robert (Bob) Fay is a CIGI senior fellow and an expert in the field of digital economy research.