Disentangling Digital Regulation

May 30, 2019
Attendees take a selfie in front of a Facebook sign at F8, Facebook's developer conference, on April 30, 2019. (AP Photo/Tony Avelar)

Earlier this week, Canada hosted hearings of the International Grand Committee on Big Data, Privacy and Democracy, which provided a platform for lawmakers from 14 countries to question experts and technology company executives about a range of issues, including privacy, data ownership, disinformation, electoral integrity and online extremism. The hearings followed the release of the Government of Canada's Digital Charter, which articulates 10 principles to guide the development and regulation of digital platforms and companies across a similarly broad range of concerns.

The volume and breadth of government activity on the challenges that digital platforms pose to democracy, society and the economy are encouraging. Many of the big tech companies have been operating as if governments and citizens were either unconcerned about, or powerless to address, their behaviour and impact. However, while the spike in government activity is a step forward, efforts to package and address all of these issues together, all at once, could create problems. Comprehensive initiatives have a place in monitoring and addressing the risks posed by digital platforms, but they can also stretch attention too thin and undermine concrete efforts to manage specific challenges.

Neglecting Good Work

In some cases, the Digital Charter and other activities ignore very important work that is already under way and that the government itself supports. Consider the challenge of countering violent extremism (CVE) online. The Digital Charter’s ninth principle states that “Canadians can expect that digital platforms will not foster or disseminate hate, violent extremism or criminal content.” After the mass shootings at mosques in Quebec City in 2017 and Christchurch in 2019, and the Toronto van attack of April 2018, the government understands the real-world pain that online extremism and radicalization can foster.

Yet, Innovation, Science and Economic Development Canada (ISED) — the ministry leading the work on the Digital Charter — seems to have forgotten what its colleagues in Public Safety Canada have been doing on the issue for years. Nowhere among the 30 specific programs listed to highlight the charter "in action" are the government's own critical CVE initiatives: the Canada Centre for Community Engagement and Prevention of Violence, the National Strategy on Countering Radicalization to Violence and the very promising Canada Redirect initiative pursued in partnership with Moonshot CVE. That ISED neglected these concrete initiatives when characterizing the Digital Charter's actions raises concerns about whether the charter is a rigorously developed and carefully articulated strategy or merely a collection of talking points for the October election. Likely it's a bit of both, but if the ministry skews more to talk than strategy, its results could suffer.

Oversimplifying Complex Issues

In other cases, the government’s effort to cover everything in one initiative risks oversimplifying complex and sensitive issues. The charter’s eighth principle states that “the Government of Canada will defend freedom of expression and protect against online threats and disinformation designed to undermine the integrity of elections and democratic institutions.” While the principle hits some notes likely to resonate with many Canadians, it is unclear how — or even if — the government will distinguish between expression that should be protected and expression that must be limited because it could undermine elections or democratic institutions.

The challenge was illustrated at the hearings of the grand committee in an exchange between Facebook's Public Policy Director Neil Potts and British Member of Parliament Damian Collins. Noting that other platforms had removed an altered video of US House Speaker Nancy Pelosi that made her appear impaired, Collins demanded to know why Facebook had not done the same. Potts remarked that "this points to the complexity of the issue." Collins replied, "Actually, it points to the simplicity. This video is false and you're allowing it to spread." But it's not an issue with clear lines. Liberal democracies routinely, and often correctly, protect political satire and false (but non-libellous) speech. Canada's Digital Charter appears neither to recognize nor to engage with this tension.

Disentangling Digital Regulation

We face a tangled web of digital challenges: threats to privacy, disinformation, digital divides, online extremism. Surveying all challenges at the same time makes it difficult to focus on the details of specific issues and the nuanced strategies we need to address them. It’s easy to say that the big tech companies are doing many things wrong and failing to address the harm they generate or facilitate. It’s much harder to develop evidence-based diagnoses and initiatives to address concrete challenges. Although the Digital Charter gives the appearance of coordinated action on the challenges posed by digital platforms, it risks distracting attention from the practical work that is being done on specific issues.

Perhaps attention and effort would be better spent examining specific programs, asking what works (or doesn’t) and engaging with digital platforms to scale effective interventions.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Daniel Munro is a Senior Fellow in the Innovation Policy Lab at the Munk School of Global Affairs and Public Policy at the University of Toronto, and Co-Director of Shift Insights.