When David Skok and I sat down to brainstorm this podcast over a year ago, we were both driven by a belief that digital technologies generally, and the social, economic and policy power of a handful of tech platforms specifically, were on the cusp of transforming our world. He reported on these companies, and I studied and wrote about them. Too often, we both thought, discussions about platforms (such as Google, Facebook, Amazon and Apple) were framed simply as tech stories, by journalists, scholars and policy makers alike. In reality, we both believed, these companies touch almost all aspects of our lives, and therefore discussion and debate about them need to be framed in a far broader context.

We could never have imagined how the global pandemic would turbocharge this reality. The COVID-19 pandemic has accelerated many of the structural changes that were already taking root in society. Economic inequality, geopolitical fissures, the rise of illiberal rule, and historic questions of race and diversity have all been thrust into public debate in a moment of global shock. So, too, has our digital transformation. We now live and work online to an extent unimaginable before the pandemic. And as a result, as the global economy teeters on the edge of a depression, global tech platforms have boomed. A handful of companies are keeping global markets afloat and are seeing historic growth. The irony is that their impact on our society, economy and democracies is only growing, right at the time when we are beginning to seriously question the downside risks of their business models and their design.

Over the course of seasons 1 and 2 of Big Tech, David and I have had the tremendous good fortune to have 19 wide-ranging conversations about the ways in which emerging technologies are reshaping society. Many of these took place within the context of the pandemic and amid the change it was instigating. We heard from a remarkable set of guests living and thinking through this historic transition. For our twentieth episode, we decided to reflect on some of the big themes that have emerged over these exchanges. Here are a few moments from the first two seasons that really resonated with me.

Nobel laureate economist Joseph Stiglitz made the point that the Great Depression was a time of not only massive economic inequality and economic concentration, but also significant consolidation of political power. And, he says, it was this combination of economic and political influence that led to the antitrust movement. The powerful companies of the day were seen as too big and, perhaps more importantly, as too powerful, and therefore as anti-democratic. Surely companies with equivalent economic and political concentrations of power in 2020 could be viewed the same way.

In a powerful conversation about the role of Facebook in the violence and political suppression of the Philippines, Maria Ressa, a global advocate for press freedom and Rappler.com’s CEO, argued that not only can Facebook’s platform be abused by autocrats, but the design itself destabilizes democracy. As I wrote following the episode, “This is not just a case of bad users of our digital tools. The argument Ressa makes is that the actual design and structure of our digital tools are illiberal. A design optimized for engagement over truth and prioritizing virality over the quality of information will create the very fractured and unstable media ecosystem needed for autocratic rule. Building a financial model on the unaccountable targeting of human behaviour and based on detailed models of our lives is itself an act of inherently illiberal intent. And a system that replaces the editorial, funding and distribution systems of the free press with the algorithms and incentives of the newsfeed creates the relativistic information environments in which propaganda thrives.

“The result, according to Ressa, is that ‘Facebook has enabled the rise of these populist authoritarian-style leaders who are then able to gain more control as society gets further splintered apart, and then use formal powers given to them by governments.’”

Indeed, a lot of our conversations have ended up focusing on Facebook — and, in particular, the challenges of governing speech online. We spoke to the great legal scholar Kate Klonick following her time researching the formation of Facebook’s “Oversight Board.” Foreshadowing a period of intense global debate about the role of the company in shaping our public sphere, Klonick described the design of a private governance model to be deployed by a company shaping the speech of 2.2 billion users.

In juxtaposition to the algorithmic filtering of Facebook, we discussed an alternate model of content moderation with Wikimedia Foundation CEO Katherine Maher. As Maher pointed out, there is a difference between trust and reliability: “Wikipedia is not meant to be truth. It’s simply what we can sort of agree upon as general consensus about an issue at any point in time.” Instead of engagement-based algorithms, Wikipedia relies on people who prioritize reliable information. But while this model may ensure the quality of information, it is hard to imagine scaling it to the volume of content that Facebook needs to moderate. Perhaps the implication is that Facebook is simply too big to effectively manage?

These are just four moments that jump to mind in reflecting on two seasons of conversations.

One final note: this episode is David’s last on Big Tech. It has been an absolute joy and privilege to have these conversations with him, to learn from him and to jointly explore these topics that we both fundamentally believe are transforming our world.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
  • Taylor Owen is a CIGI senior fellow and the editor of Models for Platform Governance. He is an expert on the governance of emerging technologies, journalism and media studies, and on the international relations of digital technology. 
