Season 3 Episode 20
Taylor Owen on Six Insights from Season Three

Taylor reflects on six themes that emerged this season in conversations about big tech’s transformation of our economy, society and lives.

S3E20 / August 19, 2021

Taylor Owen

Episode Description

This season of Big Tech has featured conversations with experts across many fields — lawmakers, academics, journalists, authors, activists and a bishop — who are working to address technology's impact on our lives. In this episode of Big Tech, Taylor Owen looks back on those conversations and highlights six themes that have emerged across the season.

Taylor begins with how the debate about tech and society is maturing, reflecting that we have moved past superficial “tech issues” and into some very challenging, complicated questions around speech, encryption and anonymity.

One reason these issues are becoming more complex is that technology's global reach often brings it into conflict with national laws. This is the second theme that emerged in conversations: the dynamics between, on the one hand, the layers of global interconnectedness and, on the other, a fractured system of regulations across the United States, Europe and China.

Many guests of the show highlighted a third theme: the materiality of technology and its impact on our planet. Kate Crawford, for example, discussed how rare earth minerals and server farms are having a lasting impact on the climate.

In our rush to connect the world, we have created many vulnerabilities, both digital and physical. The fourth major theme this season was surveillance — whether by ad tech companies, police or repressive regimes — and cyberattacks on our critical infrastructure.

The fifth theme is the recognition that capitalism and the tech giants' business models are at the root of many of the tech-related harms and risks users face today. These financial incentive structures have created environments that prioritize user engagement with content above all else, even if that content is false, misleading, harmful or inciting violence.

Finally, our experts presented solutions and optimism in their conversations. While grappling with these problems can be a daunting task, the sixth theme this season is that there are paths forward, and many people committed to finding ways to make technology work for the betterment of society.

Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.

Taylor Owen: Hi, I'm Taylor Owen, and this is Big Tech.

When I started this podcast, I did so with a desire to create a space in my intellectual life for longer, thoughtful conversations about a topic that dominates much of my professional life, which is the way that technologies are pretty fundamentally transforming our economy, our societies, our politics and our personal lives. I've engaged a lot in this debate, and for many years I have worked both to understand these technologies and the effect they're having on us, and to understand what we as societies can do to minimize some of the harms and risks stemming from them. But I found that this debate was increasingly reduced to very reductive arguments about the goods and ills of technology, with critics pointing out all the harms and platforms defending them, all happening in a form that wasn't very conducive to more thoughtful deliberation. And I missed some of the intellectual engagement that used to come from the space of blogging, where people deeply invested in a topic wrestled with an issue in public. I think that in many ways podcasting, and podcast conversations, have stepped into this void, where there's a need for longer conversations. And I think that has proven correct with this show. The result is that, personally, the show has become one of the most generative intellectual things I do. For each episode I get to research a topic that I care about, think about how I personally view and understand that topic and what I don't understand about it, and then speak to a world expert on it. Sometimes this is someone who has written a book on it. Sometimes it's a colleague or a friend I have a lot of respect for, or someone I've admired from a distance for a long time. Many times it's all three at once. But these conversations also really force me to wrestle with and think through these really challenging topics. None of the issues we talk about, all the different ways digital technologies affect our lives, are easy. I spend a lot of time working on and thinking about these topics, and yet coming out of these conversations, every time I find I've learned a tremendous amount. So I wanted to spend this last episode of the season reflecting a bit on some of the broader themes and observations I've made, and things I've learned, from the arc of these conversations as a whole. So here are six things that I've pulled from these 19 conversations with the incredibly rich, smart group of people I've been lucky enough to speak with.

The first observation I want to draw from this series of conversations is that the debate about technology and society, in all its complexity and richness, is maturing. I spend a lot of my academic and professional life researching the ways in which technologies are affecting our society, the potential harms and all of the benefits that come from them, and increasingly working on potential governance strategies for maximizing these benefits and minimizing these harms. This has involved spending years raising awareness of the harms I and other researchers were seeing: highlighting them to the public and to policy makers, and urging governments to pay attention, to not just look at the upside of these technologies but also at the risks, and to do something about them, to step into this space and to govern in a way they hadn't before. Governments had largely left this space ungoverned. Now they're increasingly paying attention. They're concerned about the harms, and society as a whole is increasingly concerned about what it sees as some of the risks and harms embedded in these technologies. And so we're now embroiled in a set of really tough debates about how we're going to govern this space. They're tough because there are no silver bullets, no clear answers. There is no one thing that will solve the whole host of economic, political, social, human and community harms that have unfortunately stemmed from the way in which we use some of these technologies. And these debates are particularly difficult because there are core democratic trade-offs and tensions between rights embedded in these policy discussions. For example, as Jameel Jaffer pointed out in our conversation about free speech, speech in democratic societies always involves a tension between competing rights: the right of individuals to speak freely and openly, and the right of individuals in those same societies not to be harmed by speech. It is not as easy as simply saying there is bad speech on platforms and therefore we should get rid of it. So when we make decisions about governing online speech, we are trading off those rights, and we have to decide, as a society, where we're going to land.

[CLIP – Big Tech, Season 3, Episode 19]

Jameel Jaffer: I mean, one of the reasons why it's complicated is we can't agree on what speech should be taken down. Like, beyond the stuff that is illegal, there isn't a consensus on, you know, where the line should be. But even if there were a consensus, you'd still have to worry about the chilling effect of these kinds of laws.

Taylor Owen: Encryption and anonymity present another such tension. Anonymity online affords individuals and groups tremendous rights and empowers them in really meaningful ways. It gives voice to people who would otherwise be persecuted in society. It allows individuals to express themselves in ways they might not if their identities were publicly disclosed. It allows journalism and whistleblowers to function, to hold our societies accountable. And yet tremendous harm is also done under the cover of anonymity. We know that violence is organized and perpetrated, that black markets function, and that a whole host of activity that challenges democratic norms and democratic society is enabled by anonymity. So what do we do about that?

[CLIP – Big Tech, Season 3, Episode 3]

Ron Deibert: Those who come forward and say, "I've got a simple answer for this," you know, aren't really, I think, leading a lot, um, because there is no simple answer; it's a complicated issue. And there are real challenges for law enforcement and intelligence agencies to do the work that they do. Um, my view is that, you know, first of all, the debate is often incorrectly portrayed as between security versus privacy, when in fact it's about two different versions of security. And we cannot sacrifice one for the other.

Taylor Owen: And I think what I've drawn from these conversations so far this year is that the very difficulty of these debates is a sign of strength, not weakness. Many people use the argument that there's no silver bullet to claim that nothing should be done: because no one thing fixes everything, we shouldn't do anything. And this is a very deliberate strategy to obfuscate the governance possibilities in this space. But to me, the fact that we're wrestling with these issues in public, that publics are debating them and governments are debating them, is a sign that the debate is maturing, not stagnating. We're wrestling with the difficult policy decisions, tensions and trade-offs that are core to actually getting this governance agenda right.

The second observation I would make is that these issues and debates aren't just global in the sense that the internet has globalized our communications and these platforms have users around the world. There are layers of interconnectedness between the debates we're having about these technologies in different countries, in different types of governance regimes, and in the different ways technologies are used in different societies. The global nature of these platforms is bumping up against the way we govern via national law. Pranav Dixit, an Indian journalist, spoke really powerfully about the global reach of Silicon Valley and how it had shaped his own life and his own engagement with his society, but he also highlighted the real challenges facing citizens in increasingly illiberal-leaning regimes like India, where the government is using very similar language to that of Western democratic governments to crack down on speech and political dissent in Indian society. Countries of course have a right to govern themselves -- and I certainly want my country to have that right -- but what happens when that language of governance is appropriated for illiberal ends? How should we understand the global nature of these regulations when the same kinds of policies can be used for radically different ends depending on the country imposing them? And is it the case that in some countries around the world, the values of some of these platforms, which I may or may not agree with, are actually far more democratic and liberal than the values being imposed by certain states?

[CLIP – Big Tech, Season 3, Episode 15]

Taylor Owen: I mean, would you, would you essentially rather Facebook be determining the speech of Indian citizens or the Indian government?

Pranav Dixit: No, I definitely don't want Facebook deciding the speech of Indian citizens, but we're in a time when you also don't want the government to be in charge of all that, right?

Taylor Owen: Should we in democratic societies limit our governance of these platforms so that the base rights they provide empower the rights of individuals in less democratic societies around the world? Is that how we should be thinking about the global nature of these platforms? Or do we, as individuals in democratic societies, have the right and obligation to demonstrate how this space can be governed responsibly?

A third thing that became increasingly clear to me over the course of these conversations is that we need to think about technology not just as an ephemeral set of tools, but as something that is deeply material. Technology is built by people and is built of things. And increasingly, both of those material elements of the technologies we use and the platforms we engage on are costing the planet. Few people understand and communicate the materiality of technology better than Kate Crawford, who I had the chance to talk to about her new book, Atlas of AI. We like to think about things like AI as if they're automated, with no human involvement. That leads to the enchantment we have with these technologies, as if they are somehow separate from or better than humans. But in fact, she points out, they are built by humans; huge amounts of human labour go into these technologies.

[CLIP – Big Tech, Season 3, Episode 14]

Kate Crawford: So across, across the entire kind of ecosystem of AI, there, there are people in the background who are effectively, you know, making these systems appear intelligent. And this is, you know, part of why I say that AI is, is neither artificial nor intelligent; it's made of all of these forms of human labour, um, that we don't see, um, and that we don't pay very well, and that in many cases are doing work that's physically and psychologically very taxing.

Taylor Owen: It's not only human labour going into this, she points out; there is also tremendous energy going into sustaining these systems. The energy that powers server farms; the enormous amount of energy it takes to operate even a single machine learning algorithm. Every time we ask a question of an AI system, every time we use an automated system to solve a problem for us, every time we communicate or watch something online, we are using a tremendous amount of energy. And the technologies themselves are material. They use a huge amount of rare earth minerals, which are mined around the world, which are increasingly scarce, which we don't know how long we will have access to, and which bring with them the whole set of challenges we've long faced in the extractive industries. If Kate tells us one thing, it's that we need to think about technology more holistically. It is not a magical thing that we use and it solves problems for us. It is something that we as humans build, using resources that affect our climate, that are contributing considerably to the other great crisis of our age, the climate crisis, and that we have increasingly limited access to. When we think about governing technology, we cannot just think about the harms conducted using these tools; we have to think about the harm done building and running them.

[CLIP – Big Tech, Season 3, Episode 14]

Kate Crawford: I mean, we're calling this, you know, the era of AI supercomputers. So we're actually making things that are more and more energy- and compute-intensive at the same time as the planet is under extraordinary strain. So in so many ways, I think the data economy is premised on maintaining this kind of environmental ignorance.

Taylor Owen: Fourth, when we think about digital technologies, we often focus on just one layer of them: the layer that we interact with, our Facebook news feed, our Twitter stream, our Google search bar, our phone. But what we often leave out of this conversation is that this final layer of technology, the layer that we as individuals and citizens interact with, sits on top of a much broader technological infrastructure: the cables and satellites that move our data around the world, the servers where this information is stored. And it turns out that when we look at this broader infrastructure, we see that it is highly vulnerable. We've built a precarious system that is increasingly the source of deep security vulnerabilities. Few people in the world know this better than Ron Deibert, who has spent the past 20 years researching these vulnerabilities. What his work points out so clearly, and what the conversation with him revealed, is that this infrastructure we have built, and have failed to govern and secure properly, is leaving us highly vulnerable.

[CLIP – Big Tech, Season 3, Episode 3]

Ron Deibert: So if you step back for a minute and you look at, you know, putting aside social media, just the entire technological infrastructure, security's largely been an afterthought, first of all. You know, you have these legacy systems that were invented, in some cases decades ago, when you look at telecommunications technologies and the protocols that underlie it all, that have kind of been cobbled together is the way that I think about it. And on the surface, it kind of works well, functions actually remarkably well if you look at what we're doing right now, but there are all these negative externalities and gaping vulnerabilities. So when you insert malicious actors into the equation, they can exploit them.

Taylor Owen: There's also a layer of actors and programmers seeking to exploit the vulnerabilities in the code of our infrastructure. Nicole Perlroth, a cybersecurity reporter for The New York Times, spoke of the market for zero-days, exploits that hackers around the world find and then sell to democratic countries, to companies and to illiberal regimes, leading to bidding wars for these vulnerabilities. And one of the things she points out that's so powerful is that by participating in this market, our democratic governments are creating a substantial vulnerability for themselves and for their citizens, because these exploits are not just used by them to find terrorists or to hunt criminals. Once they're in the wild, they make everyone vulnerable, as we're increasingly seeing with the black market for ransomware.

[CLIP – Big Tech, Season 3, Episode 10]

Nicole Perlroth: And so where is ransomware going? Because we're just digitizing our lives. At what point is it going to be your self-driving car or your insulin pump or your pacemaker? You know, there's no reason to go there yet because there's still so much money to be made with these corporate ransomware attacks, but we're just continuing to connect everything without thinking about that possibility. And that's a very real possibility. That's not just stoking fear for fear's sake. That's a possibility.

Taylor Owen: Fifth, and this is something that I've been concerned about for longer than we've been doing these podcast conversations, but something that has really been reinforced for me: much of the debate about how to govern these technologies focuses on the wrong thing. It focuses on the way they're used by individuals, and less on the structures that enable that activity. And that is, first and foremost, what we need to understand about these technologies: they are built. They are built by people, and these people have incentives, biases and subjectivities. The way these systems are built, the financial incentives embedded in them, the way data is treated in them, the way they're moderated or not, all of these things shape the social and political outcomes these technologies have. I got the chance to speak to some people who know a tremendous amount about these root causes. Mutale Nkonde, one of the most thoughtful people in the world on the role of race and technology, speaks about how fundamental design decisions in these technologies lead to discrimination and abuse that is disproportionately felt by people of colour online, and that allowed adversaries and nefarious actors like the Russian government to exploit divisions in our society around issues of race in highly effective ways.

[CLIP – Big Tech, Season 3, Episode 9]

Mutale Nkonde: So the way that social media algorithms work in terms of race is if you engage with content, that's a click, share or comment, right? Those are three types of engagement, and you're going to be served up that content again, because the algorithm is a pattern-matching mechanism. And what it does is look to match you with similar content. And what the Mueller report found in 2016 was people that were clicking on content that was, uh, promoting Black Lives Matter messaging were then also being fed content that was encouraging them not to vote.

Taylor Owen: And Joan Donovan, one of the world's experts on mis- and disinformation, eloquently stepped back from this structural argument to make what I think is a critical point: these structural incentives determine which regulations platforms are willing to accept, the ones they now say they want, and which ones they do not. And we as societies cannot be blinded by the policies these companies say they want; we need to focus clearly on the ones that address these structural problems. Some platforms are advocating very loudly at the moment for policies to regulate them. They want clarity over what illegal speech is, so that they can train their AIs to be more efficient, for example. But they do not want competition policy that might break them up or limit their market dominance. They do not want real privacy laws. They do not want governments or regulators to have the power to regulate their algorithms, or to bring transparency to the way their systems work.

[CLIP – Big Tech, Season 3, Episode 5]

Joan Donovan: And the fact of the matter is that when these companies say, uh, we welcome regulation, they mean of a certain kind and type, and one that specifically doesn't require them to either break up their businesses or, uh, create limits for the kinds of profit that they take in or profit sharing downstream.

Taylor Owen: Another way of saying this is that they do not want structural solutions; they don't want to address the root causes. Because the root causes are a function of their business models, and reversing them would reduce their profits, reduce their value and ultimately undermine the core responsibility they have, which is to their shareholders. And we know that publicly traded monopolies do not self-regulate. So they have very little incentive to address these core structural issues. But these structural issues are critical for democratically governing the digital public sphere. Which brings me to my final observation: how many of the people I got to speak to stressed that the free market will not solve this problem. When a market fails to self-regulate, and when this failure leads to social and economic harms, that is precisely when we expect governments to act and to step in. But many of the guests this season pushed me to go further on this, to think more broadly about what this means. Victor Pickard argued that the failure of journalism is not just a market problem that can be fixed by better competition policy, which, interestingly, is what another guest this season, Rod Sims, argued for in Australia. Victor argues we can't just fix the market problem and hope that journalism will be saved, because journalism is a systemic market failure. There is no business model for journalism, he argues, that can sustain its democratic responsibility.

[CLIP – Big Tech, Season 3, Episode 16]

Victor Pickard: You know, for purist neoclassical economists, they can just say, well, we just need to do a few little tiny, uh, you know, nips and tucks here, it'll be fine. And I do, I think that's the wrong way of looking at it. I think clearly we're seeing something that is irredeemable, especially for providing local journalism. We don't need to shore up these commercial models. We need to create some kind of public alternative here. Otherwise we know what's going to happen: we're going to keep watching the market drive journalism into the ground, really making the case for these radical arguments that this is what happens if you just leave it up to the market.

Taylor Owen: Ethan Zuckerman argues for a civic model of the internet: that we need to think about the internet not as a purely capitalist free-market enterprise that provides goods and services for users, but as something that provides civic goods, public goods, to citizens. And if we do that, he argues, we need to consider public options for some of these tools.

[CLIP – Big Tech, Season 3, Episode 12]

Ethan Zuckerman: That's the spirit of digital public infrastructure. The spirit of digital public infrastructure is first and foremost that the internet is too important to leave up to the private market. The private market is obviously going to have a place in it, but you can't assume that the market meets all of your needs. This whole argument about media is that argument, right? So maybe that's the case for social media. Maybe that's the case for search. Search strikes me as an incredibly powerful and potentially dangerous thing to leave purely up to the market. Maybe that's a place where we need sort of a public alternative.

Taylor Owen: And Naomi Klein goes even a step further. She argues that some of these digital services have become so entrenched in our lives, so critical to our lives, that there's an argument for nationalizing them.

[CLIP – Big Tech, Season 3, Episode 11]

Naomi Klein: Increasingly during the pandemic, we've seen tech companies, like, use the pandemic as a backdoor to privatize kind of untouchable public goods like education. I started to feel like we have two options. We either throw up our hands and say, "there is no commons," right? We've allowed it all to be enclosed, we've allowed it all to be privatized. Or we start talking about some pretty radical ideas about nationalizing some of this infrastructure.

Taylor Owen: And on a note about that conversation with Naomi Klein: we ended our conversation about the climate crisis and the existential threats of big tech in a slightly different spot. And that was a conversation about how both of our kids, who are very similar ages, use technology in some really wonderful ways.

[CLIP – Big Tech, Season 3, Episode 11]

Naomi Klein: My son has very specific YouTube obsessions.

Taylor Owen: Yeah, mine too. Origami is my kid's.

Naomi Klein: Origami?

Taylor Owen: Origami and magic tricks, card tricks. He's deep in the rabbit hole of, uh, both subcultures of YouTube.

Naomi Klein: So my kid is obsessed with electric guitar tutorials. Not playing it, but upgrading electric guitars, like pickups and things like that. He has this one guy, um, who he's just completely obsessed with, and he starts like vibrating when he knows that he's going to be dropping a new video. If Daryl takes a week off, it's just a huge incident.

Taylor Owen: I have the same thing with this random guy in the Midwest who does origami tutorials. And this guy, he does not know the influence he has in our household.

Naomi Klein: But I must say, I think it's kind of amazing. Like, we just watched crappy superhero cartoons. Um, and this has made him want to actually do incredible things. Like, my eight-year-old son now knows how to solder thanks to Daryl. So I think, um, I'm not sure, I might think it's great.

Taylor Owen: Both of them get something very meaningful out of the people they follow and watch on YouTube. And we both recognize that this is something really powerful and wonderful that we can't necessarily provide to our kids alone. Which makes getting this right all the more important. If we recognize the power of these technologies and the tremendous positive impact they're having on our kids, our lives, our societies and our democracies, and we want to ensure that those good things remain, then we need to figure out how to govern them in ways that mitigate the harms. It cannot be a trade-off that we have to accept. We do not have to accept all of these potential risks and harms just so we as societies can get the benefits these technologies afford; we can have both. And I think many of the people I've spoken to this year provide pathways for how we can have both, for how we can govern the internet and digital technologies in a way that maximizes their benefits and minimizes their harms. So I look forward to continuing these conversations next season. We have a lot planned for next year. I'm excited about that, and we'll be back in the fall to continue these conversations.

Big Tech is presented by the Centre for International Governance Innovation and produced by Antica Productions. Please consider subscribing on Apple Podcasts, Spotify or wherever you get your podcasts. We release new episodes on Thursdays every other week.

For media inquiries, usage rights or other questions please contact CIGI.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.