Ron Deibert On Resetting Our Relationship with Technology

Season 3 Episode 3

The systems that underlie modern technology have gaping vulnerabilities that are being exploited by nations around the globe to maintain power and exert control.

S3E3 / December 22, 2020


Episode Description

However you use telecommunications technology — and billions use it for everything from routine daily tasks and entertainment to seeking help, sharing confidential information or organizing civil actions — your communications are all running on decades-old network protocols with gaping vulnerabilities that can enable cybercrime and security breaches. High-risk individuals and organizations, in particular, are vulnerable not only to surveillance but to targeted retaliation by autocratic states that use these security holes to abuse their power. But democratic countries have also exploited these weaknesses, for example in law enforcement.

In this episode of Big Tech, Taylor Owen speaks with Ronald J. Deibert, founder and director of the Citizen Lab at the Munk School of Global Affairs & Public Policy and the author of Reset: Reclaiming the Internet for Civil Society. Citizen Lab has worked for many years monitoring communication networks for state-run surveillance. Their 2018 report Hide and Seek: Tracking NSO Group’s Pegasus Spyware to Operations in 45 Countries uncovered how mobile phone spyware has been used to target individuals, including Saudi Arabian journalist Jamal Khashoggi. Deibert believes that we need to rethink how telecommunications equipment and protocols are built, to ensure privacy and security. Until we have these safeguards, malicious actors, whether states or private individuals, will continue to hack the vulnerabilities in the communications ecosystem, leaving citizens unsafe, and civil society to suffer.

Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.

Ron Deibert: I believe that we live in the golden age of surveillance; there's more data available to them than ever before. We're going through this great leap forward in policing capabilities thanks to the fact that everyone carries around with them these devices that constantly leak data, most of which is insecure.

Taylor Owen: Hi, I'm Taylor Owen, and this is Big Tech. Over the past 20 years, we've spent a lot of time thinking about the internet and the way it's shaped, well, just about everything. Nowadays, it's almost become common knowledge that social media is affecting our democracy, or that smartphones are changing the way we think, or that digital technologies can entrench the powerful. But 20 years ago, very few people were thinking about the internet this way. An exception was Ron Deibert. In 2001, Ron founded the Citizen Lab at the University of Toronto. While other scholars were studying the way the internet was being used, Ron focused on the actual, physical infrastructure of the internet itself: the data centres, the satellites, the fibre optic cables, and the devices in our homes and pockets. He was influenced by Harold Innis's notion of materiality, the idea that the physical properties of our communication systems actually shape the nature of our communication itself. Ron's primary focus has been on how this system is being exploited by bad actors, typically governments trying to quash political unrest, in autocracies and democracies alike. In 2001, the cyber espionage that Ron was studying would have looked like something out of a James Bond movie: it required a lot of money and sophisticated technology. But sometime in the late 2000s, all of that began to change. With the emergence of social media, we all started living more of our lives online: we were shopping online, watching TV online, and dating online. And in many ways, that made our lives more convenient, but it also made us a lot more visible and more vulnerable. All of a sudden, we were living in a world where governments didn't need to engage in spycraft to snoop on political dissidents; they could just exploit our new digital ecosystem. We often think about societies as being fundamentally democratic or autocratic, but Ron thinks this is the wrong framing.
On the internet, the tools and even objectives of dictatorships and democracies are often more alike than we might want to acknowledge. All of this is front and centre in Ron's new book, Reset: Reclaiming the Internet for Civil Society, which he also delivered for this year's Massey Lectures. The book draws on his 20 years studying the internet and summarizes the harms we now face: surveillance capitalism, the abuses of state power, addictive technologies, and the environmental costs of it all. To be honest, it's a somewhat bleak picture of the present moment. But he also has a pretty optimistic vision of the future, a future guided by some ideas from our political and philosophical past. Here is Ron Deibert. Ron Deibert, welcome to Big Tech.

Ron Deibert: Oh, thank you for having me, Taylor. It's a real pleasure to get to talk to you, especially about these topics.

Taylor Owen: Look, I've been so looking forward to this. I really love the book and it gives us this just wild glimpse into your world. I mean, I know generally what you work on but just to hear some of these stories is really remarkable. And actually, I want to start with one of those moments before... I don't want to get into all the policy stuff. But the one moment in this book that just totally took my breath away was the story you tell around Khashoggi's killing.

[CLIP]

BBC Reporter: After a fortnight of denials, Saudi Arabia has admitted that the missing journalist, Jamal Khashoggi, died during his visit to the country's consulate in Istanbul earlier this month.

SOURCE: BBC News YouTube Channel https://youtu.be/KG1JVetxPnQ

“Jamal Khashoggi case: Saudi Arabia says journalist killed in fight - BBC News”

October 20, 2018

Taylor Owen: And I wonder if we kind of start there and you can describe what happened there.

Ron Deibert: Yeah. So the way that we entered into this horrible episode goes back to Citizen Lab's research on targeted espionage. We've had a longstanding research interest in this area, which is basically nation-states hacking civil society, and they do it in various ways, one of which is to contract out to private companies for very specialized and sophisticated spyware technologies. And there are a number of companies that do this. Over the years, the team at the Citizen Lab has developed really refined methods that give us visibility into some of these companies' infrastructure, to the point where, with one particular company, the company in question here, which is Israel-based NSO Group, we can see a lot of what's going on. We can pretty much infer who the clients are and we can see a good deal of the targeting in some circumstances. But in this particular case, in the spring of 2018, we had been monitoring targeted espionage using NSO's infrastructure and we were preparing a large report that was going to kind of profile all of what we could see, basically to put a picture in front of people of, "Look, this is the scope of the type of surveillance that we're seeing, some of the clients."

Taylor Owen: And nothing to do with Saudi Arabia at the time, this was just global operations?

Ron Deibert: Global operations, although we did know that Saudi Arabia was one of their clients. When we were looking at the Saudi Arabia targeting, we could see that there was an infected device in Canada. And of course, this really stood out for us in part because we're based in Canada. But also, you'll remember around this time there was a pretty high-level diplomatic dispute between our government and Saudi Arabia. Our foreign affairs minister and prime minister were both highly critical of Saudi Arabia's human rights record and record towards women.

[CLIP]

Chrystia Freeland: We will always speak up for human rights, we will always speak up for women's rights around the world.

SOURCE: CBC News YouTube Channel https://youtu.be/L_U2yvhE5OI

“Chrystia Freeland defends Canada's stance on Saudi Arabia amid sanctions”

August 6, 2018

Ron Deibert: And Saudi Arabia had engaged in this kind of social media campaign against the government. Do you recall there was this tweet that they put out from their official account showing planes going into the CN Tower? It's just-

Taylor Owen: That's right. In response to Freeland's tweet.

Ron Deibert: Exactly. Yeah. So, looking at the targeting in Canada, we thought, "Oh, man, let's find out who this is." Which is really a shot in the dark because all we could see from our vantage point was this particular infected device. So kind of on a whim, I said, "Let's just see if we could figure out who this is." And on our shortlist was Omar Abdulaziz. We knew of him because at that time he had a pretty high profile YouTube account-

[CLIP]

Omar Abdulaziz: Foreign language

SOURCE: Omar Abdulaziz YouTube Channel https://youtu.be/8bAhy8cY7T8

October 6, 2014

Ron Deibert: ... followed by like 500,000 on a regular basis. The way I described it at the time, it was kind of like a Stephen Colbert show of the Gulf region, basically this satirical take making fun of MBS, Mohammed bin Salman.

[CLIP]

Omar Abdulaziz: Foreign language

SOURCE: Omar Abdulaziz YouTube Channel https://youtu.be/8bAhy8cY7T8

October 6, 2014

Ron Deibert: So, we met Omar and looked through his SMS messages. It's a bit different now how NSO works, it's become much more sophisticated. But at that time, the mechanism, the vector by which they got spyware onto your phone, was by sending you a shortened link that triggers the infection; they put malware on your device and silently take it over. From our vantage point, we had pretty much mapped all of the domains that they had registered in the shortened links. So once we looked through Omar's SMS messages, we got positive confirmation that he was indeed targeted with one of these SMS messages. And we said, "Omar, do you remember clicking on this?" It was a few weeks in the past at this point. And it turns out Omar is a guy who likes to lift weights, and so he had coincidentally ordered protein powder from Amazon on that very day, and then this fake text message came in from the Saudi intelligence operators saying, "You have a DHL courier package coming, click on the link." So he clicked on the link, and that's how his device was infected. And there's a kind of paradox involved in doing this sort of outreach, because we assume they've hacked his phone, that Saudi operatives are listening in on everything that's going on, and we're meeting with him. We hoped ultimately to be able to capture the spyware as we've done in the past, but obviously, us meeting with Omar alerts them to it. And the spyware is quite sophisticated, they can actually remove it silently. So-

Taylor Owen: Once they know it's been discovered?

Ron Deibert: Yeah, anytime they can do this. So it's very stealthy and is designed in such a way as to evade forensics. Anyway, we said, "Okay, we know who it is, we've got positive confirmation, let's write up this report." It wasn't until October that we published it, October 1st. And the very next day, October 2nd, was when Jamal Khashoggi was executed. And up until that point, I didn't realize that they were colleagues. Omar didn't mention it to me until that morning, when I was in Europe at a conference and I got a text message, I'll never forget this. I looked down and it was like, "I'm really freaking out." There was a lot of... How would you put it? It was an intense day because our report was covered prominently in the global media. And of course, with Saudi Arabia spying on Canada, police were calling us wanting to know who the target was, and Omar was concerned about law enforcement. There were just numerous things going on simultaneously, and then this comes into the mix, and of course Khashoggi is missing, and as the days go by, we understand that he was executed. So it was only after the fact that I realized, oh my God, they had been communicating over what they thought was an encrypted platform, discussing these highly contentious plans.

Taylor Owen: And you mentioned that he chose not to go into the Saudi embassy in Ottawa right around the same time. I mean, it's just amazing how close to home that hits when you think of someone at a Canadian university possibly being murdered in an embassy in our capital. I mean, it's just-

Ron Deibert: It's unbelievable. Yeah. And I learned that because I did an interview with Omar while we were writing up the report. He didn't mention Khashoggi at the time, but he did tell me that not only did they come to Canada and try to persuade him to go to the embassy in Ottawa, they also brought his brother along and held him out as this putative threat. Like, "Okay, if you stop your YouTube stuff, come back to Saudi Arabia, we'll forgive it. You'll make lots of money, everything will be cool. If you continue it, your brother here is going to be in trouble." And of course, they went back to Saudi Arabia. His brother and other friends, maybe other family members as well, are in jail, perhaps even to this day.

Taylor Owen: Geez! So, I want to talk a little bit about what this example, this very particular example, tells us about one of the core themes of the book, which is the abuse of power using technology. I think there's a real risk in some of these stories of thinking this is just autocratic regimes using one bad technology to do the bad things we generally know they do. But really, what you described here is a much vaster infrastructure that is being built in both democracies and autocracies, and it involves corporations, like McKinsey in this example, that are part of these weird webs of intelligence. I wonder if you could describe a little bit how you look at that broader surveillance technology infrastructure and how it can exist in all these different political systems for all these different purposes.

Ron Deibert: Yeah. This is something that not only really interests me, but the more I dig into it, the more I understand how precarious the ecosystem is upon which high-risk civil society individuals and organizations depend for the work that they do. So if you step back for a minute and you look at, putting aside social media, just the entire technological infrastructure, security has largely been an afterthought, first of all. You have these legacy systems that were invented in some cases decades ago, when you look at telecommunications technologies and the protocols that underlie it all, that have kind of been cobbled together, is the way that I think about it. Of course, on the surface, it kind of works well, functions actually remarkably well if you look at what we're doing right now, but there are all these negative externalities and gaping vulnerabilities. So when you insert malicious actors into the equation, they can exploit them. Under normal circumstances, that creates risks for average people of fraud, cybercrime, et cetera, but when you add into the mix highly resourced malicious actors being serviced by dark PR firms, well-equipped surveillance vendors who know how to work their way through the labyrinths, the catacombs of this ecosystem, it really is a disaster. And that's why one of the themes of the book that I tried to get across, and in fact in all of the work that we do, is that there is a real crisis in global civil society right now because of the flawed infrastructure upon which it relies. And we're seeing it daily. I mean, the cases of abuse of power happening worldwide are just mounting. And it's certainly one of the most serious crises of liberal democracy, in my opinion: this hollowing out, this neutralizing, of civil society. And it's not just the tangible cases, the worst being of course murdering somebody or gathering incriminating evidence and putting them in jail; it's the fear.
That's the big thing that I notice, a psychological consequence. Spending a lot of time with high-risk people all over the world, you see now everyone is afraid to communicate; they don't trust the technology. That's like throwing sand in the machinery of civil society, and it's really insidious, right? It's kind of, "Oh, maybe somebody is watching me and I better not respond to that email." Everything slows down. Yeah, it's a big, big, big problem.

Taylor Owen: It feels like with this book particularly, you want to go far beyond just the high-risk individuals. And even just in that, in the descriptions of the abuses of power, the range is pretty broad: everything from Saudi Arabia's murder of journalists, to Uyghur detainment in China, to Amazon Ring on our doorsteps and police access to that data. And liberalism also runs through this conversation pretty clearly. Are you worried that these technologies we're building are just fundamentally illiberal, regardless of whether they're being built in democracies or autocracies? Are we building an illiberal technology infrastructure?

Ron Deibert: I don't know if I'd say technologies have particular ideologies attached to them. I think what happens is we create something, it's designed for a specific purpose, and then it ends up having consequences that we didn't expect. When I was adding up what I call the painful truths around the internet and social media, to me, it's almost like interlocking gears in a giant machine, starting with surveillance capitalism and all the dynamics about that, leading to the toxic public sphere, or the outcomes that we see for the public sphere, and sensational, extreme discourse being prioritized and pushed out, and how that creates opportunities for dark PR and propaganda and disinformation to flourish. Everybody is aware of that now, I think. But also the insecurity. All of us carry around these things now with us all the time, in our pockets, and they're designed in such a way that inherently they're insecure. It's almost as if that's intentional, because you have at root this business model, which is about applying more and more sensors in more and more applications and devices to find out more and more about what you do, and those who can exploit that end up being given an advantage. In my view, one of the biggest overlooked consequences of the digital revolution, let's call it, is the way in which it is creating kind of states on steroids, and in particular, this type of what I call superpower policing. So, I'm thinking in particular about local law enforcement, mostly in western industrialized countries, who are now being equipped with all sorts of products and services that enable them to do all of these things at the same time that we are, thanks to surveillance capitalism, turning our digital lives inside out. So it's like the capabilities of law enforcement are increasing exponentially, but the safeguards around them have remained more or less steady, and in fact, I would argue, have been gradually eroding. So we have this accountability gap.
That to me is not really... Specialists understand that, but I don't think the general public is grasping what this means for the architecture of liberal democracy. The constraints and safeguards around the abuse of power, which are, in my opinion, the heart of liberal democracy, cannot be taken for granted. They are social constructs, they're institutions, and they can be easily ignored, sidestepped, or eroded over time.

Taylor Owen: On the liberal democracy piece, I was surprised to a certain degree that you didn't have in one of the harms chapters democracy itself, or democratic integrity. Did you just think that runs through the entire project in the sense, that this is about democracy? Or is this...

Ron Deibert: I do see it running through, but especially pronounced in the chapter on the abuse of power, which I ended by trying to remind readers that abuse of power is inherent to human societies, unfortunately, human nature being what it is. And those who have studied abuse of power have remarked upon this, that it's a natural tendency. People who have power will tend to abuse it, and we need to create safeguards against it. What I am trying to convey throughout this book is that you have these interlocking mechanisms that are combining to erode liberal democracy, which itself, historically, is both rare and fragile. You're a political scientist by training, as am I, and this is something that I think technologists don't always appreciate: this is not something that has descended upon us and will be here as a fixture forever. It's a human creation and it can easily be subverted. And it also depends on the material context. As I was researching, especially republicanism, classical republicanism, and people like Montesquieu, right? This understanding that geography, technology, material context really matters for whether liberal polities will be viable. Well, we're going through this phenomenal transformation in the material context within about the span of a decade, really, more or less.

Taylor Owen: I mean, that's what kind of struck me, is that you begin the book talking about materiality and particularly Harold Innis and how he sort of puts all of this weight in the infrastructure itself and how that drove you to try and understand the infrastructure. But that was 20 years ago. Right? Now infrastructure has gone through this just tremendous evolution since then and it feels like you're talking about it in a different way now. And even just the use of social media as kind of this overarching framework for a lot of this discussion feels like that's something new, right? We have a different infrastructure than we did 10 years ago, as you said. Do you think that's right?

Ron Deibert: Kind of, although one thing I'm struck by, and I think I mentioned I want to start focusing on this moving forward, is the legacy systems that underlie it and that won't disappear. For example, telecommunications. While we're all focused on social media right now for good reason, there are SMS messages going around threatening people. And one of our researchers at the lab is doggedly tracking these down. And part of the reason something like that can happen is because telecommunications networks were designed at a time when most of the companies were largely state run and there were very few of them. So they had these kinds of protocols that they developed, and technological infrastructure that more or less was a reflection of that arrangement. And it hasn't gone away, it's just kind of persisted. NSO Group, the spyware company that we're tracking, has now developed a no-click exploit. So they don't need to send that SMS message with the shortened link as bait any longer; they can simply ring up a phone and take it over, any phone in the world. The reason that they can do that is because underlying the communications ecosystem is this inherently flawed protocol system that goes back to the 1970s, if not earlier, that hasn't been fixed and really cannot be fixed. It was built for a time prior to everyone having instant messaging applications on mobile devices.

Taylor Owen: And so is this social media conversation distracting us a bit from this broader infrastructure governance conversation we need to be having? Are we too quickly abandoning some of these... the more long-lasting infrastructure debates?

Ron Deibert: Well, it's hard to avoid the social media topic because it's so prevalent and dominant, it's all around us, and I think they're connected in a way too. People like Shoshana Zuboff and others have done a terrific job really underlining for us the dynamics of the business model of surveillance capitalism and social media, which I think are at the heart of a lot of the problems. But there is this underlying ecosystem of inherent insecurity, is the way that I think about it, that complements those dynamics in important ways. And so we can't solve one without looking at the other, in important respects.

Taylor Owen: Now, the discourse in this space, especially in the mainstream media, typically revolves around the western world and Silicon Valley tech companies, but there is another just-as-influential player here: China. For a long time, people thought that the internet, like capitalism, would be a democratizing force in China. That hasn't happened. And China is actually exporting its own surveillance technologies to other illiberal regimes. It's a big part of the current slide towards illiberalism that we're seeing around the world. How do you engage with that narrative? What's China's ambition here? Because we've been told for decades that there was no imperial ambition.

Ron Deibert: Yeah. It's a big and important topic for sure, and a really disturbing one too. And going back to when I first got into this area, after the Berlin Wall fell, I was actually in East Berlin and I was training at that time to be a Sovietologist. That's where I thought my career... And when the Berlin Wall fell, I was starting a PhD program at the University of British Columbia and my supervisor, Mark Zacher, said, "You should look at the information revolution."

Taylor Owen: Little did he know the two would re-emerge 20 years later, right?

Ron Deibert: It's true. Yeah, good point. So I was like, "Okay. Yeah, that's great. I'm going to look at this." I kind of bought into the belief at that time that there is no way... I was looking at it kind of structurally. How can these rigid, monolithic dictatorships and authoritarian countries withstand this lightspeed, distributed infrastructure? It will be impossible, and there's all sorts of ingenuity and ways that it's empowering civil society. And so, actually, when I then was hired at the University of Toronto and started the Citizen Lab, the idea was to kind of test that proposition, and China was our main focus originally. And what I began to realize was that, okay, this is not such a simple equation; they are developing the great firewall and filtering technologies. It was only more recently that I began to see, okay, wow, this is really now developing into something where you have state surveillance capabilities but also very lucrative business opportunities that are combining in this kind of dystopian mix. And I think it gives a glimpse of what things look like when you remove altogether any institutionalized checks and balances against the abuse of power. The system is not perfect. One of the findings of our research at Citizen Lab is that in fact it's highly distributed: censorship and information control generally is passed down to the private sector, and it makes people a lot of money, and business is a key part of what's driving a lot of the ambition to export a lot of these technologies. So the combination of all of that explains why we're seeing this incredible spread of Chinese technology, especially in places like Sub-Saharan Africa. And I think there is obviously, layered over top, a geopolitical kind of strategy around, "Let's align a lot of these countries with our interests to make sure that we can count on them when it comes to things like votes at the United Nations as well."

Taylor Owen: One of the things that often frustrates me about the policy debate in this space is how fragmented it is. Because big tech influences so many aspects of our society, the policy agenda, everything from privacy law to trust busting, can seem vast and disconnected. Ron thinks that we need some sort of guiding principle to shape this agenda, but he doesn't think we need to reinvent the wheel. Instead, he draws on one of the oldest political traditions around, liberalism. In particular, the liberal idea that we need to build restraint into our political and economic systems. For Ron, restraint is the key to this whole thing. So how did you come to this guiding framework and how would you describe it, of not just restraint, but how it's situated within a much deeper political tradition? And why is that important do you think?

Ron Deibert: Like you, I sort of look at this, and part of it is a frustration that you see all of these proposals and it's kind of like there's an incomplete foundation to them. And also, of course, just friends that I speak with, people who are not academics but still very intelligent, thoughtful people, will always ask, "Well, what should we do about Facebook? Should I give up? Should I unplug?" I get these questions all the time and I'm like, "Oh, how do I answer this?"

Taylor Owen: It's a daily challenge.

Ron Deibert: Daily challenge. And so it's, "You must know, just because you vaguely do stuff technology related. I've got a question for you," like-

Taylor Owen: Solve Facebook!

Ron Deibert: ... Facebook. What should I do? So, what I wanted to do here, especially given the format of the Massey Lectures, which is meant to be for a popular audience, right? I knew that I didn't want to put forward a series of recommendations, right? Like, "Okay, here's recommendation number 22." Right? It would turn people off. But more importantly, I wanted to remind people. I felt like there was a kind of amnesia about what people are advocating for when they're talking about antitrust, what they're advocating for when they're talking about algorithmic accountability or the right to repair or even unplugging and detaching. My background is in political theory. I'm not a political theorist, I wouldn't pretend to be, but I have spent a lot of time reading the classics of political theory. So it's not like I'm inventing something new; I merely wanted to remind people that, A, there is an underlying philosophical framework that we can apply here, and B, it's not something new, it's something that humans have thought about for centuries, and it's helped them navigate particular challenges, especially when the material context is changing or when they're thinking about how to prevent dictatorship from emerging or ensuring that there's equity and fairness when it comes to people's rights. And this is a tradition that goes back to ancient Greece. The more I thought about it, the more it all came together and made sense. But especially with the liberal republican tradition and its central importance in the founding of the United States, I knew that it would be quite provocative at this time to actually put that out there and say, "The most fulsome articulation of these principles was done by this group of people at this time." No doubt about it, there are all sorts of flaws around the characters and the circumstances, but at the heart was this experiment, and wouldn't it be nice to remind people about that?
So anyway, that's how that came about: simply, okay, there's a lot that we can work with here, both in terms of... Restraint is at the heart of republicanism as I see it, and also this idea of civic virtue, which I think is often overlooked. We have this reflex right now to blame social media, to expect them to police content and discussion in the public sphere, when in fact there is an underlying obligation, I would say, among users, consumers, and citizens ultimately to think about their own behaviour. And that's something that just doesn't come out of nowhere, it's something that has to be cultivated through public education. And I think it's one of those things we've kind of lost sight of, for a variety of reasons, over the last... really the last century, but especially in the last few decades.

Taylor Owen: Is there a challenge of placing this responsibility in liberal institutions at a time when trust in them is in decline in part because of the technological infrastructure that politics is being done on and in?

Ron Deibert: Yeah, no doubt about it. I mean, we're witnessing it as we speak and there's been so much corrosion on multiple levels around those public institutions. And also a variety of interlocking causal factors, is how I think about it, right? You have neo-liberalism and deregulation and all that entails; you have this trend, which is connected to it kind of tangentially, to emphasize within public education and universities engineering, science and mathematics at the expense of the arts, humanities, social science and civics; and then of course the primary means by which we communicate and exchange information is fundamentally premised on a business model that intentionally surfaces extreme, sensational content. Those combined create the outcome that we see here. The exhaust from the machine that I'm talking about is all of the negative externalities that we see on a daily basis.

Taylor Owen: So I wanted to ask you about a couple of the maybe more challenging aspects of this policy agenda. I mean, as everything gets clustered together in all these ideas, one of my other frustrations is that we treat them all as kind of equally solvable, when really, when you look at that menu of policies you laid out, which includes dozens of things, some of them are probably quite easy and can be done with a degree of political backing and probably should be done tomorrow. And others are these just fundamentally vexing issues that could take decades or could never be resolved, and maybe the harms just need to be minimized. I wanted to talk about a few of those. One is on encryption. So, you have argued for a long time, very powerfully, about the value of encryption and its critical importance both for preserving the integrity of our infrastructure and for preserving individual rights. And we seem to be at a place now where a lot of our communication is moving to more encrypted, or at least more private, spaces, and we're going to get increasing government pressure to look into those spaces. How do you see that debate playing out when on the one hand you have people saying, "Look, WhatsApp is a real problem," and on the other hand, you have a lot of very important political activity that happens within it?

Ron Deibert: Yes. Those who come forward and say, "I've got a simple answer for this," are really, I think, misleading a lot of people, because there is no simple answer, it's a complicated issue. There are equities involved and there are real challenges for law enforcement and intelligence agencies to do the work that they do. I know this from our own experiences because we are doing work that's very similar in nature and we often come across instances where we are stymied by things like that. It prevents us from seeing who's doing something nefarious in the same way that law enforcement would have trouble dealing with very serious criminal issues like pedophiles and so on, terrorism. My view is that, first of all, the debate is often incorrectly portrayed as security versus privacy when in fact it's about two different versions of security, and we cannot sacrifice one for the other. Instead, I think we have to recognize that law enforcement needs to pivot and state agencies need to pivot. And by that, I mean they need to change their orientation and how they go about doing what they're doing. And furthermore, this idea that you often hear, this rhetoric of "going dark" because of encryption — I believe that, on the contrary, we live in the golden age of surveillance. There's more data available to them than ever before. As I explained in the book, we're going through this great leap forward in policing capabilities thanks to the fact that everyone carries around with them these devices that constantly leak data, most of which is insecure. So it's a bit of a ruse really to say that encryption is preventing us. What they're really saying is, "We want even additional benefits that we don't have right now and we are willing to weaken everybody else's security in order to do that." Part of the reason I can say this confidently is because of the work that we do at the Citizen Lab.
If we are able to track the most powerful nation-state operators and the world's most sophisticated surveillance companies to the point where I can literally say to you with a high degree of confidence right now who's using that spyware, which countries, and in which countries there are infected devices... And we're a small research group at the University of Toronto. We're not some superpower agency. That suggests to me that there is a lot that can be done if you-

Taylor Owen: The NSA might have at least as much visibility.

Ron Deibert: I'm sure they've got this all under control, which is in part why you often find the NSA not advocating for weakening encryption — in fact, the opposite. They understand that. We are now, I think, dependent on a planetary network, is the way I think about it. We're moving in this direction in spite of ourselves. And of course it's a long process, but it's a one-way street. If we're going to survive the challenges of climate change, we're going to need a highly secure infrastructure through which we exchange information, store it securely, and so on. Encryption is the only way to do that. So when it comes to criminal behaviour and challenges around national security, we have to think of alternative ways to deal with those problems.

Taylor Owen: So, one thing I was really pleasantly surprised to see is this focus on the environment and climate change, which you almost never see in this kind of book about the potential harms of technology. But I found it striking that there wasn't a big conversation about the solutions to those sets of problems in the governance piece. And I'm wondering if that's because they lie elsewhere. Is there anything about these that is technology-based, or are these just problems with our supply chains and capitalism and a lack of global regulation on the environment? Should we be thinking about these as a technology-specific issue?

Ron Deibert: I think they're connected to technology. The syndrome that I see, and some others see, around the environmental impacts, the ecological footprint of big data and the machine-based civilization that we've created, is very much connected to the current business model, which has for a long time been built around consumption and planned obsolescence. That's not new, right? But with surveillance capitalism, it's been amplified a hundredfold. So to the extent that we can solve some of the problems around surveillance capitalism, that will contribute, I believe, to some of the solutions that are required around the environmental impacts of technology and big data. Hopefully we'll slow things down a bit. And it's connected also to things like the right to repair. If you look at the conversations, most of them around the right to repair, they actually don't talk about it in relation to waste. It has to do more with, "Hey, I bought this thing, I should be able to fix it." And also proprietary issues, digital rights management. To me, the right to repair is as much about planned obsolescence and working against that. I really saw that with the trip to India that I talk about. I went there specifically knowing I wanted to write about this topic. I was doing other things as well. But I made some trips to the recycling centres and it was remarkable to me, this kind of industrial effort to take things apart right down to the last screw. There were a lot of problems with what's going on there, but there are also some lessons to be learned about how we think about this stuff. And all of it connects back to this idea of, "We don't want people looking too much under the hood." This is pervasive.

Taylor Owen: And it's not just with hardware, right? It's with algorithms and with what data is collected and what's not and how it's used and how it's sold. Right?

Ron Deibert: And institutions. Hey, don't look at-

Taylor Owen: Absolutely. Yeah.

Ron Deibert: So all of this has to... it connects back. Once you start talking about, "Well, you should give people the ability to actually fix their equipment," then it opens up the possibility of more curiosity about what's going on inside those systems, where all the technology leads. So you're not presented instead with this virtual mirage. What we're doing right now seems like magic, but it's not magic, it's very physical, right? As we are speaking, somewhere between you and me, there's a data centre that's chugging along and it's drawing enormous energy.

Taylor Owen: And water.

Ron Deibert: And water, exactly, and all of this stuff. This thing is... I feel like I just bought it last year. I probably have to upgrade now to another one, and the battery is dying. Right? There's a culture around more, faster, that I think we need to revisit, and it's all part of one piece.

Taylor Owen: And that perception of magic empowers the people who control the systems.

Ron Deibert: Totally.

Taylor Owen: I mean, that's why we see all of this glorifying of tech workers and of government powers that use these technologies we don't understand.

Ron Deibert: Exactly. And techno-solutionism. "We have an app for that. Trust us, we know how to do this." It's true. But you are right, I could have... one of the... A bit frustrating for me was, once I got into that last chapter, I realized, okay, I can only be superficial around a lot of these things. There's not enough space, otherwise I'm going to have one big mammoth chapter. And I knew these were lectures, they had to be five lectures. So I am planning — I'm working on it as we speak — a subsequent book on this very topic, more or less. Driving a lot of what I do is a recognition of the existential risk around climate change, mostly. Technology is critical to solving that, and right now we have this dysfunctional system that is actually making things worse.

Taylor Owen: Oh, God, I'm glad to hear you're working on that. I'd love to hear more about that. I have five other vexing challenges to talk to you about, but we obviously don't have time. So I'll just end it with one final piece, I guess, which is, how do you see the public views on this changing? Do you think we're in a different moment now and there's going to be more appetite for this kind of pretty big regulatory reform we're talking about and changes to how we govern? Are you seeing a public appetite for that in your work?

Ron Deibert: One that I haven't seen in a long time for sure. It's counterbalanced by all sorts of other things, especially the pandemic. I'm really concerned about the ways in which big tech and the existing infrastructure, which is highly insecure, poorly regulated, invasive by design, and prone to abuse-

Taylor Owen: And we're all living on it 24 hours a day.

Ron Deibert: We're just so much more dependent on it. I'm worried about that. But at the same time, as never before, I think there is a moment where you can have these conversations. Even the fact that the idiot Donald Trump is tweeting about Section 230, it opens up a conversation: "Okay. Well, what is Section 230?" Most people don't know. Right? Of course, you and I know, but the general public — now, to have them talking about...

Taylor Owen: Isn't it wild to see sort of multiple, hours-long YouTube shows debating Section 230?

Ron Deibert: Totally.

Taylor Owen: It's remarkable.

Ron Deibert: And as you said, and you obviously recognize — earlier you said that it's incredibly complicated to look at any one of these. And there is one, right? Like, "What the heck?" You start opening it up, it's like a Pandora's box. Right? And we have to kind of tiptoe around it. It's also US regulation. What do we as Canadians do? How do we think about this whole topic? I'd love to talk to you about that because I know... I'm a Canadian, but honestly I'm not an expert in the Canadian system and I would love to talk to you more about how you see this all unfolding in Canada. What levers do we have? What's realistic? What could we do in this country differently?

Taylor Owen: Well, to be continued, and we'll take that one offline as they say. All right. Well, thank you so much for doing this. I really appreciate it.

Ron Deibert: Thank you so much, Taylor. And thank you for having me on your show.

Taylor Owen: That was my conversation with Ron Deibert.

Big Tech is presented by the Centre for International Governance Innovation and produced by Antica Productions.

Please consider subscribing on Apple Podcasts, Spotify, or wherever you get your podcasts. We release new episodes on Thursdays every other week.

For media inquiries, usage rights or other questions please contact CIGI.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.