Sasha Havlicek on Mitigating the Spread of Online Extremism

S1E10 / March 12, 2020

Companies, international organizations and government agencies are all working to identify and eliminate online hate speech and extremism. It’s a game of cat and mouse: as regulators develop effective tools and new policies, the extremists adapt their approaches to continue their efforts.

In this episode of Big Tech, Taylor Owen speaks with Sasha Havlicek, founding CEO of the Institute for Strategic Dialogue, about her organization and how it is helping to eliminate online extremism and hate speech.

Several issues make Havlicek’s work difficult. The first challenge is context: regional, cultural and religious traditions are a factor in defining what is and what is not extremist content. Second, there isn’t a global norm about online extremism to reference. Third, jurisdiction presents hurdles: who is responsible for deciding on norms and setting rules? And finally, keeping up with evolving technology and tactics is a never-ending battle. As online tools become more effective at identifying and removing online extremism and hate speech, extremist groups find ways to circumvent the systems.

These problems are amplified by engagement-driven algorithms. While the internet enables individuals to choose how and where they consume content, platforms exploit users’ preferences to keep them engaged. “The algorithms are designed to find ways to hold your attention … that by feeding you slightly more titillating variants of whatever it is that you’re looking for, you are going to be there longer. And so that drive towards more sensationalist content is, I think, a real one,” Havlicek says. These algorithms contribute to the creation of echo chambers, which are highly effective tools for converting users to extremists.

Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us.


Sasha Havlicek: We forget with extremist movements that violence is just one means to an end. And the end, the purpose, is essentially social and political change, and that end can be pursued through political and social means as well. And we see a lot of energy and a lot of effort being spent in the extreme-right space in terms of getting some of the most rabid racist concepts and conspiracy theories to be normalized as part of public discourse and political discourse.

[MUSIC]

Taylor Owen: Hi, welcome to the Big Tech Podcast. I'm Taylor Owen. Again, I'm solo this week. This is going to be the second of our episodes that I recorded from Dublin, Ireland, where I was attending the International Grand Committee on Disinformation and Fake News. As I explained last episode, it is a committee of parliamentarians from around the world who are looking at how they can better coordinate their efforts to govern the internet. And one of the thorniest issues when we talk about platform or internet governance is hate speech, and really the whole range of potentially harmful speech, from just rude behaviour right through to the incitement of violent extremism, that is enabled by the way we communicate digitally. And the challenge we face is that some of this speech bumps up against the way we currently regulate speech in society, which was really designed for a physical world. At its core there is an irreconcilable tension: governments need to choose whether they're going to prioritize protecting citizens' ability to speak, their freedom of speech, or protecting the rights of those who are endangered or harmed by that speech. And these are in tension with one another. There are very few people in the world better positioned to talk about that tension and that governance challenge than Sasha Havlicek. She's the CEO of the Institute for Strategic Dialogue, a London-based think tank that works on countering violent extremism, mostly in the digital space. But she has spent her career working on much broader areas of conflict resolution, the incitement to extremist violence globally and, increasingly, digital operations in elections, doing some really pioneering work on monitoring and measuring harmful speech in election periods. So we're lucky to have the chance to speak to Sasha today, and here is my conversation with her.

[MUSIC]

Taylor Owen: Hi Sasha, welcome to the Big Tech Podcast.

Sasha Havlicek: Thank you so much for having me, Taylor.

Taylor Owen: Thanks for doing this. We are here at a conference on tech regulation, and a big piece of that is this challenge of regulating harmful, or hate, or extremist content on the internet and on platforms right now. It's one dimension of this much bigger range of issues, but it seems to be a core one, and it's one that's in the news daily now. And you've been working in this space for a long time, so I wonder, just for some framing: how do you define extremist, or hate speech, or harmful speech online? When that comes up, what are you talking about?

Sasha Havlicek: We have a very specific definition of extremism at the Institute that we use, but I'm not sure that that's really what you're asking. What you're interested in is how do we go about finding this stuff online and what is the response that we need to have?

Taylor Owen: Well, finding implies defining, right?

Sasha Havlicek: Hate speech I think is quite clearly defined. And you're talking really about abuse directed at a group based on protected characteristics. And as you know, of course there is illegal hate speech that is addressed by law, but often not particularly well in practice. And so we see this proliferation of illegal hate speech, but also of the tail of that, which is an enormous amount of hateful, abusive typed language, slurs and so on.

Taylor Owen: So, not necessarily illegal, but harmful in some way to the discourse.

Sasha Havlicek: Correct. That's nasty, that can be harmful, but that isn't classified as illegal. Now, we look at extremist content, and we define extremism as a system of belief that posits the superiority and dominance of one in-group over all other out-groups, which is de facto antithetical to the application of universal human rights. So, extremist groups are those that proliferate these ideas of supremacy.

Taylor Owen: And that's core to it, is the supremacy element of it?

Sasha Havlicek: Correct. It is antithetical to any form of pluralism and it is a call to action. It is the advocacy of systemic change in order to reflect that worldview of supremacy of that in-group. That dynamic of in-group/out-group isn't just a belief, but it is propagated with a view to changing the system.

Taylor Owen: There's an evangelism to it, almost, via the speech act?

Sasha Havlicek: Correct. Exactly. Again, we've been looking at the propaganda machineries of extremist groups across the ideological spectrum.

Taylor Owen: Can you give us a range? What's that band of actors, the groups?

Sasha Havlicek: It's very, very big, but, you know, we came to this looking at extreme-right and white supremacist and Islamist extremist content. But of course you find extremist content on the far left of the political spectrum. You find it in certain religious contexts. Extremism is expressed in many, many different forms within many different geographic and communal environments. So nothing really new there. What's new is the rate, the scale, at which those types of ideas, and those movements of course, can spread those ideas now in the digital era. And we saw in a way with ISIS, I think, the single most effective global marketing phenomenon of our time. And so within a year you saw ISIS really becoming the global phenomenon that it was, and it was a digital phenomenon on mainstream platforms. What we've seen since is a massive clamping down on specifically the Islamist extremist space.

Taylor Owen: And that was a government clamp down, right?

Sasha Havlicek: Yes. You saw governments in the West, but really then governments around the world being mobilized, to make this a top priority. They put a lot of pressure on the companies essentially to self-regulate and to remove this content. That's where those big conversations started around pushing the companies to do content moderation and removal at scale and to do that through AI.

Taylor Owen: At the time they were partners on that, to a certain degree. The platforms were being forced, they were doing it under the threat of regulation, but they were engaged in that.

Sasha Havlicek: Well, I think at the outset there was quite a bit of pushback around the request for automation of removal.

Taylor Owen: They didn't want to do that?

Sasha Havlicek: Well, there were quite a few months, maybe even a year, of discussions around how that simply wouldn't be possible. And there was quite a bit of broader pushback. But very quickly we've gotten to a place where, I can't remember exactly what the percentage is, but it's 99-ish percent of that type of terrorist or violent extremist content now being removed, not through flagging, not through government flagging systems or civil society flagging systems, but through the automated systems of the companies themselves.

Taylor Owen: Is there anything particular about that content that makes it easier for automated systems to identify rather than this other stuff we're talking about now?

Sasha Havlicek: Yeah. Well, I think in reality there's much more nuance and much more gray to terrorist and violent extremist content than people would like to really address.

Taylor Owen: It's not all violent acts being circulated.

Sasha Havlicek: Correct. So whereas you get the sharp tip of the spectrum, the violent and branded content, in many ways the most effective propaganda is the less violent, the nonviolent, and often non-branded content that we've seen. And much of that was extremely effective, because after all these are movements that are inviting people to come in based on a sense of common purpose, camaraderie, brotherhood, sisterhood, nurturing, and so on and so forth. It isn't just violence. And so there's a very, very broad spectrum of terrorist-supporting content out there, and indeed some of that still goes unaddressed. The challenge with the far right is that it is a networked movement that doesn't sit on the UN proscribed terrorist list, and so it lends itself less easily to the measures that have been taken around Islamist content. As a result, this type of content is much more available. The tragedy in Christchurch has, I think, brought home the need for a major policy and operational response to the proliferation of white supremacist and far-right content.

Taylor Owen: So, do you see a continuum, then, from the types of tactics and tools that were being used in these more extreme movements to what's now being almost mainstreamed into our politics?

Sasha Havlicek: Irrespective of the differences in the ideological frame, essentially, they do more or less the same thing, which is they speak to different types of audiences in fairly targeted ways in order to bring people progressively more and more onside. And they do a lot of work to undermine opponents and to de-legitimize opponents. So this of course is common across the board. They are more or less present on different types of platforms. Their information operations take slightly different shapes and forms. We see increasingly around the far right the development of an alt-media network, which is extremely active and dynamic, much more public facing, much less hidden in many ways than the Islamist space.

Taylor Owen: In fact, it's the opposite. They're trying to be as visible as possible, right?

Sasha Havlicek: Yeah, and the objective there really is around mainstreaming. So I think they know how to skate that line. They understand the line between legality and illegality.

Taylor Owen: Acceptability and unacceptability. Where that threshold is.

Sasha Havlicek: Exactly.

Taylor Owen: What's the difference between mainstreaming and radicalizing?

Sasha Havlicek: Increasingly we forget, I think, often with extremist movements that violence is just one means to an end. And the end, the purpose, is essentially social and political change. And that end can be pursued through political and social means as well. And we see a lot of energy and a lot of effort being spent in the extreme-right space in terms of getting some of the most rabid racist concepts and conspiracy theories to be normalized as part of public discourse and political discourse, and achieving that quite effectively.

Taylor Owen: So, moving to what's happening now: you work both on these more extreme alt-right communities and certain radicalization efforts, but also on broader disinformation campaigns, what we'd broadly put in this bucket of foreign interference, or manipulation, or media manipulation in various ways online. One of the things, we just had this election in Canada. And you were working on it. And we were working on it to a certain degree. And I don't think we saw this acute problem in the way that I think it probably has existed in European countries. And I don't-

Sasha Havlicek: Or that we saw in 2016.

Taylor Owen: Exactly. So from 2016 on, eight or so elections, a real pattern of the types of things you're talking about. I'm not sure we really saw that here, but what we did see was this real degrading of the discourse. We saw it very siloed, very polarized, more radical. Every group was a bit more extreme. And I'm wondering how you make sense of that. What's happening in our ecosystem?

Sasha Havlicek: So, in the disinformation space, we've seen a real evolution since 2016. We've been looking at malign information operations and how they're organized, who the lead actors are, and what the targets and the tactics are. In 2016, and just after, everybody of course was alive to this Russian threat and looking for it. And ever since, I think everybody's been looking specifically for that type of-

Taylor Owen: A very particular type of thing.

Sasha Havlicek: Correct. And in fact most government agencies, international agencies, that have been set up to look for this kind of information manipulation in electoral contexts are looking for state actor, foreign state actor, interference. In reality the actors are much, much more complex than that. And you see an intersection of foreign state and non-state actors, transnational non-state actors. We saw, for instance, in the German elections an interplay, a reinforcing, of the type of content that was being put out by the international alt-right and Kremlin sources. But increasingly what we've really seen is a whole spectrum of transnational non-state actors, special interests from the US, from other countries, specific religious groups, interplaying with domestic extremist groups and domestic political actors in this space. And so that bleed becomes a very, very challenging thing for governments to respond to, as well as for the companies really to take on. The other thing, I think, important to say: we've seen an evolution away from the most obvious fakery. Fakery of content. Fake news, this idea.

Taylor Owen: Clear websites that are publishing fake information and distributing them.

Sasha Havlicek: Exactly. And fakery of distribution. So the most obvious.

Taylor Owen: Inauthentic bot activity.

Sasha Havlicek: Inauthentic bots. Which is not to say that we haven't found that type of activity. We do find it, we've found it quite systematically across these elections, but it's much smaller in scale, and the overall impact of that I think will be fairly limited.

Taylor Owen: One of the things that happened in Canada in the election is there was speculation that there were American Trump supporters infiltrating Canadian hashtags, because it looked like coordinated activity and they were all using the MAGA tags in their profiles. When we looked at it more closely, it actually looked like it was just-

Sasha Havlicek: People.

Taylor Owen: People. So they were real people. They were Canadian. They were conservatives. Obviously they were Trump supporters. They were using the MAGA label as a signifier of an ideology, probably, not as anything other than just saying they broadly agree with that movement. And they were just behaving, speaking regularly in public on the platform during the election. Yet they were all behaving in a very similar way, using the same kind of language pulled from other countries. So that's just a degradation of the discourse or something. What is that?

Sasha Havlicek: Correct. I think there's also the stoking of organic extremism, and if you invest in that and stoke those networks and build that up over time, it is very effective. But increasingly I think we need to be wary of the fact that state actors can in fact also buy networks of people.

Taylor Owen: People, yeah.

Sasha Havlicek: Real people, to do certain types of things. And as we look at a widening array of state actors involved in this space, we need, I think, to constantly understand that these are small costs for governments, and it may in fact be much easier for them to do than the inorganic, inauthentic activity that can be spotted now. So there is this blend of things happening, and it's very problematic, because in terms of responsiveness, you see governments in a position to do something about clearly defined foreign state activity. But if they're seen to be snooping on their own citizens who are expressing a political opinion, it becomes genuinely problematic. And we've seen outcries in relation to that type of government activity in a number of countries.

Taylor Owen: As we start talking about solutions to this and the policy framework in which this is being discussed, I feel there's this desire for solutions to what might be irreconcilable tensions in this conversation. I think there is potentially a divide, a tension, between protecting people from harmful speech online and protecting people's ability to speak absolutely freely online. Those just might be in irreconcilable tension with each other. Which ultimately means this is a political conversation. This is going to require citizens and people to make a political choice about what kind of speech they want in this digital space. Is that a fair characterization?

Sasha Havlicek: There's an element of that, but if we reduce this, in terms of the online world, to a conversation about free speech versus censorship, essentially the removal of hate speech, I think we're not addressing the fundamental challenge. In reality, the real harm, in my mind, in relation to what's happening on the platforms today around hate speech and polarization is derived from the technological architecture of these platforms and the way in which that inorganically amplifies extreme messaging and content. It ultimately drives people into spaces that they otherwise may not have been driven into and holds them there, without them understanding really fully and transparently why. And the ability of those spaces, those cultural ecosystems, to impact people's world views is extraordinarily strong, I think. Whereas a piece of speech here or there may not in fact do the same thing. And so we-

Taylor Owen: So, by reducing it to an individual act, we're losing sight of the structure.

Sasha Havlicek: Correct. In an ideal world, I would say, you out-compete bad ideas, of course, with good ideas. And in an ideal world we would have a level playing field of speech in the digital world. These platforms would be genuinely neutral, and you would then be able to educate and mobilize civil society responses, for instance, in more innovative ways. Stuff that we've tried to do at the Institute from the outset. We saw these problems and thought, why are we allowing this gap to develop between the activities of bad actors and those that would legitimately seek to undermine those ideas and propose alternative ideas? We must be doing this. And so I'm all for competing. But you can't compete effectively, of course, if the playing field is tilted. And it is tilted, because of this technological architecture. And so until we shift the conversation beyond content moderation and removal, which has been by and large the focus of government pressure and policy, and indeed now regulation, to date, and start to address these structural, systemic issues, we won't address the fundamental problem.

Taylor Owen: So, I would argue that it's not just the regulatory bodies that are focusing on content moderation. It's also the companies themselves, because they're about individual acts of speech; the unit they want to prioritize is individual agency on their platforms. So if you're talking about individual pieces of content, then that's getting away from this structural conversation, which they don't want to have.

Sasha Havlicek: No, they don't want to really have that. It's a harder conversation to have because it represents a harder set of solutions, ultimately. We've pushed very hard for an approach by policymakers towards regulation that would really prioritize transparency. And again, around that dichotomy between free speech and censorship, which I think is a false one, you've seen civil society actors come in to what I think is an unhelpful conversation.

Taylor Owen: And the digital activists in particular are in real conflict over that, right? Organizations sitting on both sides of that.

Sasha Havlicek: Correct. Whereas actually I think the conversation really needs to be about redressing the imbalance in this space and ensuring that we do in fact have a space in which free speech can happen.

Taylor Owen: What does that look like? What are the core levers, or the core incentives, that you think are creating the imbalance in the way speech flows in these systems?

Sasha Havlicek: These are platforms designed to hold your attention as long as possible, in order that they can also use that attention to advertise against. So the algorithms are designed to find ways to hold your attention, and they have worked out essentially that by feeding you slightly more titillating variants of whatever it is that you're looking for, you're going to be there longer. And so that drive towards more sensationalist content is, I think, a real one. Except we see this in an anecdotal sense, and there is anecdotal-level research, but what we desperately need is access. There needs to be access to data by third parties. It doesn't necessarily need to be the research community; it could be a regulator. It could be government.

Taylor Owen: We do that with other sectors.

Sasha Havlicek: Correct. But what we do need to understand, essentially, is the algorithmic outcome from a public health perspective. What is the impact of that design in relation to how it affects specific communities or constituencies, for instance? What is the public health outcome of the design of the algorithms as they currently stand? That has to be the first step to having the conversation around what we then do with that. So, number one, having access to that.

Taylor Owen: Transparency around-

Sasha Havlicek: Exactly, access to that type of data, to be able to verify what the impact of that algorithmic design is right now. Then I think there can be a conversation around how one might go about tweaking that, and whether the companies should be required to do that in certain contexts. In some contexts I think we would say no, and in some we would probably opt for yes. And in democratic environments there should be a public debate around what that response should be. There is also a big job to be done around data mining and usage, and transparency around the way in which your data is utilized. And here, again, I think there's some interesting work that was undertaken in Germany by the German Cartel Agency, the federal agency.

Taylor Owen: Which is quite the name, isn't it?

Sasha Havlicek: It absolutely is.

Taylor Owen: I think competition bureaus around the world are jealous of that name.

Sasha Havlicek: Yeah, exactly. That's their central antitrust authority. It declared the consent that a user gives Facebook to use and combine data across the various platforms that Facebook owns in Germany a fiction, because of the monopoly status of Facebook in that context. Now, this was appealed, and Facebook actually won the appeal, so it hasn't gone anywhere. But I think increasingly there are going to be questions asked about the way in which we hand over our data, voluntarily for the most part, and what we really understand of that process in terms of how that data is used and how it impacts the world that we see at the end of the day. So I think there's a lot of work to be done in that space. A lot of those structural conversations really need to happen. And this isn't even getting into the bigger antitrust conversation.

Taylor Owen: This is just, we're just talking about the content side of it right here and the way content is circulated.

Sasha Havlicek: Exactly. It's really about that information ecosystem.

Taylor Owen: So that's the organic piece of that, how these algorithms are pushing seemingly organic content to us and not to us, whatever the case may be, in defining what we see and whether we're heard and whether we are seen. But there's also the paid and promoted and targeted aspect of that too, which I assume is a piece of the puzzle here: the ability to actually surgically target people with particular messages.

Sasha Havlicek: Yes, exactly. Yeah. And I think that ad transparency is another piece of what we've been proposing in terms of a response. From an electoral standpoint, I think it's perhaps slightly less important than people make it out to be, but nonetheless desirable. So political advertising and issue-based advertising transparency I think is absolutely critical.

Taylor Owen: What does it say about democracy if we exist in an information environment where everybody knows different things and those things are targeted to their biases?

Sasha Havlicek: The idea of information segregation is frightening to me, and we've started to see this in a number of ways, and we see the type of impact that has on communities. We started to see it actually with the onset of the satellite era and the ability of people to watch only, I don't know what-

Taylor Owen: Cable news that aligns with their ideology.

Sasha Havlicek: And now of course this is so much more of an issue in the digital era. We saw in the German context that people who were voting AfD were imbibing their information solely from alternative online outlets. Those who were voting for mainstream political parties were still imbibing information from mainstream news sources and so on. So you start to see this segregation. We see it in America in a massive way, where it's almost impossible to think how you would penetrate that bubble now. What types of means are available for you to penetrate into that echo chamber? That's very, very dangerous. And it speaks to acute polarization in ways that I think ultimately degrade our democratic civic culture. Without an ability to talk to each other about difference, there is no possibility for democracy to actually thrive and survive.

Taylor Owen: And it's not just polarization. Everyone wants to bucket it as ideological polarization, but it's not just that. It's polarization of everything we know. On any one issue, we know different things. And that's-

Sasha Havlicek: Exactly. I think it's a much bigger problem than just the political space, but it affects the political space in such an obvious way today that we require a response, and it needs to be quick. So I would say I'm a big proponent of thinking in big-picture terms around our digital policy strategy for the future, because I think we need to be really actively thinking about what our vision is for the internet, for a good internet. What is the vision that liberal democracies have for the internet? Because all sorts of countries around the world have their vision and they are pursuing that vision quite aggressively and actively, and we have none for ourselves. And this is important. We also need to be alive to these threats in the here and now, so that we don't find ourselves essentially with too few players around the table to have the leverage to make this happen.

Taylor Owen: Yeah. And those are difficult things to do at the same time, right?

Sasha Havlicek: Yeah.

Taylor Owen: Just to end, I guess, on a slightly, maybe, more optimistic note: if it does need to be a civic conversation about the kind of information environment we want, what does that look like? How do you see that mobilizing? Do you see it mobilizing? Are we starting to see a tipping point where people are actually aware and pushing back against the very nature of the internet they live in?

Sasha Havlicek: I think that there is a growing consciousness among people in many, many, many places now, certainly in many Western European countries today, of the harms of this digital era. And whether they see it from the perspective of the harms to their children, the more visible harms of terrorism, the disinformation, the subverting of our democratic systems, in a way doesn't matter. It's become the issue, in a way, of our time, and I think there's a much bigger appetite than policymakers understand, so far, to see bold responses to this and to see leadership in this space. The internet also offers us a tool like no other, of course, and social media in particular, to reach out and build constituencies in favour of certain things. We see good actors being particularly poor at doing this social mobilization. We really do need to see much more of that happen. But I would say there's low-hanging fruit in the following context, in my mind. One is that existing law isn't being applied and enforced effectively online. If we were just to do a scoping of existing law in liberal democracies, look at the obstacles to having enforcement happen in the online world, and think innovatively about making sure that we overcame those obstacles, we'd be in a very, very much better place than we already are. Number one. Before we even start to build out laws for the internet, new laws for the internet-

Taylor Owen: And reimagine everything.

Sasha Havlicek: And reimagine everything. If we were then to start to head into this much more structural conversation, beyond content moderation around specific siloed issue sets, terrorism in one camp, hate speech in another, and really start to think about these structural issues, again, we'd potentially be dealing with the problem in leaps and bounds. We'd see systemic changes in a very small space of time. And in the longer term we want to see better engineering practices and a culture of responsible engineering.

Taylor Owen: What's being built.

Sasha Havlicek: Exactly. What's being built and what is the impact of it? Lots of stuff is being built now without really thinking through the ultimate consequences. I think now we're in a place where technologists themselves and Silicon Valley are saying, oops, we probably needed to have thought this through a little bit further. Now that attitude and that perspective need to be built into the way stuff is built going forward.

Taylor Owen: Our ability to do that is dependent on understanding the system. You've been at the centre of that for a long time, so thank you and thanks for talking to us about it.

Sasha Havlicek: Thank you.

[MUSIC]

Taylor Owen: So that was my conversation with Sasha Havlicek, the CEO of the Institute for Strategic Dialogue, and the conversation was recorded in Dublin, Ireland. Thanks for listening and as always, let me know what you thought of today's episode by using the hashtag #BigTechPodcast. I'm Taylor Owen, CIGI senior fellow and Professor at the Max Bell School of Public Policy at McGill. Bye for now.

[MUSIC]

Narrator: The Big Tech Podcast is a partnership between the Centre for International Governance Innovation (CIGI) and The Logic. CIGI is a Canadian non-partisan think tank focused on international governance, economy and law. The Logic is an award-winning digital publication reporting on the innovation economy. Big Tech is produced and edited by Trevor Hunsberger, and Kate Rowswell is our story producer. Visit www.bigtechpodcast.com for more information about the show.

For media inquiries, usage rights or other questions, please contact CIGI.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
