This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.
Joan Donovan: We shouldn't be so fixated on a future with Facebook that we are afraid to create the rules. We need to rein in power and money.
Taylor Owen: Hi, I'm Taylor Owen, and this is Big Tech. It's only been two weeks since the siege on the Capitol. But for Donald Trump, these past few weeks must have felt like a lifetime.
Wolf Blitzer: We've just witnessed a truly solemn moment in American history. The House of Representatives has reached the threshold for making Donald J. Trump the only president of the United States to be impeached for a second time.
SOURCE: CNN YouTube Channel https://youtu.be/9wsj6fntPak
“See moment Trump got impeached for second time”
January 13, 2021
Taylor Owen: This seems to be the final nail in the coffin for Trump, many of his supporters and the GOP have turned against him. He's been impeached for a second time and he's finally, finally been kicked off of social media.
In addition to Twitter, Trump has been banned or restricted from Apple, Facebook, Instagram, Snapchat, Google, Amazon, Pinterest, TikTok, YouTube, Reddit, Twitch, Stripe, Discord, and Shopify.
SOURCE: The Tonight Show Starring Jimmy Fallon YouTube Channel https://youtu.be/h71BuL86VUw
“Trump Banned from Twitter | The Tonight Show”
January 11, 2021
Taylor Owen: At first, Trump's deplatforming prompted a collective sigh of relief. After four years of using Twitter to incite violence, stoke conspiracies, and disseminate disinformation, kicking him off the platforms just felt like the right thing to do. Among conservatives, there was a predictable First Amendment backlash to this, but there were other people expressing their discomfort with these moves as well. People who rarely agree with the commentators on Fox News, like Angela Merkel. Some have falsely interpreted Merkel's remarks as a call for unbridled free speech. But really, she's just calling for government regulation of speech, and she's not alone. In fact, I think the Trump bans actually distract from the systemic failures they represent, as well as from the real solution: democratic governance, not self-governance. Few people understand this better than Joan Donovan. Joan is the research director of the Shorenstein Center on Media, Politics and Public Policy at Harvard University. She began her career looking at how white supremacists used online DNA tests to prove their purity, and in the years since, she's become one of the world's foremost experts on online misinformation. Joan spends a lot of her time embedded in the dark corners of the web to see how misinformation spreads, and in the months leading up to January 6th, she had been following the Stop the Steal conspiracy and knew something big was coming. We spoke about the attack on Capitol Hill, what it tells us about our media ecosystem, and how to govern it. Here's Joan Donovan.
Taylor Owen: Right before the insurgency, you tweeted, "Protest is a crucible. Today, we will witness the full break of the MAGA movement from representative politics." I wonder if we could start by just talking a bit about how you knew this was going to erupt in the way it did. What did you watch over the past few months that made you really know what was going to happen here?
Joan Donovan: So, for the last decade, I've studied social movements. And if you study social movements, you can tell three to six months into the future what the possible potential outcomes are if movements continue behaving the way they have been. As Stop the Steal started to coalesce into a street movement, it had rallies and was bringing together people like Ali Alexander with Alex Jones, with Nick Fuentes. If people don't know him, he's an heir to the alt-right of Richard Spencer; he's an online streamer. As these influencers started to show up in public space, they started to show up at Capitol buildings, and they were showing up at state legislatures. This was happening in tandem with the carnival of continuous performative litigation that Rudy Giuliani had put on display.
Taylor Owen: So this is in the moment, within the weeks after the election?
Joan Donovan: The weeks after the election. But you have to understand that all of this is really about media manipulation. It's about creating the conditions by which people feel as if, if they don't do something, nothing will change. The most important moment in a movement is when they feel that there is nothing left to do institutionally, that there is no avenue forward by which they can win, and that the only other option is chaos and violence. And movements can reach that level of desperation in two ways. One is through calls for accountability. I'm thinking here about Black Lives Matter and all of the different ways in which activists have tried to make police accountable for years in LA. I was part of the Black Lives Matter movement of activists in LA who would wake up at 6:00 AM every Tuesday to go and give police accountability testimony at the police oversight committee, just time and time again trying to push the institutional levers. And when all of that fails, movements become very chaotic, because it's very hard to harness that despair and grief. The second way it can happen, though, is exactly the way that we saw, which is through disinformation. And if you can create the appearance that all avenues have been litigated and exhausted-
Taylor Owen: Right, because that never happened. Right? The election was an institutional pathway.
Joan Donovan: Exactly. But if you have 60 court cases, with people online saying every day that they're trying and Trump is being thwarted, then this notion of seizure takes hold, this sense of, "We tried the institutional way and it didn't work." We see that get performed by Trump on January 6th, as he gives a very boring speech,
Donald Trump: Our country has had enough, we will not take it anymore. And that's what this is all about. And to use a favourite term that all of you people really came up with, we will stop the steal.
SOURCE: Ruptly YouTube Channel https://youtu.be/os0Gp6lTYqU
“REFEED: Trump to speak at 'Save America' rally”
January 6, 2021
Joan Donovan: And saying, this amount of people in this state, and this thing over here, and we tried, and we tried. That's just a reflection of the media people had been consuming online, which made it feel as if the election was being stolen from this man who represented them. He's literally the embodiment of them and their beliefs. As I was watching all of those street protests, and watching what was playing out online around Stop the Steal, it became very apparent to me that they were done messing with Mitch McConnell and hoping that Ted Cruz would save the day. Those that went to the Capitol were very clear that they were going there to save Trump. And based upon the tactics of the past, where they were trying to disrupt the certification of electors, trying to disrupt the vote, it was just very obvious to me and other researchers that something was going to go down at the Capitol.
Taylor Owen: One of the ways you've researched this, and the way you describe these phenomena, is through life cycles of media manipulation. And I think that's really important, because it treats the ecosystem as the diverse place that it is. Does that framing fit the Stop the Steal campaign, broadly defined? And if it does, can you maybe walk through a little bit what these phases are and how they played out in this case?
Joan Donovan: Yeah, the model's fairly simple. It says that there's going to be a moment where there's an opportunity, and campaigns will be conceptualized, and people start to plan them. The second phase is when they start to seed that campaign across the web and social media. It could begin in a chat room. It could begin on a message board. It could begin in an office park in St. Petersburg. There's a conceptualization process that says, "This is the opportunity," and then you start to see the content being layered online in different forums and in different ways. And then, crucially, there's who responds. We have so many models of disinformation campaigns that get littered across the internet and then nothing happens. Importantly for journalists, they have to realize that in stage three, if they choose to respond to something, they may actually trigger more attention and more amplification of it. For years, research in our field has really focused on the role that journalists play in amplifying disinformation. The fourth stage, for the first couple of years of doing this research, was relatively absent, which is mitigation. It wasn't just platform companies refusing to take action; it was really hard to get people to call things disinformation or misinformation at all. I'm thinking here particularly about the use of Facebook to spread Holocaust denial. A couple of years ago, Zuckerberg came out and said, "Hey, listen, I know the Holocaust happened, but if somebody else doesn't know, that might just be a mistake, and we shouldn't take that down." And so, mitigation is something that we look at very closely, because mitigation actually determines how manipulators adapt and change their targets.
Taylor Owen: And Zuckerberg of course, changed that policy.
Joan Donovan: He did, he did eventually.
Taylor Owen: It tells us many things about the whole system, but it also prompts a different strategy by people who wanted to deny the Holocaust.
Joan Donovan: Exactly. He really couldn't understand the fact that people, en masse, in groups, use that belief to mobilize each other and to bring more, and different, and more toxic Holocaust denial into the world. Not to get too sociological, but it's about the replication of these worldviews and these ideas. What social media companies fail to understand is how important their technology is to that repetition and replication. But when it comes to Stop the Steal, it was pretty clear over the past few months that Trump was going to claim, by any means necessary, that the election was being stolen and/or rigged. His first line of attack, of course, was against mail-in ballots. But Stop the Steal as a campaign had been online prior to that. Roger Stone had registered the domain before 2016, anticipating a Trump loss and a big moment to try to mobilize these MAGA folks. So, the groundwork for Stop the Steal actually existed prior to November 3rd. But on November 3rd, you start to see a constellation of folks come together around Stop the Steal, and a group starts growing so fast on Facebook that Facebook eventually shuts it down. There were 330,000 people in it when I checked in on it. That spread is digitally enabled by the algorithmic recommendation systems, as well as by networks that Facebook had tried to remove, these QAnon networks that had figured out how to, quote unquote, "go camo," as they call it, and ingratiate themselves in other spaces on Facebook. I keep joking that Donovan's first law of disinformation is that if you leave disinformation to fester long enough, it'll infect the whole product. QAnon is a good example of that. So Stop the Steal is being pushed across all of these different networks, and then the mitigation strategy by Facebook, of course, was to try to stop people from joining one very large group, which forced this networked, distributed model of local Stop the Steal groups to happen. 
But through it all, not only are these companies trying to mitigate it, but then you have politicians stepping in, utilizing that response as a tool to make people think that they're being silenced and suppressed in some way. And this forces these groups to move into other spaces, most particularly Gab and Parler. But it's not the case that they're just using Gab and Parler. I want to dispense with this idea that somehow they move off of Twitter and Facebook and are just siloed.
Taylor Owen: Which is the idea that one QAnon ban moves an entire community off Facebook, which didn't happen.
Joan Donovan: Exactly. It doesn't happen that way. And we have to not get caught up in the corporate logic of the walled gardens here. We have to realize that people use multiple platforms at the same time. And so, as we were thinking about this and writing it up, it's looking as if this is the biggest disinformation campaign that we've seen spread through the internet. I can't think of anything bigger. I can think of things that are definitely disinformation that have happened in the past. The claim that Saddam Hussein had nuclear weapons was a lie that was told to justify intervention. But this one is very different, because it's using the logics and the appearance of protest to serve Donald Trump, who's the sitting president. He's not a challenger, and in that way he's not an average citizen, and he's not someone who's just having his accounts taken away from him. If the president of another country were doing this kind of media manipulation, the UN might step in.
Taylor Owen: You mentioned the media framing here, and the role that amplification plays in the media. A couple of years ago, in the lead-up to the Canadian election, you were kind enough to join us for a training session we were doing for Canadian journalists on how they should cover, or not cover, disinformation in the election. And I remember you saying, in response to a question, well, maybe you shouldn't cover it. Maybe this stuff just shouldn't be talked about. That's almost common knowledge now in the research community, but in a room full of journalists it doesn't go over well. It challenges their core value proposition and sense of identity, which is that we uncover truths so they can be given the light of day. Right?
Joan Donovan: No, and that's very true. When I give that advice about thinking twice about coverage, I'm very specifically talking about extremists and white supremacists, with the Proud Boys contingent of the MAGA coalition as a really important example here. Which is to say, there are different ways of covering them. danah boyd and I have written about this notion of strategic silence, which is important in the sense that it's strategic: you cover things using your perspective, your words, and by bringing into the fray people who are impacted by white supremacist violence and rallies. But you don't uncritically hand the microphone over and say, "Well, tell me what bothered you as a child that left you open to these ideas and made you think that you're superior." Because they're going to start from the very get-go by saying, "Everybody says we're a white supremacist organization, but really we're not. We're just guys who like to drink, and like our women in the kitchen, and don't think immigrants should come. What's wrong with that?" And it just gives a whole different platform to their ideas. That's what's happening right now: you have extremists trying to get journalists to cover their different protests. And some of them are six guys in Idaho at the Capitol, looking around and thinking, "Well, if we can get media coverage, maybe we can get a groundswell of support." We have to actually approach this by saying: we have power, and the things we call attention to matter. Therefore, when we cover these things, we have a duty of care to our audiences to provide context and explanation. And I'm giving this advice to journalists in the midst of their editors saying, "You can't use the term white supremacist, you can't mention white nationalism," even as, in the AP guidelines, things started to shift around calling people the alt-right. 
So, when we're discussing this, it's really about this moment in journalism where there's a battle between the editors and shot callers who are asking journalists to tell these stories, and the journalists who are really caught off guard by the tactics these groups are using in order to get attention. That's why it's important that we understand this relationship. Now, what happens when Trump gets involved is of a completely different order of magnitude, because by virtue of being the U.S. president, he sits at the centre not just of media in the U.S., but of international media. You can't ignore that, but you surely have to not play into what he, Giuliani, Steve Bannon, and others had really tried to organize, which was a mass disinformation campaign: first against mail-in voting, then when they tried to plant the Hunter Biden laptop story and The New York Times, The Wall Street Journal, The Washington Post, and others decided they were not going to cover it in the way that Trump media was trying to position it. And then we get this iteration of Stop the Steal, which really just gets wilder the closer you look at it.
Taylor Owen: That's why framing it as this almost slow build of the big lie is so telling. They knew they needed that trigger of a big, significant wrong that could mobilize people, and they tried all these strategies to make that take hold. The platforms responded at the very end of that process by cutting off access. Was that the right time for them to do it, or where do you stand on the debate about when the platforms should have cut off accounts?
Joan Donovan: I think we needed a set of rules that we never got, related to platforms and the responsibility of politicians. Different users make different uses of the same technology. Which is to say, if your social media use is largely political, you are a politician: you use social media strategically for political wins. If you're a marketer, though, and your goal is to make money, then the strategic use of social media is about incentivizing people to buy your product. Different users have different use cases, and then also have different incentives for using social media, and bending it to their will has a lot to do with how flexible the rules are. But Facebook did a very particular thing, which is that they created a carve-out for politicians. They said they were not going to apply the rules to politicians, because they wanted people to see politicians for who they were, quote unquote, "warts and all." And this, I think, was a strategically bad decision, because what they failed to account for is that when someone has political power, they also have network power. When you have power in a network, you can direct people to do different things. You can organize large-scale protests. For many, that has been a virtue of the internet: relatively anonymous individuals can use the internet, and technology, and infrastructure to coordinate, organize, and plan large-scale social movements. And we rewarded that use case with lots of praise over the years, everything from Occupy, to Black Lives Matter, to Standing Rock. It's been immensely transformative, but in the wrong hands, for the wrong ends, it's a different technology entirely. It's a different infrastructure.
Taylor Owen: It seemed like Twitter, in their post about the ultimate account ban, signaled that intentionality of users. They said, "Look, we're not just looking at what he said. We're looking at how he wanted it to be interpreted, and how it was received and interpreted by others." Is that a move in the direction you're talking about, where when we look at speech, we need to look at-?
Joan Donovan: Yeah. We have to look at the context and the consequence. The rubric of incitement is important here, in the midst of the siege on the Capitol. We focus on the deaths, but how many people were injured? How many reporters were punched and kicked and dragged to the ground? How is our government going to come together across party lines ever again, knowing that Republicans were in favour of not certifying the electors? These are existential questions, but the damage they cause stems from that moment I described earlier: Trump was airing the grievances and then told people they had no other option than to make the Capitol hear them.
Taylor Owen: I think the focus on Trump's accounts and the takedown of the most egregious content ends up being pretty distracting in a lot of this conversation. Some of this stuff is clearly illegal. It should be taken down. There's not a lot of debate about that; it's about capacity and scale and the ability to do it. The far bigger problem, as you alluded to, is the 99 percent of harmful but not illegal content that ends up creating that network effect that causes harm. How do you think the platforms are dealing with that right now? And where do you feel their approach is going wrong? Because they're clearly not managing it effectively.
Joan Donovan: I hesitate to give any free consulting these days. My attention is much more on what everybody else can do, which is to say community-based organizations and civil society organizations. Now is the time for a strong academic, scientifically informed, empirical look at where we went wrong, and at what other levers of power need to be brought in to correct for the enormous power that these companies have been able to amass by virtue, at least in the United States, of Section 230. Section 230 is largely a policy of deregulation. It says: there's going to be this industry, we want to open it up and really reduce impediments to innovation by getting rid of liability for these companies. In many ways it creates a path dependence for these companies, who then see content, and the generation of content, as a way to bring networks together. And that is what they then monetize: the network ties between people. I don't see a world in which we don't look at those network ties and the money that is generated off of them, off of the data they generate, off of the side markets around advertising, and say, "Okay, if your platform serves 500,000 people, you need a content moderation policy that ensures that incitement, harassment, hate, and pornography are not going to be profitable." And include disinformation in that, especially as a carve-out for politicians, who will have to have different rules of engagement on these platforms going forward. Which is to say, I'm disappointed, everybody, that Napster doesn't exist anymore. I've still got a lot of those MP3s, and if any lawyer wants to come for me, come for me, but none of them are Metallica. I can promise you that. I like some hard rock, but I bring that up to say that we've been here before.
Taylor Owen: These things come and go.
Joan Donovan: They come and go, and we shouldn't be so fixated on a future with Facebook that we are afraid to create the rules we need to rein in power and money.
Taylor Owen: And fetishizing their power just further entrenches the status quo. Right?
Joan Donovan: Yeah. What do you think about Facebook saying, "We're acting on real-world harms"? This notion that somehow the internet, and what happens on Facebook, isn't the real world. And there are always some interesting connections. I don't know if you know this, but one of the people behind Perverted Justice, the group that brought the To Catch a Predator TV show into the world, Del Harvey, is in charge of Twitter's trust and safety team.
Taylor Owen: I didn't know she was part of that. Wow.
Joan Donovan: She was part of those groups, yeah. It's interesting to think about. She comes from this group where she's one of the people who actually goes to the door and tells the predator to come on in, and who, before that, spends a lot of time online trying to enroll predators in this honeypot. And when I think about that connection, I think about this notion of real-world harm, which becomes very individualized. It becomes about individuals in their offline worlds being attacked, or somehow some predator gaining access to them.
Taylor Owen: Well, a single person being attacked by a single other person. So, two singular agents, essentially. Which is a reductionism that just doesn't work anymore.
Joan Donovan: Yeah. And then you think about, well, what happens when you scale that interaction? One of the things that was always really interesting to me about To Catch a Predator is how it worked every time. It was one of the most perfect television shows, in the sense that there was never going to be a lack of people to catch. Which is to say, the setup conditions online matter. When people don't think there's oversight, when they don't think there are consequences, they're willing to take these major risks. How I would translate that into the way the policies seem to work is: yeah, unless you're out there getting arrested, like some of the Proud Boys have been arrested, not necessarily for hate crimes but for rioting, then they're put into a different category, dangerous organizations for instance, and then action can be taken. But I just wonder what the service model is. What if you were to build these systems not for growth and openness but, as Twitter said, for healthy conversation, or more pro-social values? What if you were to build in a kind of moderation that didn't incentivize Godwin's law, where everything online eventually turns into Nazism? What if you took those things seriously, and just changed the way we thought about platforms, as a service that wasn't about scale?
Taylor Owen: Yeah. One thing I'd love to get your thoughts on is that it feels like a lot of this research agenda emerged, obviously, after 2016, and it was really focused on media effects and causality: what effect did manipulation and disinformation have on an election? I think we got distracted by that. Do you think the community has moved on from that? And how would you assess the community's focus on mis- and disinformation now?
Joan Donovan: This line is so fraught, because of the researchers who were really pushing this idea that it's more than just social media. Okay, we get it. And it's okay to say "both and." This is the feminist methodologist in me: it's an ecosystem-wide problem, and assigning blame to any one particular industry can in some ways overemphasize it and release other industries from any culpability. I'm thinking here about research that really tried to link Trump's victory solely to the news, and Fox News in particular, and echo chambers. The research that I've tried to do in this space is more about the rationales people were using to make decisions about what issues were important to them and how to vote. But voting isn't the only political touch point people have with the world. People carry out lots of political acts, big and small. And there's definitely a lot more this field can learn if it looks at disinformation campaigns that animate people's sensibilities, that get them to move off the couch and into the world. Which is why we focused a lot on looking at these reopen protests and how they brought together militia groups with anti-vaxxers and the QAnon crowd. I think everybody knows that people form communities online, and it's in those bonds and in those spaces that people make decisions like, "Do I go to this protest or not? Do I spend $400 to take a flight to DC to save Trump from the Democrats and Republicans?" Those kinds of decisions we can look at, and we can understand them. That's a long way around of saying that this field has been dogged by these debates about impact that are potentially really thwarting our ability to see that people do take action in the world, and it's those actions that we should take seriously. 
But speech matters as well, especially when people repeat things over, and over, and over again, as they do online. That too is incredibly important to study and understand, because I don't think people make much of a distinction between "this is my online account, Patriot2467150" and their life.
Taylor Owen: Yeah. One of the interesting things in the last two days has been the international leader response to the Trump Twitter bans.
Steffen Seibert, Spokesman of the German Federal Government
SOURCE: Ruptly YouTube Channel https://youtu.be/rf-_Q7grKu0
“Germany: Merkel finds Trump social media ban 'problematic' despite providers' responsibility”
January 11, 2021
Taylor Owen: And I think Merkel and a bunch of French politicians are pushing back against it. I think that's being interpreted as them taking the side of the platforms and a more free-speech-absolutist perspective. But I actually think it's something very different. It's them saying, "This is the result of these companies being ungoverned, or being left under-governed." They're calling for more governance and more government, which is not how I think it's being interpreted in the U.S. right now, for example. I'm wondering how you think our policy conversation needs to change so that it can get at structure, and at some of these complexities of the way the network functions, of the actors in it, and of the design of the system, all the things we know to be true. How can the policy agenda adapt to that? Because it's not there now, I don't think.
Joan Donovan: It's a good question. I'm inclined not to believe that these companies really want legislation. I'm inclined to believe that there's a move happening where, on the one hand, they'll say, in a very chaotic moment, "Do whatever you've got to do. We're over here. Go legislate." And then, in the background of all that, they're employing teams in DC, and other people, to move the needle away from business models, away from oversight, and to offload the responsibility onto other professions, like the whole fact-checking world. That offloading of responsibility onto other professional sectors is something that I talk about as a true cost of misinformation, which is to say that we could actually figure out what the costs are to journalism, which has to pick up the slack. The other part that I'm inclined to think about, when they say "yes, we want this, but no, not that way," is the openness of the advertising systems, how they're used adversarially, and how we've seen the weaponization of these advertising systems. When they do roll back people's use of them, we get different effects than when their advertising systems are fully open to political operatives. The Markup has some really great research about this, using their Citizen Browser project, where they were able to show that the moment Facebook reopened the pathway for political advertising, there was an attack on Warnock in Georgia built around a cheapfake campaign, basically a very short clip of him saying "God damn America" from 2013, where he was quoting someone else. That is to say, I think the scale needs to be the target of the policy, as well as the business model around openness that produces many of these ill effects. And the fact of the matter is that when these companies say, "We welcome regulation." 
They mean regulation of a certain kind and type, and specifically the kind that doesn't require them to break up their businesses, create limits on the profits they take in, or share profits downstream.
Taylor Owen: Facebook's campaign calling for regulation, which is advertised everywhere now, in newspapers, online, on all the main political newsletters, very clearly has more effort going into it as a PR campaign than as meaningful reform. And you consistently hear, "Well, we want smart regulations, or we want-"
Joan Donovan: Yeah, but that's a move. If you offload responsibility and say, "Well, these politicians wouldn't do it either. We tried, we were just doing our thing and then-"
Taylor Owen: And we've been asking for it.
Joan Donovan: We've been begging. I've been begging for you to clean the dishes and you won't do it. And the dishes are still dirty, though.
Taylor Owen: And you blame me for breaking the plate.
Joan Donovan: Yeah. Right. Which is why I'm just a lonely researcher over here, alone in my silo, unable to see beyond my own self-interest. But I do know one thing, which is that all of the evidence is there that these companies are trying to blame everybody else for the design. It's actually a design problem. The other thing is, you imagine technology to be fast, and flexible, and responsive, but it's really not. When you layer on so much bureaucracy, as well as a CEO model that is more charismatic figurehead than programmatic approach, you end up in this mess where, and I make this joke a lot but it's still funny to me, Zuckerberg and Dorsey are the highest-paid content moderators on the planet, thanks to Trump.
Taylor Owen: And where was Dorsey, on a Polynesian island? Wasn't he making that decision too? Not bad.
Joan Donovan: Yeah, might as well get him out of his sleep chamber, get him to comb that beard, and then get him to read some of these Trump tweets and make some decisions: "Well, does this stay up? Does it get a label? Does it get this? Does it get that?"
Taylor Owen: It's a bit easier than talking about the very incentive structure of a platform with millions of users.
Joan Donovan: Would you rather have four jets and own an island, or not have the entire world angry with you about your inability to rein in your technology, and apply your own policies to people who are using it to overthrow the U.S. government? The whole thing sounds like a Will Smith movie. If this movie were made today, it would be like, "Well, he seemed a little off, but he's charismatic, and he was on The Apprentice. How dangerous could he be?" And, you have to stop me, Taylor, I will keep going.
Taylor Owen: Stop, stop, stop, stop. Stop. One last question for you. One last question. You testified to Congress recently.
Joan Donovan: This emerging economy of misinformation is a threat to national security. Silicon Valley corporations are largely profiting from it, while key political and social institutions are struggling to win back the public's trust.
SOURCE: Energy and Commerce Committee YouTube Channel https://youtu.be/I7jXjpFw_ck
“Americans at Risk: Manipulation and Deception in the Digital Age”
January 8, 2020
Taylor Owen: But you ended by saying, what would it mean to uninvent social media? What did you mean by that? And what are the stakes in that?
Joan Donovan: I'm a big, big fan of Donald MacKenzie and the STS scholarship on uninventing the bomb. This idea that we brought a technology into this world that means we can end this world. That's what the bomb means: the biggest bomb, the mother of all bombs, means everybody dies. And so, I think a lot about the social shaping of tech and this idea that we don't really often think about how to roll back innovation. We don't think about how to uninvent things. We don't have an imagination for a future without something that has been poorly designed and threatens our entire existence. In the case of war technologies, of course, it's a little bit clearer where we go, which is to say that we have auditing systems, we have negotiations, there are peace treaties, there are all kinds of ways. An immense amount of hand-wringing and bureaucracy is applied to nuclear weapons technologies, because this technology exists now and was brought into this world, and it is so dire and so terrible that, again, in the wrong hands it could cause massive, massive pain. And so, when I think about how you uninvent social media, it's thinking with that lens, thinking with those ideas: how do we approach regulation and prioritize those who are going to be harmed or killed by unbridled social media, open and working at scale? And you end up in this position because you don't imagine a world without the technology as it is designed.
Taylor Owen: And as it is today, importantly, right? Because these platforms are constantly evolving, but we always take the current moment as the baseline.
Joan Donovan: Exactly. Exactly. The current moment, right now, is that we have finally seen the impacts of openness and scale on the public. Which is to say, looking back, Charlottesville looks like a precursor, an ominous "if you don't do something now, something worse will come" warning, only because the future has to happen, it has to play out. And for every person who could have made a decision and didn't, or tried to make a decision and was thwarted, that kind of inaction is so important for us to understand sociologically, as the rationale by which we will then get into another situation like this, when these groups figure out how to remount an attack with newer and bigger consequences.
Taylor Owen: On that note of optimism.
Joan Donovan: No, I mean optimism, I got cats, I got names, I get all kinds of things to keep me optimistic, but not politics.
Taylor Owen: That was my conversation with Joan Donovan. Big Tech is presented by the Centre for International Governance Innovation, and produced by Antica Productions. Please consider subscribing on Apple podcasts, Spotify, or wherever you get your podcasts. We release new episodes on Thursdays every other week.