This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.
Maria Ressa: These algorithms are created by human beings with goals and biases, and they're built into it. And the world that they've created has killed democracy in countries like mine, has made journalists vulnerable in ways that I've never been. I'm coming up on my 35th year as a journalist, and I've been a war zone correspondent, and I have never been in as much danger as I am today.
David Skok: Hello, I'm David Skok.
Taylor Owen: And I'm Taylor Owen.
David Skok: Welcome to Big Tech.
David Skok: Two years ago, Time magazine named a group of journalists as its Person of the Year. That group included Jamal Khashoggi, two Reuters reporters imprisoned in Myanmar, and the staff of the Capital Gazette newspaper in Maryland, where five people had been killed in the summer of 2018. Time dubbed them "The Guardians" in the war on truth. Of those guardians, six were murdered and only one still has their freedom. That freedom is now in jeopardy.
Ted Te: So the court this morning handed down its decision in People versus Santos, Ressa and Rappler. The judgment is a judgment of conviction for both Maria and Rey.
SOURCE: Bloomberg QuickTake News YouTube Channel
“Philippine Journalist Maria Ressa of Rappler News Site Convicted of Libel”
June 14, 2020
David Skok: On Monday morning, Maria Ressa was convicted of cyber-libel and sentenced to a maximum of six years in prison. It is widely viewed as an attempt to silence her, and is one of a number of charges that could see her spend the rest of her life in prison.
Taylor Owen: Maria spent more than 20 years with CNN, and then founded the Filipino news site Rappler in 2011. After Rodrigo Duterte took office in 2016, Rappler published a report called Propaganda War, where Maria and her colleagues alleged that the president had weaponized social media in order to sway public opinion about his government's war on drugs, a war that has led to tens of thousands of extra-judicial killings. That made Maria the target of Duterte's administration.
Maria Ressa: I appeal to you, the journalists in this room, the Filipinos who are listening, to protect your rights. We are meant to be a cautionary tale. We are meant to make you afraid.
SOURCE: Bloomberg QuickTake News YouTube Channel
“Philippine Journalist Maria Ressa of Rappler News Site Convicted of Libel”
June 14, 2020
Taylor Owen: We sat down with Maria 72 hours before the verdict. We talked about what this trial means for the Philippines, and how the decisions that Mark Zuckerberg is making in Silicon Valley are quite literally a matter of life and death in the global South.
David Skok: Maria, obviously I've seen you speak a lot, and you have done so much, and been so outspoken, and have shown such strength. It's Friday for you right now. We're 72 hours away from this. If I may ask, how are you feeling?
Maria Ressa: I've gone through a whole range of emotions. Two weeks ago, when I got the notice that the promulgation date for the verdict was set, that was the first day that the lockdown in the Philippines, which was among the longest in the world and among the most militaristic, was eased just a little bit. And the first thing that I get is a court notice that we have to physically appear in court. And I went from, "Well, it's good. We can get it out of the way," to, "Oh, no, I could go to jail. Oh, gosh." How am I feeling? I think resolute. I chose this road a long time ago, and I know that there are consequences for every choice we make. I know this is the right choice to make, to stand up and challenge any kind of attempt to limit the rights you have under the constitution. So, a few weeks ago, I was writing a commencement speech for... I gave the speech at Princeton University, and I told the graduates to think about three things because they're standing on the rubble of the world that our generation, I'm a little older than you, that my generation created. I gave three things, but the second lesson that I gave was to embrace your fear. That's how I've lived my life. I learned that when I was a kid, that you are your own worst enemy. If you embrace your fear, whatever that is, you can rob it of the sting it has to prevent you from seeing clearly, from moving forward. And that's exactly what I'm doing. So in my head, I have imagined the worst-case scenario. And I have tried to imagine, step by step, what I would do. And then, when you live with this constantly... It's been four years of attacks, right? When you live with this constantly, and I talk about it with my co-founders in Rappler, there are four of us, we're four women, all about the same age, and it's kind of like gallows humor now. If the worst-case scenario happens, if I do have to go to jail, then I said, "Please..." 
Somebody has already volunteered to bring me sheets, someone will bring the fan, and someone will bring me food. So yeah, it's gallows humor, isn't it?
Taylor Owen: This trial isn't just a trial about a particular incident. It's happening in a political and a media and a technology context that you've shed light on over the past two years. And it seems like that started with Propaganda War, the report you wrote about Duterte and the way he had used the media environment, and the treatment of the press, and the way it leveraged technology to shift public opinion about him. You called out his weaponization of social media, very powerfully. Can you explain what he did, and how he used the social media tools that he had access to to convert public opinion?
Maria Ressa: We've seen this happen in the Philippines. That was in 2016 when I wrote that piece, Propaganda War: Weaponizing The Internet. It's part of a three-part series. The second was, how Facebook algorithms... its impact on democracy, right? I was just horrified at what I was seeing. It was like, "Oh my God, this is a possibility that could really kill democracy." And it's heartbreaking to see it over the last few years unchecked, largely by Facebook, and continuing. It is part of the dictator's playbook. What happened? At the beginning, when Facebook and social media enabled and gave more power to voices that you wouldn't have heard, well, that kind of power after Egypt has now been harnessed by governments. And when governments have infinite resources, they can control that landscape. Let me talk to you about what's happened in the Philippines. In the Philippines, Facebook is essentially our internet. A hundred percent of Filipinos on the internet, and you're talking about 71 million people, are on Facebook. Right? What we saw was Mayor Duterte was the first politician to really use social media well, and to win the presidency with it. What happened is that they took the campaign machinery they built, they had a distribution network, because it's a bottom-up and a top-down approach, and they divided their supporters into four geographic groups. And every day the campaign team would tell them, "Here's the message of the day," and then, bottom up, the distribution network would take that message, "The sky is orange," and you would then personalize that and distribute it to your network. That idea, after he took office, when the drug war began, and it is a brutal drug war, with human rights groups estimating that tens of thousands have died in it, they then took it and expanded it. That's when it became weaponized. And it is exponentially taking a critic... If you're targeted, you will be pounced on. I was counting at least 90 hate messages per hour. 
Because at the beginning I was responding, and then, all of a sudden, I realized they're not interested in talking about it. This is meant to pound me to silence. So that's the first, pound the perceived critic to silence, intimidate them to silence so that that narrative disappears. And then the second impact is to astroturf their messaging. "President Duterte is the best," "he is the best president this country has ever had," "The Pope said he's the best president." So these lies laced with anger and hate spread fastest on social media, and it creates this bandwagon effect. You throw in the messaging. You target the people, and real people begin to believe it. Because on Facebook, a lie told a million times becomes a fact. And then, what you wind up doing is, if you were pro-Duterte, all the pro-Duterte people, as it grows, move further here. If you're anti-Duterte, as it grows, you move further here, right? We've seen this happen in every part of the world. Filter bubbles, and then you tear apart society, and everyone debates the facts. This group, because they don't see any opinion that goes against theirs, and now we're debating what the facts are. And that is the biggest problem I see for the integrity of anything. If you debate the facts, you can't have integrity of markets, you can't have integrity of elections. This is democracy's death by a thousand cuts. So that continues. We have complained to Facebook. Rappler is one of Facebook's fact-checking partners, one of two Filipino fact-checking partners. Oh, here's the last part. Before the campaigns began, the head of Cambridge Analytica, Alexander Nix, came to the Philippines, was photographed with some of the campaign people of then-Mayor Duterte, and they claimed that he just came to visit them. He was here to talk about Cambridge Analytica before Cambridge Analytica was infamous. 
But the whistleblower, Christopher Wylie, actually told me that the Philippines, like other emerging markets where the laws are lax or non-existent, that the Philippines was, he called us, the Petri dish. That they tested these tactics of mass manipulation here, and if it worked they "ported" it, and that's in quotes, they ported it over to the West, to the US, to Europe, Canada, there you are. But it kind of makes sense, because in the Cambridge Analytica scandal, the country with the largest number of compromised Facebook accounts is the United States. The country with the second-largest number of compromised accounts is the Philippines.
Taylor Owen: How deliberate do you think this was? From the beginning, was destabilizing truth the goal in sort of a typical autocratic fashion, or was it just a side effect of seeking to control the discourse on these platforms?
Maria Ressa: You know, I think the masters at this were Russian disinformation networks. I think they hit a jackpot. They didn't realize how good it would be. And part of the reason that jackpot happened was because of the inaction of American social media platforms who say they espouse free speech, right? And the values of democracy. In the Philippines, what I would say is that because Duterte's campaign didn't have the war chest, the money that his other... He was one of five presidential candidates. He won 16 million votes. He wasn't elected overwhelmingly, but social media was a huge factor in his win. I think his campaign team realized its value. Then after he won, they actually made an announcement. They were so happy about the impact of social media, and they said they would maintain it, that they would continue to grow it. And at that point in time, I didn't know what that meant, because what do you do with this? And then, when I realized that it was the manipulation of people, that this is propaganda to the nth degree. And here's what the reality is for social media platforms, right. Regardless of how they paint themselves, they are behavioural modification systems; that is what they have become. They are optimized to serve our data, and we put our data inside these platforms. But then machine learning and artificial intelligence gets it to a point where they know us better than we know ourselves, better than our most intimate relationships. And, for a price, whether it's a company or a government, anyone who pays to give you a message at your most vulnerable moment, that is when it will serve it. It changes behaviour. It does. And one of the things we do know is that this kind of exponential hate online translates to real-world hate. Translates: online violence turns into real-world violence, right? I mean, let's talk about the United States. 
We know the Mueller report said this, right, that Russian disinformation networks targeted the 2016 elections, and their most targeted messaging was race relations. They were on both sides of that fracture in American society and they pounded it open. That kind of pounding has now played out in the United States in a way that, I think, would not have been possible without these technologies, and without a leader. And that's the other part that's happened all around the world. Facebook has enabled the rise of these populist, authoritarian-style leaders who are then able to gain more control as society gets splintered further apart, and then to use the formal powers given to them by government.
Taylor Owen: You hit on something that I really struggle with, which is to what degree this is a system that's being abused, a flawed system that's being abused for autocratic or illiberal reasons. Or if that illiberal outcome is actually endemic to the system itself, right, that the algorithmic system, the modeling based on engagement, the incentives that are embedded in the system take us to that place almost necessarily. Do you have any thoughts on that? Is the system itself designed to take us to this place?
Maria Ressa: Yes, I think so. And I'll tell you the reasons why, right? First of all, the optimization. When you ask a tech person, what do you optimize for? They optimize for growth. Someone like Tristan Harris will tell you that this is by design meant to polarize society, because... Let's talk about one very simple decision made by a tech person, right? Sorry, I'll be a little bit geeky here. If you think about how they chose systems of growth, how did they choose to grow? Well, they chose to recommend friends of friends, just that one choice, and they closed the triangle. But by doing that, that one choice, they actually built the polarity and the filter bubbles into the system, because with friends of friends in the real world, you know, we are exposed to a lot more views than we are now exposed to on Facebook because of that one choice. All right, so that's number one. And then the second one is, it manipulates the worst of human nature. It is built for that. Anger and hate spread fastest. And that's part of the reason that it's an existential problem for journalists, because you know that we spend our entire careers learning to tell stories so that boring facts matter to people. But now we're fighting narratives that are lies, that are meant to spread, that are designed to spread in this system, and it's unchecked. So the other by-product of this is that, in the context of our world, as the tech platforms gained more power... In the old world, news organizations had both content and distribution powers. We created based on facts. We were accountable for the facts. If we were wrong, we could get sued, and we distributed this. But then the tech platforms took away the distribution powers, essentially the gatekeeping role, which turns out is far more powerful than we thought, because it protected the public sphere. And when the tech platforms took it, they abdicated responsibility for facts, for the sake of growth. 
That then made facts questionable, because the filter bubbles pulled further apart, and it's like there was no adult in charge; everyone was given a gun and told, "Go ahead, shoot." Then, without the gatekeepers, the integrity of anything... I mean, the reason I became a journalist is I know that information is power. David, you know that. And now, information is weaponized at the very basic foundation, because the distribution of news is out of our hands. And I think that's the question that really is the question of... I'm sorry, that's my niece. I think that's the question that the social media platforms face, because they claim that they don't want to be the arbiter of truth, but the reality is they already are. These algorithms are created by human beings with goals and biases, and they're built into it. And so to say that they're not is disingenuous. And the world that they've created has killed democracy in countries like mine, has made journalists vulnerable in ways that I've never been. I'm coming up on my 35th year as a journalist, and I've been a war zone correspondent, and I have never been in as much danger as I am today.
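The "friends of friends" choice Maria describes is what network scientists call triadic closure: recommending exactly the people who would close a triangle in the social graph. A minimal sketch of the idea follows; the `suggest_friends` function and the toy network are invented for illustration and are not any platform's actual recommender:

```python
from collections import Counter

def suggest_friends(graph, user):
    """Rank friends-of-friends of `user` by mutual-friend count.

    `graph` maps each user to the set of their friends. Every
    suggestion "closes a triangle": the candidate already shares
    at least one friend with `user`.
    """
    friends = graph.get(user, set())
    mutuals = Counter(
        fof
        for friend in friends
        for fof in graph.get(friend, set())
        if fof != user and fof not in friends
    )
    return [name for name, _ in mutuals.most_common()]

# Toy network: a tight cluster (ana/ben/cara) linked to dan and eli
# only through ana.
graph = {
    "ana":  {"ben", "cara", "dan"},
    "ben":  {"ana", "cara"},
    "cara": {"ana", "ben"},
    "dan":  {"ana", "eli"},
    "eli":  {"dan"},
}

print(suggest_friends(graph, "ben"))  # ['dan']
print(suggest_friends(graph, "eli"))  # ['ana']
```

Because candidates are ranked by how many friends they already share with you, suggestions always come from inside your existing neighbourhood, which is the mechanism behind the densifying clusters and filter bubbles she describes.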
David Skok: For Zuckerberg... And I wonder, I'm actually very curious, from your lens, how you feel about watching what's happening in the United States right now. Yes, Trump and this march towards authoritarianism, but more so how Zuckerberg is now seeing it on his shores. You're seeing all the platforms in Silicon Valley seeing it on their shores, and how they're responding very differently. Last week was arguably the most important week of Mark Zuckerberg's professional life, and he stood his ground on the free speech principles, whereas Jack Dorsey at Twitter said, "We're going to start flagging things," and obviously they're now dealing with the repercussions of that decision from a policy and a business perspective. What has it been like watching that unfold from where you are, where you've, kind of, already seen the future?
Maria Ressa: Yeah. I am shocked because I've met Mark Zuckerberg. I think he's a bright young man, young because I'm old. I think he's on the wrong side of history. And it is infuriating that he keeps saying this is a free speech issue, because it's not. And I'm going to quote a comedian who knew this. Sacha Baron Cohen said, "Freedom of speech is not freedom of reach." And it goes back to that. Who cares if a journalist says what the facts are, tells a fantastic story, if it doesn't get distributed as much as the lie? And this is documented in a lot of research: a lie spreads, and the fact-check that comes after reaches maybe a very, very small fraction of the people who now believe the lie, right? When Mark Zuckerberg was in front of Congress in 2018, and he said that it would take at least five years before Facebook could deal with it, I threw a shoe, because in countries like mine, in the global South, where people are dying because of what Facebook is failing to do, it can't be years, it has to be days. Again, I speak as a frenemy of Facebook, because we're still their fact-checking partner, and I think that goes to both of our interests. But at the same time, I am appalled that genocide has happened in Myanmar, and there has been no accountability. I am appalled that the legal attacks against me were astroturfed on Facebook. You know, what the government tried to do is to replace "journalist" with "criminal," right? Just even that kind of propaganda that Facebook has enabled. It's physically, morally, intellectually, spiritually damaging to me. Facebook has taken away the rights, the kinds of protections, a journalist is guaranteed under the Philippine Constitution. And they've stripped us of that. So, these are things, this has a daily, minute-per-minute impact on us. You can see I'm getting emotional here. So, you know, in 2016, Rappler fought impunity on two fronts: the drug war of Rodrigo Duterte and Facebook. Impunity: this must stop, or our democracies will die. 
Actually, almost everything will die. How can you have integrity of markets if you don’t have facts? How can you have integrity of elections if you don’t have facts?
David Skok: Yeah. We're starting to see employees speak out at Facebook, which is not a tech company traditionally known for that kind of thing. And I start to wonder about the fundamental business model, and what their end game is. In your dealings with them, do you know what their end game is, apart from just making money?
Maria Ressa: No, and I wonder about that. I actually met Mark Zuckerberg in 2017. He's really, really bright, his agility of thought. And I, at that point... So it began in the Philippines in 2016. I had hoped that they would tackle this problem the same way they tackled the move to mobile. I don't know if you remember that. Facebook was a little bit late moving to mobile. Then, when Mark decided we're going to go mobile, they did it in two years and they became market leaders. Well, I was hoping that was the way they would deal with this problem. But the kinds of denials, it's really shocking to me, because it shows either greed or a kind of tunnel vision. So here's the problem that I see: in the end, this then calls for legislation. Actually, one of the guys whose work I really love is Jim Balsillie, a Canadian. He called data... He said data is not the new oil, it's plutonium. So this goes back to this, right? I think that what we're seeing Facebook do is prioritize political relations above user safety. I can appreciate the difficulties of what they're facing in the United States. And they, unlike journalists, are easily spooked. They did have journalists on for a little while, and then, when they were accused of being against the conservatives, they took them out, right? This doesn't help when you're a gatekeeper. You have to have a set of standards and principles. What is the mission? And I think that's one of the things I've been pushing: so what do you guys stand for? I asked the whistleblower, Chris Wylie, this, and he said, "Oh, you mean, tech people? Optimization, growth, that's what we stand for." I said, "Okay. Well, that's the problem." I do have a solution, I think. We've been thinking about this, and I've been discussing it with a lot of really smart people. Part of this came out of something that Sinan Aral... He's with MIT. He's got a book coming out. It's called The Hype Machine. We were saying, why not make every user... 
So who owns our data? In fact, there's a new social media platform, MeWe, where you own your data. You own your data, right? So who owns the data that we put into these social media platforms? Right now, Facebook is taking it. It's aggregating it, and then putting it together with everyone else's gives it that network effect, right? But what if the legislation is about giving the user data portability and social network portability, so that we can then fight for our rights, so that there is an incentive for these social media platforms to protect the users, right? Because if we can do that, if we have data portability and we can take our social network with us, then, if we're not being protected, we can pull out and transfer it to another social media platform. That would also get rid of a lot of these issues of monopoly and anti... I don't think an antitrust law would work, because breaking it down doesn't actually address the problem that's embedded in the design of these platforms. Imagine if we can take our data. If we don't feel protected, we can take it out, and then we can tell our friends to take it out. I bet you that the platforms would protect us a heck of a lot more than they do today, because every incentive they have is to continue going down this dystopian path, right? When Facebook spent more money to try to protect data privacy, the market pummeled them, and they will point that out. So I think they need a nudge. Look, I embraced Facebook. That was the foundation of the growth of Rappler. So I know its best, and I also know its worst. And I know that what I hear from someone like Mark Zuckerberg... I mean, it will destroy our democracies. And it shouldn't happen that way.
Taylor Owen: I don't know if you saw, last week or a few days ago, Rana Foroohar had a great column, I thought, saying that the objective of Facebook was power; that when you get to a company of that scale, and you need consistent growth, and you need a powerful monopoly position to achieve that growth, you have to align yourself with political power. That's one thing to be doing in Western Europe, or even in the United States now. But a lot of that debate ends up being very US-centric or Western-centric. And aligning oneself with power in much of the world is a very different thing, and arguably runs directly counter to some of the principles these companies supposedly were founded on.
Maria Ressa: I think that's a great point. We have the data in the Philippines, right? Part of my certainty in this is because we have the entire information ecosystem, and we can see the networks. As fact-checking partners, when we fact-check and we prove it's a lie, we can see the social networks that continue to spread the lie, that are spreading it and are recidivist; that's what they do. So Facebook sees that as well. So you ask, why does Facebook not make a statement? Because the majority of the accounts behind the lies that are fact-checked support political power, support Trump. Russian disinformation networks, they'll take those down, but they don't say very much. But here's the thing for us: everything flows downhill. And for us in the global South, what Facebook does becomes a matter of life and death. Let's talk about the promise of it. It is the first time globally that we have all been connected in this way, on one platform. But that also means that the lie that originates in Manila spreads instantaneously to New York or to Europe, right? And that's why you can see things like, for example, the traditional boogeyman of democracy is George Soros. No one in the Philippines had really heard of him, but now he is the enemy of the pro-Duterte group. How is that possible?
Taylor Owen: Him and Bill Gates, I'm sure.
Maria Ressa: Bill Gates as well, right, the usual suspects. The narratives that were set up as conspiracy theories for the West have now reached everywhere, right? So if they don't deal with this... I guess where I was going with this is that I felt that our problems in the Philippines wouldn't be solved until Silicon Valley solves this problem. And I guess I hope you guys... And I think Canada has done a much better job. I hope that we can get Silicon Valley to make the decisions that will protect the integrity of facts. Oh my gosh, it's a long-winded way of getting to it. At one point, I had decided, on the Pareto principle, that if I expend 20% of my effort to just get some change in that ecosystem, at that level, at the Silicon Valley level, then that would have an 80% repercussion in my country. It all begins with what people believe, with their view of the world. And when racism is pounded into you, when hate is exponentially distributed, you've got to know that that will have an impact, right? So I don't understand the scorched-earth policy of Facebook in this, because ultimately, in the medium and long term, they will destroy democracy in their own country and they won't be able to operate.
David Skok: Maria, I want to bring it back to you, just because this is such a big moment in your life. When thinking about Duterte and Trump and the news of the last several weeks, which has been unbelievably distressing in the North American context, you've been living with that for quite a while in the Philippines. A friend of mine told me that the difference between Duterte and Trump is that Duterte actually cares what the global community thinks of him and Trump doesn't. And I'm curious how you feel about that statement, and if that in any way gives you comfort given the campaign that you've been on over the last few years.
Maria Ressa: Oh, what a tough question you asked. The difference between this is that, in the United States, the institutions do still work. Our institutions crumbled in the first six months of the Duterte presidency, and that's what we Filipinos have had to face. And the erosion of our rights is now going to be codified, because there's an Anti-Terror Bill that passed our House of Representatives in five days during the pandemic, just last week, and now it's waiting for President Duterte's signature. This Anti-Terror Law, once it becomes law, will allow the government to arrest without a warrant. It could be any critic of the government. If you're determined to be a terrorist, you can be arrested without a warrant, and you can be detained for up to 24 days. Search warrants are no longer necessary. A lot of the kinds of safeguards of the Constitution have been ripped out. I don't see that happening in the United States, but maybe I'm wrong, right? Our managing editor wrote a piece that said, "Where the Philippines goes, America follows." And that's kind of what's happened in the dystopian future. Because I remember, in December of 2016, being at Google News Lab and telling them about these exponential threats and how this will come for every journalist, and it did. I'm going around the Trump and Duterte question. I think they have both been compared to each other. But the difference is that Duterte, while he is now the most powerful ruler, President of the Philippines, doesn't have the global power of Trump, and this is what's dangerous for the world. While the institutions may hold up, the continuing influence operations, the information operations that have splintered American society, require Americans to really step up to the plate, and, I think, require action from Facebook. You look at Twitter and Facebook. Twitter, smaller, is taking action and is taking heat from Trump, right? Where is Facebook? 
Look, Mark Zuckerberg's statements about not fact-checking political ads are dangerous. It's truly dangerous, because essentially they could pay to spread a lie, which, I guess, is what's been happening globally anyway. That leaves all of us vulnerable. And again, it's like the user is the target, and I don't know how we can deal with this.
David Skok: Know that on Monday, the world will be watching. And those of us who have chosen to dedicate our lives to the same pursuit of truth and facts will be watching intently.
Maria Ressa: Thank you.
David Skok: And I guess if you... Is there anything you want to say that we haven't asked you, any words of wisdom from your vantage point?
Maria Ressa: I don't know if it's words of wisdom. The reason I was, kind of, trying to grapple with that question about Trump and Duterte, you have to understand, is that I'm a citizen of the Philippines, under Duterte, and I'm a citizen of the United States; I'm a dual citizen, right? Both these men carry tremendous power over my life right now. President Duterte, in attacking Rappler, triggered a slew... When he attacked us in his State of the Nation Address in July 2017, a week later, I got my first subpoena, right? So this is a man who has tremendous power over my life right now, and I feel like I have no power except to shine the light. So thank you for telling our story. But the other place you would normally look in the past for help in press freedom violations or human rights violations is the United States, and now I don't know what's happening. How can the United States stand up for the values it used to have, and says it still has? Because I think they are there. I even feel myself monitoring what I say about President Trump, and I shouldn't be doing that. I think that's the problem, right? Lesson learned, here's my biggest lesson learned: when there's a Damocles sword hanging over your head, you ignore it, because otherwise it accomplishes its task.
David Skok: Wow. Thank you for taking the time. I don't know what to say, honestly, other than we will be with you in support next week. And thank you for inspiring a generation of journalists.
Maria Ressa: Thank you for having me.
Taylor Owen: On Monday, June 15th, Maria was found guilty of cyber-libel. She is out on bail and plans to appeal the conviction.
Taylor Owen: Big Tech is presented by the Centre for International Governance Innovation and The Logic, and produced by Antica Productions.
David Skok: Make sure you subscribe to Big Tech on Apple Podcasts, Spotify or wherever you get your podcasts. We release new episodes on Thursdays every other week.