Katherine Maher on Tools for Combating Disinformation

S1E4 / December 19, 2019

Truth and facts are not the same thing. To find truth, we must combine the best available information (what we know to be facts) with our own lens. This is especially challenging on social media: posts may be presented as if they were truthful, but truth comes with some subjectivity. This is where fact checking tools — like the encyclopedic knowledge of Wikipedia — are essential; that is, if they present neutral information, offer unbiased results, follow a transparent process and can be edited.

In this episode of Big Tech, co-hosts David Skok and Taylor Owen speak with Katherine Maher, the Wikimedia Foundation’s chief executive officer and executive director. Maher joined Wikimedia in 2016 just prior to the election of President Trump. She is acutely aware of Wikipedia’s emerging role as a fact checking tool.

Transparent revision history is a cornerstone of Wikipedia. Anyone using the tool can access the edit log and provide their own edits to correct information. Some technologies subvert this process. For example, connected devices such as smart speakers pull Wikipedia’s information to provide answers to users without allowing feedback or sharing a record of edits. Other factors influence Wikipedia’s transparency and neutrality as well. Maher’s team is looking to address biases that exist within Wikipedia’s database; she acknowledges that there is a gender imbalance, both among Wikipedia editors (80 percent of edits are made by men) and in the low percentage of articles about women and minorities. Maher explains: “We know, for example, that an article about a woman is four times more likely to mention her marital status than an article about a man. If you’re doing [AI] training of semantic pairing and you start to associate someone’s marital status with their gender, insofar as there’s a higher correlation of the value of someone being married or divorced with being a woman, that propagates that bias out into all of the products that are then going to go ahead and use that algorithm or that dataset in the future.” Wikipedia will need to work to solve these issues if it wishes to remain a trusted source for facts.

Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.

 

Katherine Maher: Wikipedia is not meant to be truth. What Wikipedia focuses on is what do we know that is accurate, and what do we know that is verifiable, and how do we present that in a way that allows a reader to come to an informed conclusion about what's going on in the world? If there's bias that exists in Wikipedia, it's being propagated out every time Wikipedia is used as a training database, and I don't think that's a future that any of us want.

[MUSIC]

David Skok: You're listening to Big Tech, a podcast about the emerging technologies that are reshaping democracy, the economy and society. I'm David Skok, and this is my cohost, Taylor Owen.

Taylor Owen: Hi, welcome to the show.

David Skok: Taylor, in this episode we are talking to Katherine Maher, the Chief Executive Officer and Executive Director of the Wikimedia Foundation.

Taylor Owen: Katherine is someone I've known for a long time. We used to work together at a previous moment when the internet was going through a real change, when activists around the world were using social media, particularly in the Arab world where Katherine was working very closely around activists using technology to upend autocratic regimes, to democratize their countries, but now she's somewhere very different. She's the head of the Wikimedia Foundation, which I think is also again, playing a real pivotal role in a transition that the internet is going through.

David Skok: Yeah, and that transition is quite complex, with fake news and questions about facts. Extrapolate on that a little bit for us: how is the internet changing, and what is Wikimedia's role in it today?

Taylor Owen: Yeah. I think at the moment, we're at this time when we don't have great signals for reliable information on the internet. It's very difficult as a user of the internet, as someone who searches for content and information or consumes news to know whether what we are getting is reliable, and whether it's factual.

David Skok: So, more and more platforms are looking to Wikipedia to be that place that serves up answers. For example, when you ask your smart speaker a question like, why is the sky blue?

Taylor Owen: Yeah, that's it. And that's why this gets even more complex: it's not just people using the internet, people and users needing reliable information, it's the platforms themselves. So, if you ask a question like that to your smart speaker, how does it know the answer? It can't just go and query all the content on the internet, because we know a lot of it isn't factual, so that's where Wikipedia comes in. It is essentially a database for these platforms, feeding reliable information into them, so when you ask why is the sky blue to your Alexa, Alexa can then draw on the reliable information that sits within Wikipedia. So, it's becoming a very important layer in the way the internet works.

David Skok: Fascinating. Let's get right to it. Katherine Maher is joining us from Wikimedia's office in San Francisco. Hi Katherine.

Katherine Maher: Hey, how's it going?

Taylor Owen: It's going great. We're really glad to have you on the show. Before we talk about the work you're doing now at the Wikimedia Foundation, I want to go back a bit into your professional history, because you've really been working broadly in the space of the internet and democracy for a long time, and you've seen and worked through a real evolution in this conversation over the past decade. I want to start there, because I think it's worth taking a minute to reflect on that time, on really that moment of optimism over a decade ago. What was it about the promise of digital communications that inspired you to work in this space at that time?

Katherine Maher: Yeah. I got involved in this space of technology for development, human rights, democracy, back in 2007 when I was working for UNICEF. The conversation that we were having at the time was really around the leapfrog revolution, which is, if today's conversation is all about the next billion coming online, this was all about how these new technologies were going to overcome some of the structural barriers of existing tech and how it connected people. So not just questions of internet connectivity, but questions of landline connectivity, questions of roads and access. We were looking at the rise of mobile networks entering into places that had previously not had connections, and recognizing as well that the next step forward there was going to be the rise of data-enabled services, and trying to anticipate what that was going to mean for bringing people closer in touch with access to information and access to services. And at UNICEF and other organizations I went on to work for, we were really interested in how do you use these tools to not just think about how to provide services to people, but how do you engage them in a conversation about what their needs are, how do you improve the quality of services they have access to, and how do you give them a voice in thinking about the governance of those resources that would be provided. And so, I moved from, I think, the UNICEF world to working in the democracy and governance world, predicated on the idea that this technology was going to be a great connector and a great enabler. I was thinking, Taylor, as you asked the question about sort of a decade ago, that we're sitting here having this conversation now amidst another shutdown of the internet in Iran, about 10 years after the 2009 internet shutdown in Iran, that really, I think, put on the map in a very meaningful way the whole question of technology, not just as an enabler, but technology as a tool that could also really be turned against a population and used to repress, or to surveil, or to otherwise impede citizen voice and participation. So yeah, we came full circle, we were the techno-optimists, and now we are the techno-pessimists, and I think now we've just accepted that technology is a part of everything we do, in the same way pencils are technology, and radio is technology. It is what it is, and that raises all sorts of other questions about how it's built and what its intentions are.

David Skok: I think back to Tahrir Square and the Arab Spring, as Taylor mentioned, the promise of those few months and then the complete eradication of that promise. And Taylor and I have had conversations with others about the promise of open data, and the common space, and how so much of that has been closed off to people. How have you, personally, come to terms with that, and how do you maintain your optimism about what this space can be?

Katherine Maher: He says I'm an optimist. I mean, I think back to those days too. I do just want to acknowledge that the promise of Tahrir Square may still yet be in the offing. When we look at things from the perspective of a decade, we're being pretty ahistorical as to how social and political change occurs over the course of human history. It's not generally one big bloody revolution and everything changes over the course of a year; really meaningful structural change is intergenerational. I remain in the camp of thinking that maybe Tahrir was part of a series of seismic shifts that we may see continue over the course of my lifetime. I also just want to acknowledge that Tunisia is doing relatively well. But having said all of that, I don't necessarily think of myself as an optimist at this point. I think of myself as inspired by what Wikimedia can model for us as a different way of technology bringing people together, as a different way of thinking about platform governance, as a different way of thinking about fulfilling the promise of open technology on behalf of the public interest. But that stands apart today from the actual landscape that we look at, in which many of the things that we worried about coming to pass have come to pass. I think the optimism can be useful in terms of getting out of bed in the morning and continuing to make efforts in order to see and achieve the type of world that we want to be a part of, but I think optimism can also blind us to the difficulty of the work that we need to get done.

Taylor Owen: Was one of the blind spots almost that there was too much focus on this citizen empowerment versus state repression aspect of the challenge, and less on the tech infrastructure that was being built on top of that? I mean, I feel like a lot of this discourse ended up missing the rise of platforms, for example, because it was so focused on state surveillance and repression versus citizen empowerment and rights.

Katherine Maher: And it assumed the platforms are on the side of the people, right? So, if you think back to a decade or so ago, the assumption was that tech platforms were inherently good, they were not going to do any evil, they were very much aligned with this increase in freedoms and opportunity. And for quite a while they were. I think that was a naive assumption, but it was also an assumption that was predicated on the information we had available to us. You had companies pushing back for the most part against government censorship, against warrantless data collection, you had companies that hadn't yet had to really contend with market forces that were standing up for their users' rights, and standing up for the importance of encryption and all of the good things that we, I think in civil society, were very pleased to see and align with. So, I'm not sure so much that it missed the platforms as a conversation so much as it didn't anticipate the ways in which the platforms would ultimately end up contending with questions of jurisdictional power, authoritarian rule, all of the market forces, all of the things that have really turned the tech companies from the startups they were into the conglomerates that they are, or the tech empires that they are today.

David Skok: You went to Wikimedia, why is this the place to work on these issues that you've cared about for 20 years?

Katherine Maher: I'm just laughing, I'm like, 20 years, it's been that long? Oh, my goodness. The reason is really that Wikimedia is just such a remarkably unique organization at this point in time. The Wikimedia Foundation is the organization that's the home to Wikipedia. We've been around for about 20 years now, and many of the values that were embedded in our creation are still the ones that animate all of our work: a commitment to open source, a commitment to open culture and open licensing in our content, a commitment to community governance. The Wikimedia Foundation works directly with the volunteer editing community to make determinations around how to assign resources, how to build strategy, how to think about what we should be focused on in the world, questions of what our policy platform should be, and what kind of advocacy voice we might want to have. There's a really deeply embedded relationship there that allows us, some people will say, instead of being a platform with a community, to be a community with a platform. I think that that's just an inversion of the way that most other large tech platforms operate and see themselves, and in that sense, there's both an accountability to the public about what it is that we do, and an accountability to the community that this is something that we really build together. I think the other thing that is so unique is that those values that I just spoke of, this openness, independence, freedom, often these days get characterized as internet libertarian values that didn't accommodate or anticipate the very real challenges that we have come to see on the internet, and questions of, how do we handle freedom of expression, how do we address issues of doxxing and harassment? What I'm so heartened to be a part of is that the community has not only made an effort to hold true to those values, but has also continued to integrate into its identity and the work that it does questions of, how do we become more pluralistic, how do we ensure that the community is more diverse, how do we really think about what it means to have spaces in which everyone can participate, in which various voices, marginalized or minority voices, get to be part of the conversation. It's an interesting mix of an organization that comes from a place that I think brought a lot of assumptions that have since, in some ways, proven to not really be sufficient to take an organization or a platform forward, with an organization or community that's really tried to continue to evolve in order to keep pace with the nature of the world and perhaps a more inclusive understanding of its power.

Taylor Owen: I'm struck by that in that, when it was founded, it was critiqued for being unreliable as a source of truth. Right? And now we're in this moment where ... I mean, you've said that you are concerned truth might become fractured, I think that is part of the bigger epistemic challenge we're facing right now, of how we get reliable information in our digital public sphere. Wikipedia has actually become a backbone of reliable information in it now.

Katherine Maher: Yeah.

Taylor Owen: I wonder if you could reflect on that a little bit, that arc of going from this new form of knowledge production as it was, to something now that is providing reliable information into the digital ecosystem?

Katherine Maher: Yeah. This is the sort of ... Your teachers tell you not to trust Wikipedia, and we actually would say, that's probably right. You shouldn't trust anything you read on the internet without being really thoughtful about what it is that you're reading. You also probably should check to see what corrections your local newspaper has run since the last edition. People get things wrong all the time, our knowledge continues to evolve and change, and we want readers of Wikipedia to be critical readers. That's the promise of what Wikipedia offers: we try to be very transparent about where the information comes from, that's why you have citations at the bottom of every page. We have a commitment to making sure that the content on Wikipedia can be edited by anyone, so if there's something inaccurate, any reader has the ability to go in and improve the accuracy, or find an additional source, or flesh out a different perspective. That is the promise of what we're trying to do, this continued evolution towards greater accuracy, greater integrity of information. You can see every aspect of the edit history, and all the code is open source. It's really designed from the ground up to be an open system that people can participate in. What I think is very different about that is, if we start from the position of not being trusted and having to continuously earn the trust of the general public, that puts us in a very different dynamic relative to this effort at self-improvement in terms of a body of content than perhaps you see in other sources, where if you're a media organization, maybe you assume you have trust and you wonder why you don't have it. The other thing that I would say is, we try to be really careful not to use the language of truth. Wikipedia is not meant to be truth, it's simply what we can agree upon as general consensus about an issue at any point in time. Truth is all about the lens that we bring to a topic area, it's not just facts, it's facts plus context. What Wikipedia focuses on is not the ultimate truth of any situation, what Wikipedia focuses on is what do we know that is accurate, and what do we know that is verifiable, and how do we present that in a way that allows a reader to come to an informed conclusion about what is going on in the world or what's important to them.

David Skok: It's funny hearing you talk about these things. I'm a journalist, I've been an editor for 20 years, so I'm familiar with that 20-year mark as well. I'm hearing you talk about these things, accuracy, verification, truth, and these are all the things that I hold near and dear, and that all journalists hold near and dear. How do you think the Wikimedia world differs from what most would consider traditional journalism?

Katherine Maher: I think there's a lot of ways in which we differ, and there's some ways in which we try very hard to adhere to some similar principles. Ways in which we differ is that we're an omnibus reference, right? So, we have information that is not inherently journalistic in nature, in the sense that it's general reference information: what's the chemical composition of water? That's a pretty important part of the offering of Wikimedia. We also are not seeking to consistently provide the newest information about a particular topic, or unearth new interpretations, or new understandings of an issue. What we try to do is provide a longitudinal reference point to say, this is the general context overview of what is interesting about this topic, and here's how it might've been considered at one point in time, or here's the history of it, here's how people think about it today. It's a complementary product to traditional journalism, and it also is something that relies really heavily on certain aspects of traditional journalism, both in the sense of the process of verification. So, all Wikipedia articles need to have references to what we call reliable sources, and that is very often a media outlet of some sort, and the way that Wikipedia editors consider reliable sources is they have to have a process of fact checking, editorial review, peer review. And so, we really act as a layer on top of an overall media ecosystem. And to that end, we think it's incredibly important that that media ecosystem is healthy and has vibrant business models, and has the trust of the public.

Taylor Owen: Yeah, it's interesting. I mean, I wonder if whereas journalism's had a bit of a monopoly on being that layer of accurate information that society depended on, whether it was institutions or the public or the private sector, whatever it might be, and now you have this alternate layer of accurate information or of processes to create accurate information in the digital space. And because of that, it seems to me that the platforms or the tech platforms and technology companies in general are relying on you. Right? More and more, like they're using your information, is that becoming a challenge, and are the companies that are leveraging your information and process contributing to the production of the knowledge that you're doing?

Katherine Maher: Yeah, that's a great question. The way that we really view ourselves: Wikipedia is a tertiary source based on secondary sources, so those secondary sources, as I mentioned, are very often the news media, but it's often journalistic research and other areas of specialization, so that's where I think it complements it. We are really a front door into good reporting, I think, would be one way of thinking about it. Providing that summary of the work of many institutions or organizations who are going out and doing that shoe-leather type of work. The way that that has now been picked up: because our content is all published with an open license, a Creative Commons license, it means that anybody can reuse it as long as they attribute it back to Wikipedia. The tech companies are increasingly interested in being able to solve all of their users' problems in one click, or what in Google search is referred to as zero click, which is all about providing information at the first point of search to minimize the friction that somebody has to go through in order to get their question answered. In those search results and voice assistants, Wikipedia is a very big part of that ecosystem, because it generally is considered to have a high trust index relative to users (people trust it and they rely on it), and it has incredible breadth in terms of the information that it provides, so you can use it to look up questions of current affairs or 16th century Islamic art or political systems, pop stars. And so, it really has become a very big part of this offering of all of these platforms, not just in the immediate content, which is like, hey Siri, can you tell me the birth date of Bibi Netanyahu, but also in training a lot of the underlying aspects of the platforms themselves. So Wikipedia, because it is this really large corpus of open information and open data, and Wikidata as well, has become a pretty integral part of the natural language processing ecosystem, which is, how do you train computers to parse human ways of putting information forward, and how do you then train computers to spit back out information in a way that seems natural to a human, in the way that we read and write? So, we've gone from being this encyclopedia for the internet, to a pretty critical part of the infrastructure of the 21st century when it comes to the way that we hold, maintain and distribute knowledge.

David Skok: Katherine, just to pin you down on this, if I may, when you're doing a fundraising drive, do the tech platforms pay up? Are you happy with the amount of money they're giving you? Should they give you more?

Katherine Maher: No, I was going to get to that. I didn't mean to evade the question. The answer is that it's a mixed bag. From our perspective, the amount of value that the Commons (and that's not just Wikipedia, but the Commons in general) creates in the world is really significant, from a commercial and non-commercial standpoint. I don't think that the Commons is compensated relative to the value that it creates. I think that if we were operating Wikipedia, or any number of other Commons-based organizations were operating, as for-profit entities, they would be negotiating very different contractual relationships. But that's not our goal, we're not here to maximize profit, that's not what we want. What we do want is a Commons that is sustainable over the long term and a Commons that serves the greatest number of people. So for us, sustainability is partially monetary, it's about making sure that we have the resources to continue to invest in growing the Commons, to ensure that our technology can continue to evolve in order to meet changing interface and platform needs, but it's also to think about, what is the Commons that doesn't exist yet? Most of Wikipedia as it exists today, most of the Commons in general, is largely reflective of the Global North, and is also largely reflective of a certain subset of demographics. It's largely white, it's largely male, that's the history of the written word and the written world. What Wikimedia is very committed to, and I promise I'm going to connect this back, is how do we actually grow a Commons that's of value to everyone? And so, when we think about long-term sustainability, it's what are the resources necessary to meet and support the knowledge needs of people who aren't currently on the web, that next billion, or who don't have access, or don't see themselves reflected because their languages aren't reflected, and the like? That requires a degree of investment that is not currently supported by that population, and those folks aren't donors to Wikipedia right now. But that doesn't make it any less important that we actually go ahead and serve them. So, from a sustainability standpoint, it's financial, but it's also ensuring that as platforms go out and use Wikipedia content, they're not getting in between Wikipedia and the user, the person on the other end. What I mean by this is, if Wikipedia's model is all about constant editing, and constant updating, and the introduction of new material and new topics, we need to have a way in which people can do that directly into the platform, directly into the database. The more Wikipedia is intermediated by a voice assistant or a search result, the harder it is to ensure the health and long-term sustainability of the generation of content, and its relevance, timeliness and the like. So, the conversations we have with the platforms are ... yeah, they're monetary, but we try to make sure that they're monetary not just on our behalf, but on behalf of the sustainability of the Commons overall, and then they're also really thinking about, what is the interface of the experience that someone has here, so that they can participate in knowledge, not just consume it, and so that they feel like they have the ability to have an active voice and to monitor the integrity of the knowledge itself?

David Skok: You mentioned some of the challenges that Wikipedia has had with gender representation in its community. What have been the implications of this, and how are you dealing with it? And I'm curious more broadly, in what you think other tech companies can learn from some of the work that you've done in this area.

Katherine Maher: Interestingly enough, Wikipedia as a platform is actually not doing as well as other tech companies in terms of the ratios of participation, gender participation. Our best numbers indicate that about 80% of our contributors are male, and about 20% of our contributors are female. Other platforms are far closer to parity between men and women in terms of participation, and I'm talking platforms like Pinterest, or Facebook, where you have a wide general-purpose audience. But what that means for us, when it comes to the participation of editors or content creators on Wikipedia, is that we often are missing things that we would want to see reflected. So, similar to the numbers around editor participation, just under 20% of all of English Wikipedia's biographies are about women; about 80% of the biographies are about men. That is not necessarily to draw the inference that men only write about men and women only write about women, there's many other factors at play there, including original source material about women of historic note. There's a lot of biases. You go back into the canon, about who got written about and who didn't, and it's still true today. Women are underrepresented in positions of power and in the coverage of their accomplishments and roles. That is what drives some of these gaps, and they're not just about women, that's also about people of colour, it's about marginalized communities, indigenous communities. A lot of the knowledge and representation that we'd want to see on Wikipedia isn't quite there. We, at the Wikimedia Foundation, view this as a problem, as do the folks in our community, because when it comes to, how do you serve all the world's knowledge, really we have to make sure it reflects all the world's people, otherwise it's a hard argument as to why everyone should want to use it if they don't see themselves reflected there. And so, we've worked very closely with the volunteer editor community to support and resource efforts that they make in order to address some of these gender imbalances. There's a number of initiatives around bringing more women into editing Wikipedia, but also about ensuring that there's greater diversity reflected in the content on Wikipedia. And then, we're also looking at, what are the changes that we might need to make in the product experience, and that's the interface of how you edit Wikipedia, to create more on-ramps for people who would like to participate but don't necessarily know how. And that's not just women, that's lots of different folks in general. Representation matters in the sense of, if I go to look for something about an African American woman scientist, and I don't see that individual reflected, it means that I don't necessarily have a model for how an African American woman can become a top researcher at NASA. And so, that representation, I think, is fundamentally critical. I think that's widely accepted. What I think also really matters is that, given how prominent a role Wikipedia plays in the public discourse, if Wikipedia doesn't have it, it also seems as though it is making a value judgment about what is important in the world. That's not in line with the values of our organization, but I think the last and perhaps most important thing for us is that, given what I was speaking about earlier, the role of Wikipedia in informing computational science, if there's bias that exists in Wikipedia, it's being propagated out every time Wikipedia is used as a training database.
So we know, for example, that an article about a woman is four times more likely to mention her marital status than an article about a man. If you're doing training of semantic pairing, and you start to associate someone's marital status with their gender, insofar as there's a higher correlation of the value of someone being married or divorced with being a woman, that propagates that bias out into all of the products that are then going to go ahead and use that algorithm or that dataset in the future. But unlike Wikipedia, which you can go in and fix and edit and rebalance and work towards neutrality and continuous improvement, those become blind spots in our systems, and we end up living in a world that is infused with that bias. And I don't think that's a future that any of us want.

Taylor Owen: Yeah. It's the fact that this process Wikipedia enables is perceived as being neutral and as creating neutral knowledge that could lead, arguably, to the biases of the contributors being further entrenched.

Katherine Maher: Yeah, that's right. And so, one of the things that our research team is very focused on internally within our organization is building what we think of as ethical AI or ethical machine learning that can be a model for how, not only do you build tools and services that are open, but you also think about, how do you build in auditing mechanisms and accountability mechanisms, how do you publish the datasets, how do you make sure that there's awareness of how these models are being built and trained, and then how do we make sure that when we are able to identify biases or problems, we've got mechanisms for actually actively retraining them? A big part of the conversation that we're having is really that we want researchers to be aware of the need to correct for bias if they're using the Wikipedia training datasets, but also we're trying to build systems ourselves that model the kind of behaviour that we think the broader tech community should be engaged in adopting.

Taylor Owen: Let's step back a little bit away from Wikipedia. One of the debates that I've been watching closely over the last little while and participating in around talk around content moderation and the governance of platforms has been what seems to be this real divide emerging in the digital rights community between those that want to prioritize free speech on these systems, and those that want to prioritize protection from harmful speech. I'm very curious what you think of it, because you know that community so well and are fundamentally a part of it, and I'm sure you know people and the organizations on both sides of that conversation right now. I wonder what you think of that.

Katherine Maher: Yeah. So, you're asking this question of, should platforms be required to moderate hateful speech, for example? The moment you walk into this conversation, it becomes so much more multifaceted than I think the soundbites that we see from politicians. So, the primary question tends to be around this really simple binary, like, should Twitter or Facebook allow paid speech on their platforms? Well, Twitter and Facebook are not government entities in the United States, they are private platforms. Do we want private platforms in a role of adjudicating speech decisions? They are not accountable to the public, there's no mechanism by which to appeal a precedent that's been set, and particularly for a general-purpose platform that is in fact meant to be a platform for speech, that can be really problematic. And yet at the same time, these companies have the ability and the right to set the terms of reference or the terms of use. So, for a product like Wikipedia, we have a series of policies around what kind of content is appropriate for users to post on Wikipedia, and what's out of scope, because we're a purpose-driven platform. Our purpose is to build an encyclopedia, it's not meant to be a platform for political organizing, and that's our constitutional right as an organization founded in the United States and subject to US jurisdiction. I would hate to see platforms like ourselves mandated to allow all speech, which in and of itself would be, I think, a really significant problem relative to our mission to provide accurate information. You don't want an article about how the earth is flat on Wikipedia, and at the same time, we also don't want to be in a place where we're adjudicating what should be a matter for the courts and a matter for public institutions: what determines harmful speech, and what recourse individuals have if they're blocked or banned. So from my perspective in general, the idea of creating obligations on platforms to perform this private policing role is fundamentally problematic when it comes to the operation of a society that is based in the idea of public institutions and public recourse. Now, having said that, there's this whole other question about provisions around how we evaluate hate speech. The ways in which the First Amendment in the United States handles this are somewhat different than the way that freedom of expression standards handle this question outside of the United States, in Europe, in Canada, elsewhere: questions around, is hate speech merely hateful or is this also about incitement to violence, all sorts of things that go into being able to make these determinations, which is yet again another reason why we want clear, transparent processes that are public in nature, not privatized in nature. But at the same time, I think we recognize the need for platforms: if platforms are going to have aspects of their terms of use or terms of service that say harassment is not acceptable, then they need to enforce those terms of use and terms of service. And they need to be able to do so in a way that is clear and consistent across those platforms, and also offers users ways to appeal some of these content moderation decisions, or at least get answers as to how those decisions are being made.
So, I think that it is not as simple a question as which to prioritize; I do think that it needs platforms to meet in the middle around what those terms of use are, and how they're actually applied.

David Skok: Katherine, you took over in June 2016, and obviously a few months later, a new president was elected, and subsequently a lot of questions about truth and facts came out that really put a lot of the issues that you've been wrestling with at the centre of the public debate, where they continue to be to this day. It's been three years, and I'm curious if you have any reflections about your time so far, and whether you expected this to be such a dominant issue in the discourse. And moving forward, with another election coming up next year, what is your sense of what your role and Wikimedia's role will be in the next year and a half?

Katherine Maher: Yeah. When I took this job, I thought it would be interesting. I didn't realize how relevant it would be. These questions of the techlash, these questions of misinformation, and these questions of what is truth; it's certainly been a really interesting couple of years for that conversation. What I'll say is, the election coming up next year is an election in the United States, but that's not the only place in which Wikimedia operates. Our Spanish community just went through this with elections in Spain, the German community has just gone through this with elections in Germany, the same thing in India not long back. And so, what we've really seen is that one of the things that Wikipedia has had to do over the course of the past 18 years of its existence is develop mechanisms by which we are managing for efforts to manipulate the platform, managing for efforts to shift the discourse around the body politic, managing for ways politicians, individual campaigns, parties, et cetera, try to shift narratives to appear more flattering, or their opponents' narratives to appear less flattering. I think that we try to avoid being myopic on what the US is doing in terms of the elections. We just had a major Canadian election, and that affects English and French Wikipedia too. What we are hearing, as we look into the next year, from people who use the Wikimedia projects, whether that's private individuals, or donors, or tech companies, is an increased appreciation for the role that we play in a world in which there's quite a lot of distrust of other sources of information, and I think an appeal to us to ensure that Wikipedia is and remains this platform that is neutral and that continues to sustain the trust of the public. I think the other part of what we're looking at is, how do we ensure that we're providing our communities with the support that they need to continue to resist the efforts of bad actors to try to manipulate information? We recognize that, as much as anything, with the increased trust that Wikipedia has, the increased prominence, it will also serve as a vector or as a target for efforts to manipulate the public perception, and so it's up to us and our communities to really ensure that they're well positioned to be able to identify these efforts and push back against these efforts. I think we have quite a number of systems in place that we feel relatively good about, but we never make the assumption that we're ultimately safe. Like every other platform, we are constantly alert to the potential real harms, and take very seriously our role in the public trust.

David Skok: It's really become apparent to many of us just how global these issues are, and the key role that the Wikimedia Foundation plays on a global stage in conveying a sense of stability through these turbulent times. Katherine, we're grateful that you've taken the time out of your busy day to talk to us, and we hope to speak with you again soon.

Katherine Maher: Thank you so much for the opportunity.

David Skok: That was Katherine Maher, the Chief Executive Officer and Executive Director of the Wikimedia Foundation. Katherine joined us from San Francisco.

David Skok: To our listeners, let us know what you thought about this episode. Use the hashtag #BigTechPodcast on Twitter.

Taylor Owen: I'm Taylor Owen, CIGI senior fellow and professor at the Max Bell School of Public Policy at McGill.

David Skok: And I'm David Skok, editor-in-chief of The Logic. Thanks for listening.

[MUSIC]

Narrator: The Big Tech Podcast is a partnership between the Centre for International Governance Innovation, CIGI, and The Logic. CIGI is a Canadian nonpartisan think tank focused on international governance, economy and law. The Logic is an award-winning digital publication reporting on the innovation economy. Big Tech is produced and edited by Trevor Hunsberger, and Kate Rowswell is our story producer. Visit www.bigtechpodcast.com for more information about the show.

For media inquiries, usage rights or other questions please contact CIGI.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
