Kate Klonick on Facebook’s Oversight Board

S1E2 / November 21, 2019

Facebook is establishing a 40-person oversight board to rule on whether content should remain on its platform. The board aims to represent all regions of the world; its rulings will be released in multiple languages, and decisions about content are to be made expeditiously. Only one researcher, Kate Klonick, was invited in to observe the process of establishing the framework for this oversight board.

In this episode of Big Tech, co-hosts David Skok and Taylor Owen speak with Kate Klonick, an assistant professor of law at St. John's University Law School and an affiliate fellow at Yale Law School, about what she witnessed in this process. Klonick was embedded at Menlo Park without a non-disclosure agreement, was given full access to meetings and was able to record all the conversations and workshops. Throughout the process, she maintained her academic independence, not accepting anything from Facebook, not even a free hotel room.

Klonick discusses the struggles faced by the team tasked with building the oversight board. At the beginning, it didn’t look like the project could succeed: “My lens is obviously from a legal perspective, and it's a little bit like when you're a hammer, everything's a nail. I look at a lot of the problems that I was seeing as they were creating this oversight board, and it was comparative constitutionalism, it was administrative law, it was democratic legitimacy.” Facebook continued to work on the oversight board, and as Klonick admits, the team did solve many of the huge constitutional problems presented by content moderation. But she is still skeptical about how this board will scale, whether it will be overrun with appeals, and how the public will perceive its effectiveness.

Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us.

 

Kate Klonick: I was giving a talk to some journalists off the record about the board, and someone just exclaimed, "Oh, my God! This is just a global censorship regime." And I was like, "What did you think Facebook was doing before the oversight board?"

[MUSIC]

David Skok: Welcome to Big Tech, a podcast about the impact of technology on our economy, democracy, and society. I'm David Skok.

Taylor Owen: And I'm Taylor Owen.

David Skok: Today, we're discussing Facebook's new oversight board.

Taylor Owen: Facebook is going through this pretty remarkable experiment right now. In response to this challenge they have, which is how do you moderate and decide what's allowed to be said on a platform where there's a billion pieces of content posted a day, they have developed this idea of creating almost a supreme court-like body, that will sit a bit distant from the company itself, and will make decisions on what is allowed to be said, what gets taken down, and where the limits of speech on the platform are.

David Skok: And this has broader implications, not just for Facebook, but for other social media platforms, for political advertising across the world during elections, and for international laws.

Taylor Owen: Absolutely. I mean, just look at the news in the past few weeks, with Facebook deciding to allow false information in political advertising. Twitter just recently banning political advertising on their platform entirely. All the big global platforms are struggling with this problem. How do you get your head around what is allowed to be said on platforms that are operating at this kind of scale around the world?

David Skok: To help us understand the Facebook oversight board, we spoke earlier this month to Kate Klonick, an assistant professor of law at St. John's University law school, and an affiliate fellow at Yale law school.

Taylor Owen: Kate's a fascinating figure in this. She is a constitutional law scholar and for a long time, she's been arguing that Facebook is looking more and more like a state than a company. And one of the aspects of that from her perspective, is that they need to take on some of these judicial-like responsibilities that a state has. And so they invited her in to watch as they were building just that.

David Skok: So in our conversation, we dig into if it's even possible to moderate content at scale, what are the legalities of doing so, and importantly, what's Mark Zuckerberg really like?

[MUSIC]

David Skok: Hello Kate.

Kate Klonick: Hi, how are you?

David Skok: Thanks for coming on the show. So Facebook found you and asked you to conduct research on them, as they put together this oversight board. It's quite an extraordinary thing, and you're the only person that's been asked. Why did they ask you, and nobody else?

Kate Klonick: It wasn't quite so one-to-one. It wasn't them asking me. I was doggedly deciding to write about this before they came and said that they would allow me to embed at Menlo Park, and watch the team that was building out the oversight board. But I think that one of the reasons they asked me is there's simply not a lot of people who have a legal background and also a tech background, that are writing about this precise area.

And when I decided that I wanted to write something just on this, and just on the oversight board and how it was developed, and made the pitch to them that this was this constitutional moment, I think that they were interested in getting that story out there. I think that there's also a sense that I wasn't exactly a journalist, so I could have some type of empirical removal from what I was doing. I think that's why they asked me. But I tell everyone that I think that there should have been more people in the room. I think that there should have been a lot more people paying attention to this as it was happening.

David Skok: Could you actually take us back, just to set a scene? I know you've written about this in the New Yorker and elsewhere, but what was going on in the world at the time when you were asked to do this?

Kate Klonick: So I started being embedded at Facebook in late May, early June of 2019. And that was probably about five months into their doing a global consultancy project about the board, that was looking all over the world, basically, and conducting thousands of interviews and workshops to try to figure out what to make the board look like. Those had been mostly in private, and there hadn't been a ton publicly released about them. I was trying to get more information about them, and ask people that I knew. And I think that basically the fact that I was continuing to ask questions was one of the reasons that they let me in eventually.

I came in at the moment that they were getting ready to make hard choices on a lot of the things that they'd been hearing from people around the world about what they wanted the board to look like. So that came right before they wrapped up the consultancy period and started to make hard decisions.

Taylor Owen: Wasn't Hard Choices the name of their blog?

Kate Klonick: Yeah. Maybe it's not Hard Choices, but it's something else. But yes-

Taylor Owen: Something like that, yeah.

Kate Klonick: Yeah. It tries to get at the idea that content moderation is this intractable problem, which obviously to some extent, it really is. But yeah, that's definitely something that also came out when they were doing a lot of the consultancy. It felt half like them trying to get information from global users, and also half them trying to explain to the world how hard their job was in moderating content at a global scale.

Taylor Owen: Before going into that, I'm fascinated by how you view yourself in this process. I like that you said you were not quite a journalist, or something, right? And I'm curious how you viewed your methods here. Was this ethnography, or was this embedded qualitative research, or was it access journalism? It seems like it's some mix of all those things.

Kate Klonick: Yes, it kind of is. And I don't think that it's completely perfect. I shy away from ethnography, because that's a very specific type of training.

David Skok: Absolutely.

Kate Klonick: I would say that this is a little bit anthropological. I feel like I'm a little bit writing about a legal system that's getting off the ground as it's happening. My lens is obviously from a legal perspective, and it's a little bit like when you're a hammer, everything's a nail. I look at a lot of the problems that I was seeing as they were creating this oversight board, and it was comparative constitutionalism, it was administrative law, it was democratic legitimacy. All of these big picture thoughts.

But what I see myself as doing, I guess, is qualitative journalism. And in terms of my methodology, there are a couple of really key things that I really try to emphasize, which is I took nothing from Facebook. Not a hotel room, not anything. I had independent grants that funded all of my research, and I was really lucky to be funded by Knight and MacArthur and the Charles Koch Institute. So those three grants came together to be able to support this.

And then I negotiated to have no NDA, and to be able to put all of my interviews and all of the shadowing I did, on tape. So it's all recorded and everything is there for posterity, hopefully, at some point.

Taylor Owen: Can you publish any of that recording? That must be off the record, right?

Kate Klonick: No. Nothing's off the record.

Taylor Owen: Wow.

Kate Klonick: That was the whole point. I knew that because I was the only person doing this, that there was going to be a lot of scrutiny. I think that everything I've written about Facebook has been largely neutral and very, very descriptive. I was just trying to do arbitrage between legal voices and tech voices, and try to bring them together so that they understand each other in a more constructive way. But people tend to always think that if you don't say something negative about a tech platform, or if you say nothing, then you're pro tech platform, which is definitely not how I feel.

Taylor Owen: Something I really... Not to go down the full research rabbit hole here, but that's something I think a lot of scholars studying this stuff really struggle with right now, is this role of outside critic, or inside collaborator, to a certain degree. How do you see yourself positioned in this discussion right now? Because I think a lot of academics are really struggling with where they sit.

Kate Klonick: It's funny, every once in a while I get a call from somebody that wants to hire me at a tech company. I'm like, "Absolutely not." There are a lot of legal academics right now, Taylor, that are taking a lot of money from these companies to do consulting work. And I jokingly said the other day to someone, I was like, "I'm just going to be poor and die writing about Facebook, because I'm just going to constantly be churning out this stuff, because this is an endless news cycle."

Taylor Owen: But you'll sleep well at night.

Kate Klonick: Yes, exactly, but at least I'll sleep well at night. So I mean, there's the money angle, but then there's what I think you're getting at, which is maybe the embedded angle part of it, which is: how do you not turn into Patty Hearst? All I can say, and this is not trying to be precious, but one of my favorite journalists is John McPhee, who used to be an environmental journalist for the New Yorker, and he wrote this amazing book called Encounters with the Archdruid.

It's this three part series that travels through all of these different landscapes with environmentalists, and I guess you would call them anti-environmentalists or developers, or whatever else. And I remember getting to the end of the book and just being like, "I have no idea how John McPhee thinks about this." That's what I go for with all of my pieces. I just don't want them to know how I think about this. I just want to deliver it, if that makes any sense.

David Skok: It does. So we try to do that every day as well. This is something I actually am surprised by. I find I can't walk past the lobby of a tech company without having to sign an NDA. How receptive were they to that?

Kate Klonick: I didn't really get a ton of pushback. They would scoop me up at the lobby before I had to sign the NDA that everyone signs as they walk into the Menlo Park building at Facebook. And I wouldn't sign it. I think they felt that what they were doing required that type of transparency, because the entire genesis of the project was to try to build transparency and accountability. And so if they couldn't give it to us in the process, then it wasn't going to really mean much once everything was standing up on its own four legs.

David Skok: Because you're a professor of law, I have to imagine they were asking you questions, or wanting to pick your brain about things while you were there as well. How did it feel to be a part of that process with them? I know you said you were a neutral observer, so how were they integrating you into that overall?

Kate Klonick: I would ask a lot of questions, but they were more clarifying questions. And some of them were pointed, so some of them would be like, "Have you thought about the fact that this looks just like a civil law system?" Or judicial bodies in many states have ethics boards that review the conduct of members, and you could build those types of rules in. And I would definitely say things like that. But for the most part, I was removed.

They had so many people that were fancier than me consulting them on how exactly to build this. People who have built constitutions before. People who have built nation states before. And so I didn't feel like that was really my role. Or they were maybe a little interested in my observations, especially at the end, because I'd seen so much, and they were looking to be like, "Do you think that we did this right?"

And I would say that there was actually only one moment that I really threw down, and was like, "You have to do something maybe a little bit more robust here." And that was actually around, and this is super dorky, we don't have to get into this, but it was around the removal clause of board members in the charter, because the MIT Media Lab scandal had just happened. And the impeachment had just happened. And I was like, "Listen, let me tell you that removal clauses matter. What actually stands between you and an autocratic government is a removal clause." That was the one moment that I was like, "Okay, think about something more robust." They didn't end up actually making it much more robust, or as robust as I had pushed for it to be, but they did take it more seriously, I hope.

David Skok: How high up in the organization were you able to go? Did you talk to Mark Zuckerberg, Sheryl Sandberg? Did you have access to everybody on the senior team, or were you in a self-contained area?

Kate Klonick: The main team that I was following was a team that was composed of around ten people that were core to the team, and then about ten to fifteen other people that came in. And they were constantly hiring out while I was there. They called them cross-functions. So people that primarily worked in other teams and then did work on the board.

I do have access to Mark Zuckerberg. I have an interview with him coming up in the next couple of weeks, in anticipation of who the board members are actually going to be.

David Skok: In terms of the sales team, or the monetization team, or even the public relations team, I ask that question because I'm wondering: how much of this for them was giving you the freedom to look at the entire organization, versus wanting to control a little bit of the message and the optics, and the potential revenue hit that any of this could have on their bottom line?

Kate Klonick: There was no monetization team. I didn't talk to anyone that was doing that. I had one handler, one woman from PR, for the entire team, who handled all of it. She was the only person, and she didn't ever stop me from anything that I wanted to do. And in fact, she basically worked to make sure that I could be in more meetings if I requested them, and give me more information if I needed more information, or wanted to dig more into one area or another.

But I wouldn't say that it was super managed. In fact, I talked to and continue to talk to people there, and I tell them nothing. That was part of the deal, right? I told them that I would be unvarnished in my views, and be critical. And so the paper that's coming out, it's going to be 80 pages of boring law review article, but it's pretty critical. There are a lot of things that they did really well, but there are some real pitfalls that I think they have coming at them, and I saw some of the problems behind the scenes that could feed into those. But overall, I thought it was a fascinating, really thorough project.

David Skok: So let's talk about the report that you have coming out soon. We'll get to the pitfalls for sure, but first, I guess just stepping back, if you could give us an insight now, what was your biggest takeaway from observing the process?

Kate Klonick: I would say that it was one of these things that I was a little bit shocked I was the only person in the room. It just seemed a very enormous project, and simultaneously you're like, "How big could it be if I'm the only one sitting here listening to this happen?" So I guess I flip back and forth between those two things. I was just like, well, we're in this constitutional moment for all of online speech, and all of platform governance. And this is a thing that's happening, and I have to ask all the right questions and do all of the right stuff. And try not to be too grandiose about it.

And then sometimes you just got to a level in which you're just like, wow, this is everything I ever learned in con law. Everything I ever learned in admin law. Anything I ever learned in governance theory, all coming together at once. And this is just such a hard problem. I think that that was the thing. It was so immense, the project. And then they just kept... Even while it was terrifyingly huge, they just kept going with it. And I think that when I started in June, I did not think it was going to work out. And then they really solved a lot of the huge problems by September. And that was amazing to me.

Taylor Owen: I feel like one of the things Facebook's been trying to do in the last six months or year, when they've been in the spotlight for a lot of this stuff, is just try and get across earnestly how difficult these tensions are. Right? And how difficult this problem of global speech moderation at scale actually is, right? And this process, and so you being a part of it I'm sure, was part of bringing visibility to that challenge. But also bringing some daylight to those tensions, right, and the trade-offs that are at the core of those problems. Did you see some of those tensions play out? And what are some examples of some of the things that they really struggled with? And I guess related to that, what is this thing they came up with? How did they construct this board that was going to be responsible for mitigating those tensions?

Kate Klonick: Yeah, so to your first question, I think one of the great examples came after they finished the global consultancy phase, which included 1,200 responses from their open website where you could tell them what you wanted the oversight board to be. Overwhelmingly, people felt that diversity had to be a huge factor in the board. People just needed to have a diverse perspective. The board members actually had to be diverse. I think in January they had already decided that the board was going to be 40 people, right? And so how do you possibly, possibly pick a global board-

Taylor Owen: Why 40? Where does that number come from?

Kate Klonick: From my understanding, when I asked, it's from business law. That's about the maximal size for a group that is still able to be functional. But it doesn't have to stay there forever. The charter allows the board members to bring in more board members so that they could eventually be larger. But that's how it's initially envisioned. And it'll start out with 11 board members, not 40. But that being said, how do you pick 40 people to represent the globe? That's crazy, right? And are they all like Kofi Annan, or are they Brad Pitt? God, I hope not. Also, if I get to meet Brad Pitt, then that would be totally fine. But how do you pick these people? Are they just everyday users? If you pick one person from Tennessee, that doesn't represent the entire United States. How do you possibly do that?

So here's this intractable problem, right? What does diversity mean? Everyone thinks diversity matters. What does diversity actually mean when the rubber hits the road? What they interestingly did was go to the next-level question in my mind, which was: what is this board? Why do we want it to be diverse? What is the point of it being diverse? It's not a parliament, and so it doesn't need to represent people in this democratic way; that's not why we want it to be representative. For lack of a better comparison, and just because I'm an American and so this resonates: we want it to be representative the way we want women and people of color on the Supreme Court. Not because they are elected representatives, but because we just want them to be representative of our community.

But what's more important is that they have qualifications to be neutral decision makers, that they can apply reasoning and law to a wide variety of facts. And so that's actually what ended up being the guiding structure behind the selection of board members. It became much less about people who are all experts on international human rights law, or people who are experts in content moderation, or even people who are experts in technology or anything else, and became about, "Okay, are you a person that has enough professional training to be able to apply decisions consistently? And then on top of that, we will balance that with ideas of diversity." And so I thought that that was actually a pretty good solution to that problem.

Taylor Owen: I mean, this is something that I struggle with, with this. That it feels like one of the challenges that the Supreme Court idea is trying to solve or mitigate is this disconnect between needing to deploy solutions and product globally for the platform to function, right, at scale, and where that bumps up against regional or local specificity, right? That's where a lot of the challenges with content moderation fall, right, is that they had a single rule book for the whole world, and it turns out that didn't work very well in certain cultural contexts, right? Where they didn't have people working with that diversity, right, in their content moderation teams, for example.

Kate Klonick: Completely.

Taylor Owen: So in Canada, there's been a lot of conversation here about who will be the one Canadian on the board, right? And who is that person who represents this positionality of the country? And I don't think it really matters in the case of Canada, but it probably matters who the person in Myanmar is, right? If there's going to be someone from there. How do these 40 people actually solve that problem? Or is that a different problem? Am I wrong there?

Kate Klonick: No, no. No. You are completely correct. I think that is the problem. Let's start with the assumption that there are 40 people, because that's what it's supposed to be when it's finally built out. And then among those there's going to be a sub-committee that selects cases as they come up out of Facebook and are presented to the board, and it's a writ kind of issue: they pick the cases that they want to hear. And then five-member panels are going to be hearing the cases, and I say hearing, but there are no oral arguments. It's all done in writing. And deliberations are not open, they are closed. But then the decisions are transparent, with explanation, and published.

But one of the things that people really pushed for, and it was the hardest, and I remember they actually changed their mind about this based on the feedback that they got from people, was that there had to be a local or regional member on every panel for the issue that was being deliberated. That's an enormous amount of responsibility, and so I think that the solution they ended up coming to was basically, on the panel level, that there would be a very rough regional representative: a member of the board from the region that the speech or the content at issue was coming from.

The bigger question that I think you're getting at is that they are going to have a very large staff, which is going to basically do a lot of what they call internally at Facebook market research, which is the research of what's happening on the ground and culturally in any one of these areas. And so that team is basically a little bit like a clerk's office, or a little bit like a staff attorney's office, where they're going to be briefing the panel members on anything that comes before them. I actually think this is maybe the more interesting question, because having been a clerk for a judge, I know exactly how much I could influence my judge, and when to hit him with a really hard question on a case, before or after lunch. And so I think that the staff, who's going to staff out this panel and fill in the board members on the panels, is actually going to be a huge concern.

David Skok: I think the thing that I'm trying to reconcile as I hear you both have this conversation is that we're talking about Facebook as if it's not just a nation state but a governing body like the European Union or the United Nations, and in and of itself this is such a leap for me. But then the second part of it is that all across the world right now, there's the rise of nationalism and populism, and the breaking up of these bodies or the erosion of their credibility. And so I find it fascinating that at the very same moment that the EU, UN and others are under real threat, Facebook is almost trying to create a new version of that for its own platform in a digital sphere. I don't know if there's anything to make of that, other than that it's a curious time to be having these kinds of conversations with Facebook.

Kate Klonick: Yeah, I think Taylor knows this about me already. I mean, my whole argument for years has been that we have underestimated, in a certain sense, the amount of governance that these transnational private platforms are doing. I think that we've made a mistake in thinking about them as companies, because they're controlling pretty fundamental human rights. I mean, they're controlling speech, which is maybe our most important fundamental right in a lot of ways. And so I think for me this is a long time coming. It's not just me; there have been dozens of scholars way before me. Rebecca MacKinnon and Tarleton Gillespie and Sarah Roberts and Danielle Citron, all of these people have been saying that there needs to be due process, there needs to be accountability, there needs to be user accountability built into these black box systems of content moderation, into these governance systems.

I just think that we're hitting the tipping point of finally realizing this is what's happening, and it's a little daunting. To your point, I think it's completely daunting. We've been used to government being the backstop for a very, very, very long time, and I just don't think it is anymore.

David Skok: I still come back to the fact that, at the end of the day, Facebook is a corporation. They have a board, they have accountability mechanisms; they're actually a public corporation, which gives them even more oversight. I mean, it's almost depressing to ask, but is this a complete recognition or acknowledgement that the market will not solve the problem?

Kate Klonick: I guess I would put it this way. I think that the oversight board is a market solution, in which I think that Facebook sees the writing on the wall, in the sense that if they fail to... I think until the "Terror of War," also known as the napalm girl incident, Facebook had spent 12 years of its existence trying to ignore that content moderation was its product. And they finally realized that that was its product, and that they were just... all they were doing was taking a billion completely worthless moments or snippets of things, or pieces of content, and it was the integration of them, and the fact that people kept coming online and going to their site, and that they were relevant, that made them powerful and gave them a market.

And so to that end, I think what Facebook has always been terrified of is losing relevance, and losing market share in that sense. And part of that has been the erosion in trust of them since Cambridge Analytica, and since a lot of these other scandals, and since the 2016 US election. Since Brexit. And so I think that this is a very belated attempt to address that lack of user trust and maintain their relevancy in order to maintain their market share, if that makes sense.

Taylor Owen: I agree with you, it's a market response to a changing discourse, right? There's a fundamentally different attitude towards these companies amongst governments and broad populations than there was even two years ago. But I'm wondering about the counterfactual there. What happens if there hadn't been that market pressure? If they hadn't made this, however enlightened, change? It seems to me part of the story here is that governments, or at least democratic governments, and that's a whole other conversation we need to have, but democratic governments have abdicated their responsibility to do this difficult work. Right? To step into the conversation about governing speech in the platform ecosystem.

Which is a wicked hard thing that they just didn't do, right? And because they didn't do it, it was left to this market solution to either happen or not. And maybe it has, and we can talk about how effective it has been or will be, but what responsibility do governments have in stepping away from this, and do you think it would be better if governments did this instead?

Kate Klonick: I really struggle with the question of the role governments and regulations should play here, because I think that from a Western European and North American perspective, basically, we're used to government in this way protecting our rights and our interests as individuals. But there's a lot of places in which Facebook is the way that people route around autocratic governments that don't allow them to speak about certain things in certain ways, or have access to certain types of information. And, for good and for ill, the fact that it's one platform for both those types of places is exactly the wicked hard problem I think that you described.

Taylor Owen: What about where these decisions bump up against existing national laws, or when governments do step into this space, as they're starting to in some countries? Do you see, is there going to be conflict there?

Kate Klonick: Conflict between who?

Taylor Owen: Between the decisions of the board and national law?

Kate Klonick: Oh, actually no. That's actually a very good question. No, the board will not have anything in front of it that is specifically against any nation state's laws. So for example, if it's Nazi propaganda that has been published in France, you won't be able to appeal that to the board to put it back up, for example, because it's against French and EU law.

Taylor Owen: The Nazi propaganda one though, is relatively clear-cut. But so much of this new law coming in is in the gray area, right? So if there's a harmful speech law, surely some of that stuff's going to come up.

Kate Klonick: Give me a for instance, because I'm struggling to see... if you're going to have something that's specifically against a nation state's laws, they just would geo-block it, right? And so they take it down and they geo-block it, and you wouldn't be allowed to appeal it. I guess it's a first order question, if you can't have something going to the board that is illegal in the country of origin. But to the broader question I think you're getting at: what happens if the board thinks that a certain type of content needs to stay up for whatever reason, because of the values of Facebook or whatever it decides to cite, and they think that that should be a global rule that is above a nation state's laws? Is that what you're asking?

Taylor Owen: Yeah, and I could imagine a number of cases of that. I mean, if it bumps up against, certainly in illiberal-leaning regimes, right, you could imagine those values quite explicitly bumping up against domestic content policy. But even in certain democratic countries. I feel like some of the move towards, whether it's duty of care or harmful speech takedowns, leave so much ambiguity in the interpretation of that law, and forces that interpretation onto the companies.

Kate Klonick: Oh, completely, yeah.

Taylor Owen: So that interpretation is mandated by law, but it's being done by the companies, and then it's being overseen by this board. And that just feels uncomfortable to me.

Kate Klonick: Yeah, I completely agree with that. I don't think there's a good answer for that yet. This is part of why I have no idea whether this wagon is going to start off down the hill and the wheels are just going to come off. Some of these problems I think are a little bit foreseeable but a little bit not foreseeable. It's not clear whether that's going to end up being just a thought experiment or a conflict that actually plays out. I tend to agree with you, Taylor, I think that it will be a problem, but I also maybe just wonder if there won't be some upper level of control of just not putting those types of issues before the board, or the board deciding not to hear those types of cases. And so that being a level at which that doesn't become an issue.

David Skok: Kate, I promised we'd get to it. In your report, you outline a few pitfalls in the process. Can you share with us what those are?

Kate Klonick: Yeah, I mean, I have no idea how this is going to scale. I think that's the first one. No one knows how it's going to scale. Are there going to be 300 appeals right off the bat? How is the tooling going to work? And this is me being a super geek, I'm just fascinated by how they're going to tool this, because the board is supposed to be independent and set up through an independent trust, which is a multi-million dollar grant that will last for at least five years. Board members are supposed to be part-time, which means between 100 and 150 hours a year. They're starting with 11 of them and then they're supposed to build to 40. How is that going to work?

I should also have mentioned, they're supposed to do this all within 90 days. That's how long things live on the servers. So this all has to take place from being pushed through Facebook appeals, to being pushed to the board, to the case selection, to the panel arrangement, to the decision being written, to the decision getting okayed by the full 40-member board, because every decision has to be okayed by the full board. And if they don't agree with it, it gets sent to another panel, and they do it all over again. And then, oh, I should also mention, they have to translate it into 88 languages.

So it's crazy amounts of moving parts. Those are the realities of it, the parts that I think are just the very rudimentary "how is this going to work?" But I don't know, did people say that after the Constitutional Convention? Did people say that about the circuit courts, when judges were riding around from town to town on horseback hearing cases? Who knows how this is going to end up.

David Skok: I grew up in South Africa where a whole bunch of actuaries were brought in to rewrite the Constitution, and it's actually not that dissimilar from what Facebook is going through now, albeit obviously on a vastly different scale.

Kate Klonick: No, if it's any consolation, I'm pretty sure that one of the main people that Facebook's been talking to is Albie Sachs. So I think that they are pretty aware of that as a model, so.

David Skok: Right, but I still go back to Facebook being, at the end of the day, just a media company, and not a nation or some international institution.

Kate Klonick: I'm curious, do you still think that Facebook is a media company?

David Skok: Well, I have always thought that Facebook is a media company, and I don't think I've wavered on that. I think that because they make editorial decisions all the time, whether it's done through other parties or through them. As you eloquently wrote in your New Yorker piece about the content moderators who are having to make decisions about removing a livestream of a shooting, those are the kind of decisions that we've had to wrestle with in what we put in front of our audiences on our pages for as long as I've done journalism. So for me, yeah. I've always considered Facebook a media company, and I still do.

Taylor Owen: I guess the question is, should they be regulated like a media company?

Kate Klonick: Early on, they had moments when they called themselves a media company, and they haven't done that in a while. And I think that was a purposeful departure of language. I think that whether as a media company or as a governance structure, they wield just a tremendous amount of power in shaping the access that people have to knowledge and information. And I do think that how we decide to regulate it ends up being shaped by which one we think it is, unless we really create a new way to regulate it, or a new way to think about it and a new way to go about dealing with platforms. Because I think, David, as you said earlier, you're in this moment, and I think a lot of people are, of just realizing how powerful these platforms are, and things like the EU and the UN are being threatened, and in the meantime, Facebook is just off to the races with their new constitution. And so it's a little terrifying.

Taylor Owen: Beyond that, how do we in democratic societies hold power to account, and that particular type of power to account, right? And make sure they're aligned with our existing norms or democratic processes that are already in place, right? And I'm not totally sure we do, which gets to the question about this board, I guess. There are many accountability challenges and governance challenges with platforms, as you've talked about for years, and content moderation is one of them. Right? There's a whole host of other governance issues. And I guess I'm wondering, as we get to the end here, which problems do you think this solves?

Kate Klonick: Yeah, god, it's a really good place to end, because I think that your point is exactly right. I don't think that this solves the problem that you just raised, which is, is this still actually accountable? I was giving a talk to some journalists off the record about the board, and someone just exclaimed, like, "Oh my God, this is just a global censorship regime." And I was like, "What did you think there was before? What did you think Facebook was doing before the oversight board?" The oversight board is, in theory, and I think will be in practice, a way to build at least some level of accountability or transparency into this. And if it's not, I think that Facebook won't be able to hide that; the robustness of it and its legitimacy will show in the way that it takes off.

That said, I don't know that it's actually going to have teeth, and that is the real thing to see over the next couple of months and years, is just whether or not this actually shows us behind the curtain everything that's happening inside these platforms.

David Skok: Do you believe that Facebook is sincere? That the leadership of Facebook is sincere in both understanding the gravity of the situation, and the impact their platform can have around the world? And also in wanting to clean it up?

Kate Klonick: I think Mark is sincere, let's put it that way. I think that Mark Zuckerberg is sincere. I think that he actually wants to clean this up, and wants to do the right thing. I don't know what upper-level management's motivations are. I think, though, that they're very different from Zuckerberg's. And I know that Zuckerberg had a lot of pushback, tons of pushback from the board and from upper-level management in creating the oversight board. And he nonetheless went forward with it, and insisted on it happening. And one of the reasons that he's made so many public statements about it, and made those statements in April of 2018 to Ezra Klein and to other people, was because he wanted to start going on the record and putting it out there and shaping expectations, and making it hard for people to push back on him, frankly.

I would say that his motivations as the founder and the majority shareholder are different than the people who are jockeying for power at the upper echelons of a tech platform, just by the very nature of their incentives. I guess that's all I can speculate on.

David Skok: Well Kate, it's been such a treat to talk to you and to learn some of your insights, which are, to me at least, quite surprising in revealing the depth with which Facebook is treating this. So I'm thankful that you've taken the time to talk to us, and I'm hoping we can speak again soon.

Kate Klonick: Thank you so much.

David Skok: That was Kate Klonick, assistant professor of law at St. John's University Law School, and an Affiliate Fellow at Yale Law School. She's the only person embedded with Facebook and researching their process of building an independent oversight board for moderating content. Her report on that will be out shortly.

[MUSIC]

Taylor Owen: Dave, I don't know what you thought about that, but I was struck by how unique she herself thought the position she was in was. The access she had been given, how open the company was with her. But I didn't feel that the core tensions that this process is trying to engage with, around the scale of the platform and the complexity of these content moderation problems, are necessarily going to be solved by this process. I think it'll be fascinating to watch this experiment as it plays out over the next few years.

David Skok: I am in awe of the access she was given. It sounds like Facebook really opened up their doors to her in a way that, as a journalist who's been covering this company for such a long time, that access is just not something that they've given to others, and so I'm both in awe of it and envious of it, and a little miffed to be honest.

Taylor Owen: It also comes with complexities, right? I mean, why her? What power are you giving to a person? Why were they doing it? It's not an easy position to be in on her side.

David Skok: The cynical part of my brain thinks this is very much a part of an effort to control the message, and to control the way that they are viewed. And it’s a really shrewd PR play if that’s all it is. I'm still on the fence as to whether it is a shrewd PR play, or whether it's a sincere effort to combat the hate and misinformation that happens on their platform.

Taylor Owen: Yeah, I mean I think at the end of the day, they really want the public to know both that these problems are really hard and complicated, and there aren't clear solutions to them without difficult trade-offs. And that they are trying really hard to reconcile that. And it could be that revealing the process they went through to create this oversight board serves both of those purposes.

David Skok: That's it for now. I'm David Skok, founder and editor-in-chief at The Logic.

Taylor Owen: And I'm Taylor Owen, senior fellow at CIGI and professor at the Max Bell School of Public Policy at McGill.

We hope you enjoyed this conversation, and if you did, please subscribe on Apple Podcasts. And we'll have more episodes coming soon. Bye for now.

[MUSIC]

Narrator: The Big Tech podcast is a partnership between the Centre for International Governance Innovation, CIGI, and The Logic. CIGI is a Canadian non-partisan think tank focused on international governance, economy and law. The Logic is an award-winning digital publication reporting on the innovation economy. Big Tech is produced and edited by Trevor Hunsberger, and Kate Rowswell is our story producer. Visit www.bigtechpodcast.com for more information about the show.
