
Season 2 Episode 10
Everybody Cares about Democracy and Technology: David and Taylor Look at the State of Big Tech Governance

S2E10 / August 27, 2020


In this episode of Big Tech, co-hosts David Skok and Taylor Owen discuss how our understanding of the impacts big tech has on society has shifted over the past year. Among these changes is the public’s greater awareness of the need for regulation in this sector.

In their conversation, David and Taylor reflect upon some of the major events that have contributed to this shift. The COVID-19 pandemic highlighted the need for better mechanisms to stop the spread of misinformation. And it has shown that social media platforms are capable of quickly implementing some measures to curb the spread of misinformation. However, the Facebook Oversight Board, which their guest Kate Klonick talked about in season 1, is not yet operational, and won’t be until after the US presidential election; even then, its powers will be limited to appeals rather than content oversight.

In July 2020, the big tech CEOs testified in an antitrust hearing before the US House Judiciary Committee’s Subcommittee on Antitrust, Commercial and Administrative Law. “That moment,” Taylor Owen says, “represented a real turning point in the governance agenda.” The growing big tech antitrust movement shows that lawmakers, now better prepared and with a clearer understanding of the issues, are catching up to big tech. The public is starting to recognize the harms alongside the benefits of these companies’ unfettered growth. In season 2, Matt Stoller spoke with David and Taylor about monopoly power, and how these modern giants are starting to look like the railroad barons of old.

From diverse perspectives, all the podcast’s guests have made the point that while technology brings real benefits to society, those benefits do not cancel out its harms: appreciating the many things that platforms and technology bring to our lives does not mean we can give them free rein. As Taylor explains, “When we found out the petrochemical industry was also polluting our environment, we didn’t just ban the petrochemical industry and ignore all the different potential positives that came out of it. We just said you can’t pollute any more.” With the technology sector embedded in all aspects of our democracies, economies and societies, it’s clear we can no longer ignore the need for regulation.

Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.

 

David Skok: Hello, I'm David Skok, the Editor-in-Chief of The Logic.

Taylor Owen: And I'm Taylor Owen, a senior fellow at the Centre for International Governance Innovation, and a professor of public policy at McGill.

David Skok: And this is Big Tech, a podcast that explores a world reliant on tech, those creating it, and those trying to govern it.

[MUSIC]

David Skok: So, Taylor, this is our last episode of season two of Big Tech.

Taylor Owen: It is. We've done 19 conversations over the last year, diving into all aspects of this pretty big topic.

David Skok: And given that this is our last episode of season two, it's also a good time to tell you that this is my final episode of hosting Big Tech.

Taylor Owen: Yeah, we've had a great time diving into some of these big issues together, and it's been a real learning experience for me, both in doing the podcast, doing it together, and learning from you on how to host this kind of thing.

David Skok: We wanted to spend this final episode, having a conversation, recapping some of the big issues that we've covered throughout the two seasons of Big Tech. Taylor, as you all know, is an expert in many of these issues. And I wanted to hear from him what he thought about where we are going and how technology is changing through this incredibly intense and monumental time. So let's dive right into it, Taylor. When we started this podcast, it was pre-pandemic. Nobody had heard of COVID-19. I'm curious how you think the issue of technology companies has evolved over these two seasons, but particularly from the pre-pandemic to the middle of the pandemic. What is the state of big tech in the fall of 2020?

Taylor Owen: Yeah, I mean, it really is remarkable. When we got together, over a year ago now, to talk about doing this show, you had already started a publication reporting on this topic. I had been studying this topic, and I think we both thought it was something that needed to be highlighted and focused on, both journalistically and academically. I don't think either of us could have imagined that the topic would have been thrust into the public debate via the pandemic in the way it was. And like other policy areas, I think in some ways the pandemic has sped up the conversation. A lot of issues that were brewing under the surface, whether it was economic inequality or, in this case, the power of digital platforms and firms, accelerated during the pandemic. And I mean, even just the very simple fact that we're all now living online, we're doing this online. Our kids are going to school online. The government's functioning online. Even just that fact has sped this process up. But at the same time, as we have moved online, and as the global economy is nearing a potential historic depression, there are companies that are doing very, very well. And those companies are the tech companies, the very companies we wanted to focus this show on. David, you've been reporting on their size. Can you give some perspective on the scale at which these companies have grown over the past six months?

David Skok: Well, yeah, and that really is something I'd love to dive into with you. When we talk about the tech companies, let's first talk about their market cap and their valuation. These are companies that now have a combined market cap of $5 trillion. They have really been the stalwart of the stock market during this period. And I can recall in early March, the market was crashing, and I thought, and I'm sure others felt at the time, that this was it, the bubble was bursting and the stock market was going to correct itself. Well, that didn't happen. In fact, after a drop, the market went right back up to where it was, and is now once again seeing record highs. And what's been remarkable about that is how few companies have actually driven that ascent. So Taylor, maybe you can talk about that. The financial gains that these companies have had, relative to not only their competitors in the market, but also just the overall market. How did we get to a place where that could happen?

Taylor Owen: I think it reflects the nature of the problem that a lot of people have been flagging for a long time, which is that these companies aren't just digital media platforms, social media platforms, delivery companies; they are embedded in all aspects of our society and our economy. And what that means is that as we move more digital, they capture more and more of the global economic activity, because they are many things. Their sinews are embedded in all aspects of our society. So the fact that they have grown at this scale is a reflection of them sitting in so many different markets, such that growth in all of them is being consolidated in a few of these companies. And that same problem is reflected on the governance side too; what makes them so hard to govern is that these companies aren't just one thing. Even if a company gets very big – if you are just a car manufacturer, for example – it is not that difficult to govern it. We've done this before. But when you are a car company, a bank, a health service provider, a social media platform that governs the free press, if you are all of those things together, it is very difficult to govern. So it's this combination – the market reach and diversification of a small number of firms, and the inability to govern them, which allows them to grow without any guardrails – that has sort of got us to this point, I think.

David Skok: And yet, if you look at even sentiment surveys for all of these companies, or you look at the public market and their usage, most people would tell you that without these companies, they probably couldn't have gotten through the pandemic as well as they did.

Taylor Owen: Yep.

David Skok: How do you reconcile the public opinion versus the public policy, on approaching big tech? If you're a politician and you know that your constituents love Amazon. In fact, they couldn't have gotten through the pandemic without Amazon. How do you go to them and tell them, we need to regulate this company? I have yet to hear an answer from any politician or any expert that actually addresses that fundamental discrepancy.

Taylor Owen: Yeah. The challenge is that, of course, these companies provide incredibly useful services. The ability to get free information via Google, the ability to get packages delivered the next day, or the next hour, from Amazon. All of these things are incredibly useful, but they also potentially have negative side effects. And it's the role of governments to figure out how to engage with the negative things that are affiliated with these, or any, companies in society, while maximizing the benefits. And many of the policies being proposed in this space actually don't harm the positive aspects of this; they just address the downside risks. So I think that's the core of how we have to frame this conversation. How do you get at these negative things that are increasingly apparent to everybody, right? So you mentioned popular sentiment, and popular sentiment's actually in a radically different place now than it was two years ago. It is shifting quickly. People are seeing the negative costs of some of these companies. They're just not seeing a cohesive agenda on the policy side that targets those negative things without undermining the utility that these companies clearly provide.

David Skok: Yeah. You make a good point. I mean, when I think about my own family, the benefits of the technology are there, but also during the pandemic, I have seen the shortcomings of it when it comes to screen time, for example. My son's willingness to engage with screens, and obviously the addiction that comes with that, and having to manage that. But there's a lot more, and I think if there's one episode that I wish we had done on this podcast, it's one on children and technology, and the dangers and perils of being trapped in this virtual world. That seems to be one area of policy where, if I were a policymaker, I would say, look, ultimately this is about protecting your kids. I don't think there's really been a strong effort in that approach, other than maybe on the YouTube side. And then of course, what has become quite stark during the pandemic is fake medical news. That has really sprung at least Mark Zuckerberg into action. But I'm curious if you could talk about those two things: how children are affected by technology, and how fake news has, since the pandemic, become a more crystallized and stronger argument, because it is about medicine.

Taylor Owen: Those are two such good examples, right? Of the visceral way people are seeing the harms now. It's very different to be told that consolidation in online markets is leading to a lack of competition, and that's a problem, than to be told that fake information about a global pandemic is circulating widely and leading to changes in behaviour that are harming society. Those are different things, right? And one you can really viscerally feel. And similarly with kids and tech, I think you pointed out this tension, which I think everybody is feeling. I have a small child too. And at the same time, all of these online tools are allowing us to continue to work, and particularly women to continue to work, to a certain degree. I mean, if this pandemic had happened even five years ago, the amount of gender disparity in who was pushed out of the workforce would have been even more severe than it is now. There's no question about it. These technologies are allowing us to imperfectly continue to function as a society and an economy. At the same time, we're seeing the effect it's having on our kids. I mean, kids are spending way more time on screens, whether it's through school or otherwise, and that is changing their behaviour. And we don't know enough about how that will also change their mental development, their cognitive development, the way they function socially, and parents are really worried about that. So, if those two things are even remotely true, that technologies are having an adverse effect on children, and that they are crippling our response to a global pandemic, clearly this is a space we need to be talking about. What kinds of regulations and policies need to be in place to correct for those negative outcomes of our current digital economy? Right? That is not a big reach for me, to think we should be having that conversation. But another theme that's really emerged through our conversations is the way governments have not wanted to have that conversation. That they have, in many ways, enabled the growth of this digital ecosystem through their lack of regulation. And because of some of the sensitivities, particularly around things like free speech, the rightful sensitivities, and about wanting to make sure we're maximizing digital growth, governments have been very reluctant to step into this space. Now that's changing. But I think it's pretty clear across our 19 conversations that a real common consensus from everyone, from people in the biotech space, to people in the digital finance sector, to people in the content moderation conversation, is that everybody wants more democratic engagement in this conversation, and hopefully that's coming.

David Skok: So, I have to ask Taylor, last month when the big CEOs appeared in the antitrust hearing, Zuckerberg, Bezos, Cook, and Pichai, what was your impression of that moment? Not the theatrics of it per se, but what did you actually... If I'm going to reframe this question, I would be more interested to hear what you thought of the politicians asking the questions than of the answers you got from the CEOs.

Taylor Owen: Yeah. I think that moment represented a real turning point in the governance agenda. The questions were specific and smart, many of them, and that committee will be making recommendations and issuing a report that are going to create a roadmap for potential antitrust action against probably at least two of those companies. And I think it has already changed the landscape. So, this is the remarkable thing about the policy process: even the threat of action, the legitimate threat of action, from governments can have a real impact on private sector behaviour. So already, since those hearings, we have seen a number of companies push back against Apple's app store policies. We have seen no large company, other than Microsoft, and none of the big tech platforms that were testifying, try to buy TikTok. That would have been different five years ago. And that is because there is now attention being paid to market concentration by legislators, serious attention. And so I think we are just in a different moment, where governments are taking this more seriously. That means staff, and civil servants, and policy makers, and the policy apparatus, are taking it more seriously. It means the academic community is engaging with those policy makers more. And so we are seeing a different conversation. How far that goes is an open question, because there are a lot of other competing variables here, not least of which is global geopolitics. But I think we are in a different moment than we were a few years ago.

David Skok: So, one of the arguments Mark Zuckerberg made at that antitrust hearing, and he's made it before, is that essentially you're either with us or against us, that Facebook is America's best hope against China. You mentioned TikTok. Really, since that hearing, we've seen the Trump administration take some very dramatic steps against not just TikTok, but also Tencent. And long term, I'm curious, is Mark Zuckerberg right? Does he have a case on this?

Taylor Owen: Where he is right is that an internet governed by democracies, one that values free speech over autocratic control and power, is undoubtedly better than the competing model that's emerging, which is a Chinese-developed, and now globally exported, stack of technologies that allow for centralized control. Where he is wrong, I think, is in how best to achieve that, for a number of reasons. One is that many of the current technologies governed by democracies, notably Facebook, Google, Amazon, are inching towards many of those same autocratic tendencies. They allow for centralized control. They allow for what I would argue is an illiberal governance of speech, where we don't know why voices are amplified, whose voices are amplified, whose are paid to be promoted. That very infrastructure has many of the same failings as the fundamentally illiberal infrastructure being developed by China. And so, in my view, the answer to the Chinese model is to make ours more democratic, not to allow it to scale in a less democratic way. Two interviews we did really drove this home for me. Joe Stiglitz, when he flagged the challenges of this consolidation of economic and political power being fundamentally undemocratic. That it's not just the scale of these companies and their market caps, but also the political power that that scale wields, or leads to.

[CLIP] Joseph Stiglitz: One of my major concerns is the impact on distribution and political power, particularly in a country like the United States where money matters so much in politics. So the point is, in our society there's a kind of accountability, except for these big tech companies. And why? Because they've used their political power to free them from that accountability. And of course that means more profits for them.

Taylor Owen: I think that's where we're at here. And we need to push back against that, not just because of the economic power, but because it leads to undemocratic rule. And even more pointedly, Maria Ressa, who made the really provocative case that it's not just that these platforms enable the free speech of autocrats, but that there are actually illiberal and autocratic tendencies embedded in the design of these companies themselves.

[CLIP] Maria Ressa: Regardless of how they paint themselves, they are behavioural modification systems. That is what they have become.

Taylor Owen: And that's, I think, a profound statement on where we need to go. That this cannot be a game of whack-a-mole of bad actors, going after bad actors using these tools, but we need to focus our attention on the structures of the companies themselves.

David Skok: So all of that sounds wonderful in a vacuum, but we are living through, arguably, the biggest economic crisis since the Great Depression. Yes, the stock market, as we talked about earlier, is on fire, but people don't have jobs. And when governments get into these situations, they look for stimulus. Some of that is going to be around the green economy. Some of that's going to be around income supports. But I'm wondering about the idea of digital infrastructure, which most people probably don't even know about. What is it, and what role can government play in building up a new economy, a New Deal of sorts, around digital infrastructure that maybe even works around these companies? And doesn't do what governments have been doing, which is relying on these companies to build it on their behalf.

Taylor Owen: Yeah. I mean, I think that's a huge question we're going to be wrestling with over the next decade. It has a few components that I think are worth flagging here. One is just literally the physical infrastructure of access to these technologies, and access to the digital ecosystem itself. I've just had this remarkable experience where we're living in a house that's about five minutes from a town in BC this year, due to the pandemic, and we can't get broadband. And as someone who's trying to function in the digital economy and live online, it makes my life very, very difficult. And I can only imagine the challenge to those who are both less financially capable of getting around that system and are in places where there literally is no other alternative, and that is big parts of the country. So even just getting our digital infrastructure up to a point where people can get it and afford it is a big problem. A second big issue is, yes, coming out of the pandemic, and coming out of either the recession or the depression that it has caused, we are going to see a refocusing of government on large-scale projects. Whether they be social supports, new economic systems, or new large-scale infrastructure projects, they are going to need money. So where are governments going to look for money? Well, most aspects of the economy, the traditional people and organizations that have provided tax revenue to governments, are declining. Whereas, as we just talked about, the tech sector is booming. So governments are going to be looking to the tech sector for revenue. And right now, large platform companies are almost entirely untaxed. This is a huge problem, and it exists in two different spaces. One, these companies don't pay sales tax in the places where their goods and services are bought and sold. So this is the famous Netflix tax issue in Canada. And the second is corporate tax. Because of global corporate tax arbitrage, most of these companies, these large platform companies, are based in Ireland, where there is very little to no corporate tax. And so they are just literally not paying corporate tax. So Amazon, and Apple, and Facebook, which have grown exponentially over the course of the pandemic, will be paying very little tax, if any, to the governments in whose jurisdictions they operate. And so that's going to have to change. And the final piece, and I'm rambling on here, is that if we see all of this as our digital infrastructure, if we see that Facebook and Google are the places where our media exist and where we share information, if we see Amazon as our marketplace, then we are going to have to start governing them with a scale and seriousness similar to that with which they function in our society. And that is where we come back to the policy conversation. It is untenable to cede governance and the oversight of so many aspects of our society and our economy to companies that exist outside of our jurisdiction and outside of our democratic control. And that is going to change. So if you think this is infrastructure, if this is meaningful infrastructure, then we need guardrails. And I don't think that's a radical thing to say. It's a very difficult conversation, but I don't think it's a radical thing to say.

David Skok: Just a couple more things, we're running out of time here, but I wanted to get your thoughts on a few tactical things before we go. One is the Facebook Oversight Board, and how it's played out. We had Kate Klonick on very early on. And just to recap for everyone, Facebook announced an oversight board, which, for lack of a better way of saying it, is effectively a supreme court overseeing Facebook.

[CLIP] Kate Klonick: I would say that it was kind of one of these things where I was a little bit shocked. I was the only person in the room. It just seemed an enormous project, and simultaneously you're like, how big could it be if I'm the only one sitting here listening to this happen? You know?

David Skok: I'm curious Taylor, if you can update us on how it's going, where it is, and what role you see it playing in the future?

Taylor Owen: Yeah. It's been fascinating to watch. And I think we were really lucky to get that window into it in the fall. And I think part of what's happened is that the board has been eclipsed by events and the scale of the problem. The board, it's very clear, their mandate is highly limited. They're just looking at a few specific cases of things that were taken down that someone thinks shouldn't have been taken down. They're not even allowed to look at things that stay up that people think should be taken down, right? It's only a complaint-driven process for someone who thinks they've been falsely censored. And at the same time, we've seen the complexity and the magnitude of the content moderation problem, with a clarity that shows the real limits of that kind of limited private governance model. Even just take the American election. I mean, you have false information about a global pandemic, and about the political actors in that pandemic, circulating widely on these platforms, being amplified by the President of the United States. None of that is under the jurisdiction of the board, for one, and the board isn't even going to be formed until after the American election. Arguably, the single biggest test of platform power over our communication infrastructure that we've ever seen. So I have real mixed feelings here. In one sense, yes, it's good it exists. It was created due to a lack of democratic governance, right? Governments weren't stepping up to tell companies how to govern, which is the fundamental failing and the root cause of this, I think. And due to public pressure for the company to do more about what the public was increasingly perceiving as a problem, which is hateful and harmful speech circulating widely through our public sphere. But the solution just feels both far too limited and not democratic enough. Government should be involved in this. And it has really limited utility against the core structural problems that exist here, which is that false information, and the ability to manipulate the behaviour of users, is still a core attribute of the infrastructure that we rely on. So, I mean, it feels like it just hasn't had the impact it was designed to have, because it was improperly designed, and it doesn't even come close to getting at the problem it pretends to solve.

David Skok: Okay. A couple more questions for you, just a couple of rapid-fire ones that I'm hoping you can get at. One: throughout our two seasons, we've covered so many different areas, and all of them seem to come back to the idea of regulation. And you've mentioned and talked about this a lot. Just a visual image, if you can paint one for us of a coffee table: how many legs would this regulation piece need to have, and what are they?

Taylor Owen: I think there are three legs. One is content policy. Governments, democratic governments, need to engage with what they think should, and should not, be allowed to be said on digital platforms. How that is going to be determined and overseen, and how the design of the platforms themselves enables new kinds of speech that may or may not be desired in our society. So content is number one. The second leg is data. We live in a world where intangible assets, such as data, and access to data, and control of the tools that can make sense of data, are of growing economic value and societal impact. And we probably need to revise our data policies to get a handle on that. Those are things like our IP policies, our data privacy policies, our ability to hold companies liable for breaches. And the third is our economic policy and our competition policies. We, and governments, need to figure out: are our national economic interests aligned with those of Facebook, Amazon, and Google? And maybe they are. Maybe an unregulated digital economy is in our national interest. I suspect it is not, but I think that's a conversation we need to have.

David Skok: It sounds so simple. And then you start to think, well, in the Canadian context, content policy falls under Heritage. Data policy falls under ISED, and maybe the privacy commissioner. Economic and competition policy falls under the competition commissioner. So you start to see how complicated all of this is, in having to come up with a policy framework for these companies and how they're approached. And I think, or at least hope, that through this podcast, over the last two seasons, we've been able to shed a little bit more light on some of these areas, to give people a sense of just how complex this is, and why it's so hard to get it right. One last question for you. Well, maybe I'll parse it into two. The first one is, which of these companies scares you the most, or are you most concerned about? And the second one is, what are you actually optimistic about? What do you believe these companies get right, or will get right in the future?

Taylor Owen: I mean, look, there are attributes of many of the companies that we've talked about through these two seasons that concern me. The core issue that I think is most urgent and pressing, and is present in various ways in all of them, is the lack of integrity in our public sphere. And I think this strikes at your world too, in journalism. We had, for a long time, outsourced the control of quality information inside our democracies, and came to rely for that role on traditional journalistic institutions. And as you know, far better than me, that model was both deeply flawed and unsustainable in the current digital economy. And its decline aligned, I think, with the interests and structures of these companies, which don't prioritize quality information over content that engages or drives attention. We need to figure out, as a society, how we are going to raise the money to produce quality information and control the ecosystem that distributes it. I mean, democracy ultimately depends on that. The thing that concerns me the most is the impact on our democracy of this very fragmented public sphere. As good as it is, as great as it is, right, that way more voices are heard and way more people are participating in the public sphere, the vulnerability of that ecosystem to manipulation and hijacking is really, really worrying.

David Skok: Okay, now I'm going to put you on the spot. What do you like about these companies? You're not getting away with... On that note, we're going to leave this with an optimistic tone about these companies.

Taylor Owen: That's easy. I mean, I like a lot of what these companies do, right. I think it's amazing that more people can speak than ever before. That I can hear voices from all over the world. That I can participate in a digital economy in a way that I never could before. That we can do this podcast remotely together and reach audiences. That I can order things and they arrive at my door the next day, even though I'm living in a rural environment right now, right? The upside is tremendous. But saying there's upside doesn't mean we shouldn't address the downside. So you constantly hear, well, we're a majority good, so don't worry about the bad. But in no other aspect of society do we say that. When we found out the petrochemical industry was also polluting our environment, we didn't just ban the petrochemical industry and ignore all the potential positives that came out of it. We just said you can't pollute any more. So I actually don't think it's much more complicated than that, at the end of the day.

David Skok: Well, I'll just say, on a personal note, it's been a real thrill over these past 20 episodes to hear you. Yes, our guests, but I continually learn something from you every time I hear you speak. And living through this period over the course of this podcast, it's been fascinating to kind of be along for the ride, as we've tackled some of these questions. And as you have explored intellectually some of these things with our guests and with our listeners. So I'm grateful that you took me along for the ride on this journey.

Taylor Owen: The feeling is very mutual, and this has been a remarkable experience, and learning experience for me too. And I think the goal is to try and continue to have these kinds of hopefully thoughtful conversations about this complex digital world we're all struggling to understand. And to do it as well as I can on my own. But it's been a real thrill and pleasure to do this together over the last 20 episodes.

[MUSIC]

Taylor Owen: Well, that's it for season two of Big Tech. I want to thank Dave for two great seasons. We're going to take a short break and I'll be back in the fall. Make sure you hit subscribe, so you don't miss episodes when they're posted.

Big Tech is presented by the Centre for International Governance Innovation, and The Logic, and is produced by Antica Productions. I'm Taylor Owen, and I'll see you in a few weeks.

