Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.

 

David Skok: Hello, I'm David Skok, the Editor-in-Chief of The Logic.

Taylor Owen: And I'm Taylor Owen, senior fellow at the Centre for International Governance Innovation, and a Professor of Public Policy at McGill.

David Skok: And this is Big Tech, a podcast that explores a world reliant on tech, those creating it, and those trying to govern it.

 

[CLIP]

David Cicilline: The purpose of today's hearing is to examine the dominance of Amazon, Apple, Facebook and Google. Amazon runs the largest online marketplace in America, capturing 70% of all online marketplace sales…

SOURCE: House Judiciary YouTube Channel

https://youtu.be/WBFDQvIrWYM

“Online Platforms and Market Power, Part 6: Examining the Dominance of Amazon, Apple, Facebook, and G”

July 29, 2020

 

David Skok: On Wednesday, Jeff Bezos, Mark Zuckerberg, Tim Cook and Sundar Pichai all appeared in front of Congress as part of a landmark antitrust hearing. The four men lead companies with a combined market cap of almost 5 trillion dollars.

Taylor Owen: What this hearing means for Big Tech still remains to be seen but -- at the very least -- it’s a sign that politicians in the US are beginning to take these issues more seriously. While the US has largely taken a hands-off approach to governing big tech, other countries are moving forward with broad platform governance agendas. Our guest this week, British Conservative MP Damian Collins, has been at the centre of this debate. Domestically, he led a prominent British committee investigation into Cambridge Analytica’s role in the British elections. And internationally, he’s been at the core of a network of parliamentarians who are collaborating on how best to govern big tech. Called the International Grand Committee, it includes members in Canada such as Nate Erskine-Smith and Charlie Angus. And I actually appeared before it when it held meetings in Ottawa last year. Together, these parliamentarians are collaborating on how to take on the tech giants.

David Skok: We sat down with Mr. Collins on Tuesday just ahead of the antitrust hearings in Washington to talk about his ongoing efforts to regulate the platforms.

 

David Skok: Damian Collins, welcome to Big Tech.

Damian Collins: Thank you. It's great to be with you.

David Skok: So, we're recording this ahead of the antitrust hearings tomorrow. I'm wondering, what would you like to see come out of those hearings tomorrow? What are you expecting, as well?

Damian Collins: Well, as someone who's questioned tech companies and watched other people do the same, what happens on the day is often far short of what you'd like to see happen. Often, it's a process of evasion of answers, but I think the stakes are being raised against the tech companies now. I think as a consequence of competition investigations, I think a much more heightened level of interest now about platform policy against harmful content, much more pressure on the companies to take decisions. If you think in terms of taking decisions about speech, we've seen the companies already take decisions I don't think they would have made a year ago. I think coronavirus has had a big impact on that, and I think the pressure is on Facebook in particular because they are not moving as fast as some other companies are, as well. I think it puts them in a more exposed position, so I'd expect to see questions around that too. I saw today that the UK advertising industry press is reporting that digital is now more than half of the world's ad spend, and when that is dominated by two companies, I think you can see why these antitrust questions come up. And in Britain, the Competition and Markets Authority report has not only highlighted that, but also said the response to that should be regulation. I think that is the big word I think we're seeing now in different spaces that these problems require some form of external regulation to hold the companies to account, and these hearings are happening at a very interesting time, but I would see them as part of an ongoing process, but one that I think that is moving against the companies and towards a position of probably more intervention and more regulation down the line.

Taylor Owen: So, you know David Cicilline well, I'm sure, and he's been chairing this process for going on a year now. And these hearings this week are really a culmination of their investigations. Do you think he’s heading to the point where he is going to recommend trust-busting or breaking up these companies? And part of the challenge, I guess, that I’d love to hear your thoughts on is, these are ultimately very different companies, with different business models and potentially different sets of anti-competitive behaviours. Where do you think this committee is heading?

Damian Collins: I think it's a good question. The defence of the tech companies is that they're different and indeed they have competing interests. Apple talking about blocking ad tech on all its devices shows that it can quite easily go to war with companies like Facebook and Google because its business model is so different. I think that will be their main defence: "Well, actually, there's no antitrust situation, because we've got people who are dominating different bits of the market, but they also compete against each other. This is not Standard Oil or Microsoft or IBM; this is something that's very different." I think the price big companies pay for not being broken up is that other bodies are created to preserve and protect the consumer interest. I think what we'll have to move away from is the idea that these are problems the companies either can or will solve themselves, because they've really not shown any... The only desire they have to act is to gain, I think, a bit of an advantage over each other, so that they're not the worst offender. I think Google's been very clever at always staying one step ahead of Facebook in terms of its approach to regulation, and in listening and cooperating with governing bodies, but I think the direction of travel is slow, and I'd be fascinated to see what David does recommend. But from my point of view, I think creating bodies to establish the standards that we want to see may be an easier route than trying to look at breaking these companies up.

Taylor Owen: Yeah. I mean, that's a fascinating aspect of this, is you can actually get to some of the competition policy outcomes through other forms of policy, right? Whether it's data privacy... And so you think then that those are more effective avenues into the anti-competitive practices than breaking them up?

Damian Collins: They could be easier to achieve. I certainly think, looking from a European point of view, that will be a more likely outcome. And I think we've seen that as well in some of the antitrust actions the European Commission has taken against some of these companies, particularly Google on search, which is, there are fines and punishments, but also just to stop them doing the things that are creating an unfair marketplace. We need to be certain that Amazon is not creating an uneven playing field whereby retailers that don't have a relationship with Amazon find it hard to sell on their platform. If you've got a dominant position, there should be a fair marketplace and there needs to be a degree of transparency around how that marketplace operates. I think the same for the ad market as well, about pricing and transparency. The issue there, which I think the traditional media faces, is less that there's not enough competition amongst online advertising platforms; it's that the digital space is dominating all other forms of media to their detriment. And I think we may have reached a tipping point on that. Certainly from Europe, I think it also spills into tax policy. And I think coronavirus accelerates that issue as well, which is that, if online retailers are booming and yet the traditional bricks and mortar shops and businesses are going bust at an increasing rate, and yet tax policy massively favours online retailing, then how can that be right? Where actually our tax policies are accelerating the problem of the retreat of high streets and the strengthening of online retailers. I think it makes it more likely, therefore, that we would look at the idea of online sales taxes. There are different policy levers you can pull, and in some ways they each address different forms of disruption caused by the success and strength of these major companies.

David Skok: Yeah. And you've walked into my next question, which is, if you step back, I mean, even in this conversation we're going right into the weeds, and it's hard to see the forest for the trees sometimes and how these things all interplay with each other. When you were approaching this from a policymaking perspective and dealing with your stakeholders and government, what kind of framework do you use to, I guess, make sense of the entire landscape that these companies cover?

Damian Collins: I think we step back to look at the interest of society in this problem. So, if we think, "Is it a problem that hate speech proliferates online and that social media is driving people towards hate speech and extremist content? Is that a problem?" I think a lot of people say, "Yeah, that is a problem." And I think the anti-vax movement in a public health emergency has helped people see that even more, about the problems of harmful content. So you step back and say, "Okay, this is a new problem in society. What do we do about it?" Well, actually, unlike any other form of media, we don't really have many levers to pull, because we've allowed a situation to develop where the tech companies are really independent and not really legally responsible or liable for a lot of what happens on their platforms. So you say, well okay, we want to solve this problem, and we're not satisfied that the companies will do it themselves, so we have to find a different solution, which could be some form of regulation. Or we feel that the ad market is being dominated by two big tech companies with no transparency over pricing and lots of things they can do to manipulate the market to suit their own interests. Well, in most cases, we would say that's massively to the detriment of lots of other companies that want to innovate in that space or businesses that want to advertise. And therefore, if the companies can't effectively self-police themselves, then you look at some form of market intervention. And so I think you look at a series of issues that have arisen. And some of them, you might say, are maybe not necessarily the fault of the tech companies. These problems have arisen, but they're a consequence of their success. And you say, "Well, if we did nothing, what would happen?"
And therefore, if that means that hate speech and radical content proliferates online with no editing, no intervention, then that's probably going to be bad for democracy. If we allow one form of business to be lightly taxed and very successful, and that drives out other forms of businesses, many of which have a big impact on how society looks, how our communities look, that could be a bad outcome. So I think what policymakers have to do is not necessarily say that tech is a problem; it's to say there are problems in society, and the model of some of these tech businesses may be making those problems worse. And therefore, what as legislators do we do to solve those problems?

Taylor Owen: Part of the challenge of course here is that this digital space just encompasses so much of our lives, our economies, our societies, that anything you do in one domain… any policies you put in place necessarily have trade-offs on the others. I mean, just take the free speech debate: there’s a real tension in this governance agenda between protecting an individual right to speak on these platforms and protecting individuals’ rights not to be harmed by that speech. And I feel up until now, we’ve really privileged one side of that debate in this kind of unregulated environment where everybody is free to say almost anything. But I’m wondering how you think of these trade-offs. It must be a very complex policy agenda to step into as a leader, having to navigate not just what the best policy is for one issue, but how they are all going to fit together in a way that betters our democracy?

Damian Collins: Yeah, I mean, let's take free speech as a good example of that. I think the tech companies hide behind an absolute version of free speech, which is that they don't really have a right or responsibility to intervene at all, but I think that we have to look at it and say, there's a difference between freedom of speech and freedom of reach. Someone may have a right to express an opinion. They don't necessarily have the right to have that opinion amplified. There's no constitutional right to have your voice amplified by the Facebook algorithm. And therefore, I think that's where the judgment of the companies can be called into question. So actually, with many of the problems here, it's not just that it's your business model, it's the tools that you've created, which are being exploited by people to make sure this speech proliferates. And that's the problem we're trying to solve. We're not trying to regulate everyone's speech. We're trying to say that when speech scales and reaches an audience because of systems you've created, then we've got a right to question whether that's responsible or not. Technology has developed so fast, but I think we've not yet kept pace. Looking at the UK, our most recent communications act was in 2003. Our electoral law has not been updated to recognize that the internet is in any way an important tool for communicating in elections. And I think that's where the failure is. It's not about saying we're going to appoint a Facebook censor in government who will tell Facebook what to do. It's about saying the companies have a responsibility, and it should be an independent body that determines whether they're meeting those responsibilities and obligations or not, and that also seeks to define what we think hate speech is. We shouldn't just be leaving this, I don't think, to Mark Zuckerberg to determine what's responsible and reasonable, and what's not.

Taylor Owen: It’s been interesting to see how non-partisan this conversation has become. It actually seems like there’s a fair amount of consensus on issues like disinformation, the spread of health misinformation, even foreign interference in elections. Now, one example of this kind of cross-partisan consensus that seems to be emerging has been the International Grand Committee, which you were a part of founding. I’ve been really struck, watching it over the past couple of years, that it seems to be a place where politicians from across the political spectrum, who all believe there’s a problem here, are getting together to talk about how to fix that problem. Again, across partisan lines. Can you talk a bit about how this was founded, what your role in that was, and what you think the opportunity is here for this kind of international collaboration?

Damian Collins: When the Select Committee I chaired, the DCMS Select Committee, Digital, Culture, Media and Sport, for the listeners, started our inquiry on disinformation, we held the first ever hearing of a British parliamentary Select Committee to be broadcast live outside the UK, which we did at George Washington University. And we asked Facebook, in particular, about Cambridge Analytica. This was based on the article that appeared in The Guardian in 2015 about whether Cambridge Analytica got hold of Facebook user data, and they denied that. And then about three or four weeks later, the scandal broke, the Chris Wylie story, the big Cambridge Analytica scandal. What we then noticed, to our interest, was that Facebook was giving evidence in the parliament in Singapore on an inquiry they were doing about how to legislate to deal with fake news and disinformation. And they questioned Facebook based on the testimony they'd given to us. And then, as the Cambridge Analytica affair developed, we became very interested in a company called AggregateIQ that is based in Canada, but had worked on the Brexit referendum and had worked with Cambridge Analytica. And we were interested in asking questions about data, how they'd worked with Cambridge Analytica, and what they'd done in the context of the referendum. That's where our cooperation with the Canadian committee started, where they were asking AIQ questions about an investigation that was taking place in the UK by the UK Information Commissioner. I heard the answers that were given. I knew that what they said wasn't true. And I was able to text Nate Erskine-Smith on the Canadian committee, who was asking the questions, a message to say, as far as the UK Information Commissioner's concerned, that's not true, and here's the reasons why.
So we could see that when you've got committees that are actively investigating the same issues, it is very helpful to compare notes, particularly as the tech companies largely try and give you the brush-off in the hope you don't really understand the answers to the questions that you're asking. So therefore, the more you can improve your knowledge based on a shared understanding of the way different parliaments are approaching this problem, the more powerful that will be. And that was really how it got going. And I think what is really important when you're looking at the big tech companies is that, whilst the local situations may be different, there are many common threads and themes, and to be able to cooperate on your investigations and benefit from the work other people have done is, I think, incredibly helpful.

Taylor Owen: You mentioned Singapore's participation, and this brings up a real challenge in this governance agenda, which is countries of varying degrees of democracy, and arguably even illiberalism, in some cases, are all using similar language to govern speech on the internet in different ways, right? With different core values. Do you see this as an alignment of democracies and democratic governance, and is that needed to push back against the competing tech infrastructure, which is the Chinese one, which is fundamentally illiberal, by design? How do you view that democratic spectrum that seems to exist amongst countries that are speaking to each other and collaborating in this space?

Damian Collins: I think it's really important that we have a model where citizens have got rights. And therefore, if it's on speech, if it's determining what hate speech is, or what speech should be acted against or downgraded in the algorithms of the companies, I don't think that should just be the judgment of the companies, nor do I think it should be the job of a government minister. I said to the government of Singapore that I thought the approach they'd taken on disinformation was based on trying to have a mechanism for deciding what is disinformation or not, and therefore creating a legal liability to act. In France, it's a judge that makes that decision. In Singapore, it's the government minister, and I said I didn't think that would work for us; in the UK, we would probably say it should be an independent regulatory body. It wouldn't be down to a government minister to make that decision, and I think it's really important that it's not politicians making these decisions. It's people who are separate and independent, who can form their own judgment. What we need to do is create a system where citizens have rights over their data and rights online, in the way that they do in democracies in other aspects of their lives too. So, I think we do need to stand up and fight for that model. And certainly, as you rightly say, on disinformation, whilst it was very interesting to discuss areas of common interest with the parliamentarians in Singapore, certainly in terms of the solutions, I think they are quite different.

David Skok: That's in essence, what Mark Zuckerberg is going to argue in his testimony tomorrow, right? That Facebook is a patriotic company and it's Facebook versus China. Is he wrong?

Damian Collins: What I think is wrong is the narrative that comes out of Facebook sometimes, particularly from Nick Clegg, which is that, unless you let us do what we want, China's going to take over the internet, and we've got to be big and strong to stand up to them, and therefore don't question us, because we're more on your side than China is. I just don't think that's right; it's choosing two extreme positions and trying to tell you you're in the middle of it. And also, I think Facebook behaves very differently in different parts of the world. We've seen, I think, in countries like the Philippines, you'd say, well, Facebook basically is the internet. And so that's hardly a model of a pluralistic society. There are terrible problems with hate speech and propaganda being spread there. I think we should hold them to account for the way they operate around the world, and some quite shameful incidents, particularly Facebook being used to organize atrocities in Myanmar. I think they should be held to account for things like that, but we should also hold them to account for the decisions they take in our countries too. The decisions they take over regulating hate speech, on political advertising, on promoting extremist content, are based on their policies and their decisions, and they should be held to account for them. Whatever your views on TikTok or the Chinese state, that doesn't mean to say you can't have a view on Facebook as well.

Taylor Owen: One of the things that seems to come up from the International Grand Committee conversations, and some of the broader global governance conversations in this space, is that there’s really a need for countries who don’t have the market power of the United States, or even the EU, to band together in their regulatory approaches. Even just take the issue of antitrust that’s on the agenda this week: I thought it was really interesting that the UK and Australia are collaborating in their antitrust investigation of Facebook’s purchase of GIPHY. Do you think there is a real opportunity here for small countries to band together and collaborate in order to gain the market and regulatory power needed to push back against these global companies?

Damian Collins: Well, I mean, for sure. I think you look at the resources of independent… of the regulators in these countries as well, and they are limited. For the UK Information Commissioner to take on Facebook is a massive undertaking, given it has lots of other jobs to do as well. So I think regulatory cooperation, both internationally and within countries, will be really important. The Competition and Markets Authority needs to cooperate with Ofcom, the media regulator, and the Information Commissioner if they're looking at these issues, because it crosses over too much, and I think there's only a certain number of people that have the expertise to do it. So that is important. Something else which should come out of this: so often the big tech companies say we need a global solution. Now, often people say that when they want to delay any solution, because they know the chances of arriving at a global solution where we get the US, China, and the European Union all in the same place are very unlikely to succeed. But I think we can break that down and say, in the Western world, can we try and come to some sense of common standards? I think we've reached a point now where we say we understand what the disruptive influence of technology is in many aspects of life. There are lots of positives in that, and some issues we need to address. It would be really helpful if we could come to a common approach on both what we think the responsibilities and liabilities of the technology companies should be, and what oversight there should be on whether they meet them or not. That would be enormously helpful if we could do it, or at least talk about how we would try. But at the moment, it looks like we've really got a conflict between an EU approach, which would be more interventionist, and a US approach, which for the moment really gives very limited liabilities, if any, for the companies at all.

David Skok: You know, Mr. Collins, you mentioned that you started this process a few years ago, and we weren't in a global pandemic at that time. During this pandemic, in some ways, big tech’s power has only increased. If you look at even the stock market, the stock market is being supported or held together by these companies. If you look at here in North America, the strength of Amazon has really allowed people to go on with their lives. Is the public with you, or is the public quite happy with how big tech operates? And as a politician who obviously has to think about that a lot, how do you reconcile or wrestle with the public interest, more broadly, and public opinion, more narrowly?

Damian Collins: I don't think the public see it as an all-or-nothing question, which I think is sometimes how Nick Clegg would like to portray it. They think online retail is enormously beneficial; the fact that you can get stuff delivered fast, to your home, at a time when people couldn't go out shopping, is enormously beneficial. But I think people also care about the decline of the high street and the decline of retail shopping. Now, part of that's down to consumers' own behaviour. But if we say, "Well, actually, we've got a very old-fashioned system of business taxation, which makes it very tax efficient to set up an online business without a shop, and very tax inefficient to have a shop which employs people in your community, has a physical presence, and makes the place look nice, should we do something about that? Or are we happy for this trend to continue as it is?" It's not the fault of Amazon that that's happened; it's partly our fault, because we've not updated our own policies to try and give some of the smaller businesses a fair go. But nevertheless, is that something that we should care about? If the consequence of the massive shift of ad revenue online is the collapse of the news industry, at a localized level, is that something that we should care about? It's not necessarily the fault of Google or Facebook that that's happened. It's not what they set out to achieve necessarily, but nevertheless it has happened. So what do you do to try and rebalance that interest? Social media does a fantastic job in connecting friends, families, communities, and so on, but it's also exploited to do bad things and spread hate as well. So I think that these are the issues. What the public perceive is that they don't have, any more than I do, a sense that technology is a problem we've got to solve.
I think what they see is that there are certain societal problems that are occurring, some of them within the tech space, and they want to see a remedy to those individual problems. And that's, I think, what we're here to do.

Taylor Owen: What's at stake if we don't remedy those problems?

Damian Collins: Well, I think we lose control, in a way that we've never really accepted that sort of loss of control before. If we had an information world that was largely controlled by the algorithms of a couple of social media companies, which can easily be gamed by outside actors, people would find it very difficult to have access to information that's accurate. If we live in a world where no one really knows what to believe and doesn't trust any source of information, we have to say that that's massively harmful to democracy.

David Skok: Damian Collins, thanks very much for your time.

Damian Collins: Thank you.

Taylor Owen: Big Tech is presented by the Centre for International Governance Innovation and The Logic and produced by Antica Productions.

David Skok: Make sure you subscribe to Big Tech on Apple Podcasts, Spotify, or wherever you get your podcasts. We release new episodes on Thursdays every other week.
