On February 5, 2020, CIGI President Rohinton P. Medhora addressed global leaders at a workshop titled New Forms of Solidarity: Towards Fraternal Inclusion, Integration and Innovation. The event was hosted at the Pontifical Academy of Social Sciences in Vatican City, and explored the ethics and financial mechanisms that are necessary for building solidarity among institutions and countries. The following is a lightly edited version of his remarks.
I’ve been asked to reflect on big data and new technologies, and how they work in the context of this conference. I’ll offer a few context-setting points and then end with what we might do to improve the situation. I’d begin by saying that data is actually part of a continuum, and we’ve always known this: a continuum that goes from data to information to knowledge to wisdom. For this continuum to work well, two things are at play. One is the technology that links them. The second is values. Now, whether you call it values, ethics or morality, as Joe Stiglitz pointed out, technological change and how we transform ourselves is not exogenous. It is, in fact, endogenous, and is, or should be, driven by a value system.
There is something about new technology that perhaps breaks that chain, and there are certain characteristics in the digital economy that might be worth flagging. Digital firms tend to face high upfront risk and high fixed costs, but once you have it made, you really have it made, and marginal costs of production often tend to zero. This also gives an advantage to first movers, and it prioritizes strategic behaviour. It also means that a lot of the profits that are made are effectively monopoly rents. And so this is the environment — the economic logic — that is compelling firms to behave the way they do.
If you think about data and how it fits into this model, the fact is data comes with different imperatives. If I’m a national government and I want to think about how we deal with data strategy, it seems to me that there are at least five aspects that we should be balancing. Obviously, data has the potential to create immense amounts of wealth, and that’s not a bad thing, because we need that wealth to do social good. But at the same time, data has characteristics that make us value our privacy. They make us prioritize public security. As we’ve learned in the last few years, data is imperative if we want to preserve our open society and have a healthy democracy. Finally, in many ways, data infrastructure works the same way that physical infrastructure did generations ago. Just as with railroads and broadcasting and so on, data infrastructure is part of the nation-building and social fabric-building consensus.
It’s not surprising that we find that the world is effectively, unfortunately, balkanizing into three zones. You have the state-centric China zone, where, effectively, you cede data to the state. Its counterpart in some ways is the US zone, where it is big firms to whom you have ceded that sovereignty. And in Europe you have the GDPR [General Data Protection Regulation], where it is, in principle, person-centric. And there is a host of countries — in fact, the majority of the countries in the world — that lie outside these three zones. I think the question going forward is how we build a global data zone that balances all of these exigencies while recognizing that each country, at any given point in time, might want to balance them differently.
And so I’d begin with the main, the overarching point. Joe Stiglitz mentioned the Ten Commandments. More recently, we have the Universal Declaration of Human Rights [UDHR]. I would argue that we need something similar for new technologies, that we need a broad moral and pragmatic statement that will guide us all in how technology is created and how technology is used. Now, if I use the UDHR as the example, you might say, “Well, what’s the point? There are lots of countries who sign on and have no intention of abiding by its tenets.” I’d say we still need one because it is aspirational. Having aspirations is a good thing. Having a clear moral statement is a good thing. And people across the world look to these statements, even and especially in societies where they’re not being valued.
Second, this kind of statement actually guides national and subnational legislation, so it has a practical value as well. And third, as we’ve seen with the International Court of Justice, over time, this kind of broad statement can be used to name and shame and ultimately prosecute people who go against its ethos. So I think there’s a good case to be made for why we should have a broad global statement, à la UDHR, on new technologies or simply on how we use big data.
From that flow a lot more of what I’d call important but pedestrian issues that we’re tackling — perhaps not deeply enough and not fast enough — but let me mention four or five. I mentioned at the outset that huge amounts of wealth can be and are being created by new technologies. This wealth is awfully concentrated because the process of innovation is mostly driven by proprietary intellectual property. The logical response through the centuries has been public action. You tax wealth and use it for the public good. That imperative becomes even stronger going ahead than it has been in the past. The attempt to harmonize taxation, particularly of digital firms — to understand that monopoly rents have to be treated as such and treated differently — is something that we should pay increasing, and indeed multilateral, attention to.
Second, innovation tends to be concentrated in a few parts of the world and a few regions within those parts of the world. How innovation is diffused is something we haven’t paid enough attention to, and in fact, the only global framework we have on this, TRIPS [the Agreement on Trade-Related Aspects of Intellectual Property Rights], goes back at least two decades. As we think ahead to all the potential good that technology can do, we need to revisit that framework. We’ve done it in ad hoc ways during the HIV/AIDS crisis. We had compulsory licensing. We do have mechanisms, like advance market commitments. We have the grand challenges approach. But we do need to revisit the way innovation occurs and, especially, how it is diffused, because if you think of something like climate change, if there is a pathbreaking technology, it works best if everyone adopts it instantly and really well. And that goes against the tenets of the way technology is currently diffused. We need to revisit why technology is created and how it is diffused.
I’d also make a point, again in the spirit of saying that the use of technology is not exogenous. If I created a slightly better headache pill, it’s going to take me a good seven to 13 years to get it to market. If I create an algorithm that improves facial recognition — that might lead to a better cure for acne, or to something socially much better or much worse — it pretty much happens instantly. And so you have to ask yourself, if we use an FDA-type model in some parts of the health industry, why don’t we have it more broadly? Algorithmic accountability and algorithmic ethics should be part of the public policy framework and not seen as outside it. In the same spirit — because, as I said, the economics of the data-driven digital world drives predatory behaviour — foreign investment is something that countries around the world, including mine, Canada, are increasingly going to have to look at, because when you allow foreign investment in the digital context, you find that it is your data that is at play, it is your workforce that is at play…and there’s a range of issues in which the work is done here, but the IP and the wealth created are there. So we have to revisit foreign investment as well.
And the final set of points I’d make on this, because I was also asked to think more about multilateral and international cooperation, is that the global system has been sclerotic. On the other hand, it does eventually respond. We now have something called the Financial Stability Board, which is an attempt to understand what went wrong and not have the crisis that we had a decade ago happen again. We might want to think of an equivalent — and my colleagues at CIGI have talked about this — a Digital Stability Board: something that promotes best practice in the digital realm, that analyzes how data is used and how it’s combined, and that has the same set of principles that would make the digital sector as resilient and problem-free as we would like the financial sector to be.
And so I’ll end on that note: that these points I made are not outside the intellectual or ethical frame that even mainstream economics teaches, but as Rob Johnson, sitting next to me, said, “How it’s taught has kind of gone astray. How it’s propagated through the media has gone astray, and what we need now is the will and the ethos to bring it back together.” Thank you very much.