Taiwan’s response to controlling the spread of COVID-19 is one of the most effective in the world. In a virtual interview for the Centre for International Governance Innovation’s (CIGI’s) Global Platform Governance Network, Chris Beall, policy lead, platform governance, CIGI, speaks with Taiwan’s Digital Minister Audrey Tang about the key factors that helped Taiwan manage the spread of the virus so well. Minister Tang highlights several factors, including civic technology, government transparency and trustworthiness, deliberative democracy reliant on community involvement, and shiba inu humour. They also examine how Taiwan’s approach to COVID-19 can be applied to managing the spread of dis- and misinformation more broadly.
Transcript has been lightly edited for brevity and clarity.
Chris Beall: We're now more than a year since the start of the pandemic. And as we've experienced, a lot of countries have recently gone back into lockdown or really in Canada, for a lot of businesses, we've really never left lockdown. We're looking at almost a year in lockdown. You know, we've had some good news from the WHO. It looks like transmission rates are on a downward trend, but, globally, the picture still looks very bleak. Death tolls and transmission rates are still enormously high, really, except in Taiwan. And I'm really interested to hear your perspective as you look back over the past year, what the key factors were that enabled you in Taiwan to manage the spread of the virus so well. In addition to that, maybe as you're thinking about that, not simply initially, which I think some countries sort of managed the first wave well, but in fact that you've kept that going. You've kept that momentum over the course of the year.
Audrey Tang: I think the most important reason is that in Taiwan, we've all had a collective societal inoculation. People above 30 years old remember how bad SARS was in 2003 in Taiwan, and right after SARS, while the memory was still fresh in society, we institutionalized those memories into the Central Epidemic Command Center, the design of the Communicable Disease Control Act, and the institutions that make sure that whenever SARS 2.0 starts — which is how we refer to COVID-19 at this time — we will be able to play the SARS playbook without, for example, declaring a state of emergency or imposing lockdowns, because we understand how bad it could be from SARS 1.0. So people collectively would be willing to do more, like wearing a mask and washing their hands and so on, without the state having to do everything in a top-down fashion, which, frankly speaking, does not last very long without people understanding the why of it. The result of the CECC — the Central Epidemic Command Center — is that anything related to the epidemiology, whenever something gets discovered by the scientists, like asymptomatic transmission and things like that, gets broadcast to all corners of society very quickly, very easily. There was this very helpful spokesdog, a shiba inu, explaining not only the science behind it but also the practical tips, like: if you're outdoors, you're supposed to keep two shiba inus away from one another or wear a mask — this is called physical distancing. Or, if you're indoors, then three shiba inus away, and so on. So these messages go viral in the sense that people voluntarily share them because they're very fun to begin with. So, by making sure that there's a positive mindset, a positive engagement, we can even make jokes about it. For example, there's a classic shiba inu telling you to wear a mask to protect your own face against your own unwashed hands. This is hilarious, because the shiba inu is really very cute putting its feet to its mouth.
So people remember that and share that because this message ultimately appeals to rational self-interest. Instead of saying, protect the elderly, respect others, respect the medical workers, which are all fine, these types of messages don't tend to go viral. Whereas, protecting yourself from your own unwashed hands tends to.
Chris Beall: I really liked that element of humor brought into it. I'm curious, though, if you don't mind me pushing just a little bit on it. I do recognize how much you learned from the SARS experience, but what I found interesting is that in a lot of other countries that have also faced those kinds of outbreaks, it doesn't feel like their reactions are the same. It sort of feels like there's something that Taiwan has done differently. I think that bringing in that humor element is super important. Something I've read that you've talked about is that instant transmission, right — that something comes and you respond immediately, that it doesn't need to go through layers of bureaucracy or layers of thinking before it comes out. And I guess that really comes down to an element of trust, right? The trust that you have with the citizens, and the trust that they have with government. And when you build that mutual trust, it can enable you to carry that forward. I'm really curious about this question of trust and how you feel that Taiwan has created, or set the stage for, that level of trust in government. What we've seen in our network, in talking to a number of different countries, is that trust is an ephemeral thing, and it can be very difficult for governments to rebuild once it's lost. And it feels like, in short order, Taiwan has really turned around its relationship with people. I don't know if you have any comments or thoughts about that.
Audrey Tang: Yeah. I tend to think of trust in terms of trustworthiness. And one part of trustworthiness is reciprocal. So before the citizens can trust the government, the government must first trust the citizens. And if the government trusts its citizens, it means that, for example, we're willing to publish the real-time mask availability as an open API — that's real-time open data. Instead of having to review the statistics and publish every quarter or every week, we publish every 30 seconds. For each of the roughly 6,000 pharmacies, the people queuing in line, swiping their national health cards, can see the real-time availability on their phones go down — by two at a time a year ago, and nowadays by 10 every time. Then the people queuing in line, if they detect any anomaly, can just call the toll-free number, 1922, and very quickly point out the problem with the system. And so, because of that participatory accountability, anyone who thinks that the government is not trustworthy immediately becomes a co-creator by calling 1922 and pointing out something that we did wrong. And instead of defending existing policy, our minister Dr. Chen Shih-Chung, the commander of the CECC, usually just says, "teach us." So it's an invitation to co-creation. A young boy called 1922 last April, saying, "All my classmates have this navy blue medical-grade mask, but all I get is pink ones. I don't want to wear pink to school. I'm a boy." Then, on the very next day, in the daily 2:00 PM livestream press conference, all of the officers, regardless of gender, wore pink medical masks, and the commander even said that the Pink Panther was his childhood idol or something. So the boy became the most hip boy in the class, for he was the only one in the class with the colour that the heroes, and the heroes' heroes, wear.
And so this kind of immediate response is not about blind trust in government authority, but about the government trusting its citizens and amplifying the social innovations, the co-creations, and so on, in a very rapid iteration cycle. And I would argue that builds trustworthiness over time.
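To make the mechanics of that real-time open data feed concrete, here is a minimal client-side sketch. The record fields and the threshold are invented for illustration; they are not the actual schema of Taiwan's mask-availability API.

```python
# Hypothetical sketch of consuming a real-time mask-availability feed.
# Field names ("id", "name", "adult_masks") are illustrative only, not
# the actual schema published by Taiwan's open API.

def low_stock(records, threshold=50):
    """Return names of pharmacies whose adult-mask stock is below threshold."""
    return [r["name"] for r in records if r["adult_masks"] < threshold]

# A client would re-fetch records like these every 30 seconds and re-run
# the check; here we use a static sample.
sample = [
    {"id": "A1", "name": "Central Pharmacy", "adult_masks": 120},
    {"id": "B2", "name": "Harbor Pharmacy", "adult_masks": 30},
]
print(low_stock(sample))  # ['Harbor Pharmacy']
```

Because the feed is plain open data rather than a reviewed report, third-party maps and anomaly checkers like this can be built by anyone, which is the participatory-accountability point Minister Tang makes.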
Chris Beall: A really striking feature, when reading about what you've done, is this co-creation element. And I really like that it turns that criticism on its head, right — criticism is no longer a negative; criticism becomes a positive, in terms of contributing to and helping society move forward. I'm wondering if you wouldn't mind talking a little bit, just for people who aren't aware, about vTaiwan, that platform you've built. Because it feels, in a way, that in addition to your preparation because of SARS, the preparations you'd done with the hacker community — being prepared for civic engagement — contributed equally to your ability to respond to the virus.
Audrey Tang: Certainly. So vTaiwan is a project of the g0v, or gov-zero, community in Taiwan, which is a civic technologist community that contributes to participatory democracy online. And the idea of vTaiwan is that instead of having just a handful of representatives talking about emergent phenomena — like Uber, or crowdfunding, or teleworking — the stakeholders in these emergent phenomena can actually represent themselves rather than having someone represent them. And using AI — that's assistive intelligence — we can make sure that people's common feelings, common values, are given the agenda-setting power, instead of just the polarized, divisive, zero-sum, toxic behaviours that we sometimes see on private infrastructure — so-called social, but actually antisocial, media. So this pro-social media, if you will, for deliberative democracy is really helpful in framing the conversation around, for example, the sharing economy. In 2015, instead of debating endlessly about whether carpooling is the sharing economy but timesharing is not, what the sharing economy really is, or whether it should be called the platform or gig economy instead — which leads us practically nowhere, right? — we instead talked about what is considered the norm when people drive around random strangers and charge them for it. And people understood that, hey, we can agree on insurance, registration, and making sure that the road is fairly used so that people do not undercut each other in unfair competition, and things like that. These are the things that people commonly care about, regardless of whether they're a taxi driver or an Uber driver. So nowadays, Uber is a Taiwanese taxi company — actually, the Q Taxi — but we also revamped the taxi laws so that these so-called multi-purpose taxis can use a software meter, and they don't have to paint their cars yellow and things like that. So everyone wins, and we have many co-ops and companies entering this multi-purpose taxi market.
And so this tells us that for each emergent phenomenon, we can use AI and online conversation in a way that's actually pro-democracy, as long as we see democracy as a type of technology that we're willing to change and hack ourselves.
Chris Beall: I have to say, I find both the example and your retelling of it really inspiring. It's funny — we had a similar set of conversations here in Canada around Uber and taxis, but somehow, in those conversations, we didn't create that space for people to actually be able to share their feelings and to understand every side. And that open space just feels like exactly what the technology should be used for, right? It was a really interesting thing, reading about your work and following what's been happening in Taiwan. A lot of the time, in the work that I've been undertaking, both in the Canadian government and now with the Global Platform Governance Network, when governments and civil society get together and we talk about the role of big tech in society, the role that technology plays, there's a tendency for the discussion to end up focusing primarily on the challenges brought about by big tech. We recognize the positives — I mean, the very fact that we can have this conversation and include people from around the world is an incredible benefit as we try to solve problems communally, together. But it really struck me, hearing you speak, how positive you are about the role of society, and about the role that tech can play in society's challenges. I'm interested to hear your thoughts, because this almost hearkens back, in a way, to where we thought the internet was going to go — this kind of problem solving and community action. How do you think other governments can recapture that spirit of tech optimism that we've lost, and really, what's the key from your point of view?
Audrey Tang: I think the key is to think of them as digital public infrastructure. Much as we have town halls and parks and public libraries and other public infrastructure where civil society can gather and public deliberations take place, we need a sufficient amount of investment in digital public infrastructure. For example, the vTaiwan conversation took place on pol.is, which is free software, and, as you can see, it doesn't even have a reply button. You can agree or disagree with my idea — "passenger liability insurance is very important" — but you can't really attack me or troll me. And if you agree, you move closer towards me; if you disagree, you move farther away from me. And after three or four weeks of conversation — and this is maybe the most important picture that pol.is can show you — the divisive ideological statements number maybe exactly five, and there are far more consensus statements: feelings that people share regardless of their ideological positions. There's a surprisingly large amount of coherence in people's common feelings that is just not amplified when you're deliberating on a private infrastructure optimized for addiction, for short-term attention spans, and for selling advertisements. So that is akin to holding a public deliberation not in a public park but rather in a nightlife district — a bar with bouncers, selling addictive drinks. And I don't think the deliberative quality will be very good, right? So I think the whole idea of public infrastructure, which we grasp intuitively when it's physical infrastructure, we need to lift to digital infrastructure, and do what I call people-public-private partnerships, in which the people — that's the social sector — set the agenda and co-govern the infrastructure.
The public sector — the career public service — endorses the use of binding power only for those infrastructures that are publicly co-governed using free software and open governance. And finally, the economic sector participates in helping to scale this out and scale this up, but always under the norms that are already set by the people and the public sector.
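The pol.is mechanism described above can be illustrated with a toy sketch: participants only agree, disagree, or pass on statements, and the system surfaces statements with broad agreement rather than the divisive ones. The real pol.is clusters the full vote matrix using dimensionality reduction; the function below is a deliberate simplification, and the statements and votes are invented.

```python
# Toy illustration of the pol.is idea: no reply button, only votes,
# and broad-consensus statements are surfaced. Votes: +1 agree,
# -1 disagree, 0 pass. This is a simplification of the real system,
# which clusters the full participant-by-statement vote matrix.

def consensus_statements(votes, threshold=0.8):
    """votes: {statement: [vote, ...]}. Return statements where the
    share of agreement among non-pass votes meets the threshold."""
    out = []
    for statement, vs in votes.items():
        cast = [v for v in vs if v != 0]  # ignore passes
        if cast and sum(v == 1 for v in cast) / len(cast) >= threshold:
            out.append(statement)
    return out

votes = {
    "Passenger liability insurance is very important": [1, 1, 1, 1, 0],
    "Only yellow cars may operate as taxis": [1, -1, -1, 1, -1],
}
print(consensus_statements(votes))
# ['Passenger liability insurance is very important']
```

The design choice this illustrates is that the interface rewards statements most people can live with, rather than the replies and pile-ons that engagement-optimized feeds amplify.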
Chris Beall: It's interesting — I think you hit on such a key feature of this in your last statement. And I love that conception of a people-public-private partnership, putting the people first. That's really not where I've seen conversations around digital infrastructure as public infrastructure go. Very often, it ends up being governments saying, well, we should build another internet, or we should build another website and then force people to use it. And it feels like that's really a non-starter; that's not going to achieve what we want to achieve. But, as you say, having something designed by people and built on free infrastructure really seems quite exciting. And I also really like how you avoid the trolling. Because I think that's become so prevalent, and I think you're right that so much of it comes from using for-profit infrastructure that's more akin to a nightclub than a public park — infrastructure that is, in fact, driving divisiveness rather than driving consensus. I think that's a really quite exciting way to turn it around. You know, it ties in, actually, to something we talked about in one of our conversations in the fall in the network. We were talking with a few of our colleagues in government and in research about how we talk to platforms differently, how we engage differently, and where the challenges lie. And one of the things that one of our colleagues, Rebecca Trimble from George Washington University, recommended — she said, you need to have more data scientists in government. You need more people who understand how this works, and fewer people who can be hoodwinked by the digital platforms and taken into their way of looking at the world — and instead have people who can actually explain, this is how things work, and who can talk to the data scientists and the engineers within the platforms.
And, you know, that's how you're going to find a different set of solutions. But when I listen to this idea of a people-public-private partnership, and from some of the things of yours that I've read, I wonder if you would say that we should almost flip that around: that rather than having the data scientists in government, what we actually need to do is open up more of government to data scientists. I mean, let's not try to recreate their part of things, but actually let's have the community experts, or the community, engage and do their part — and then have government do its part to amplify, or support, or populate with data.
Audrey Tang: Totally. In Taiwan, the open API directive says that whenever we collect any data that does not, of course, pertain to national secrets or privacy issues — anything that is clear of these issues — we need to publish it as soon as it's collected. This is actually quite radical, because around the world, under freedom of information acts, it's usually first by request, and always reviewed first by the career public servants before anything goes from the state to the people. But with this open API, there's no way a public servant can review and approve the publication of the medical mask availability data every 30 seconds — they wouldn't be able to do anything else, right? So we need instead to build a data pipeline that is very good on the security front, very good on the privacy-preserving front, and so on, and just keep the data pipeline running. And once you have that data pipeline, you get, for example — in Taiwan, we have a legislator who, before joining the parliament, was the VP of data analytics at Foxconn, so she knows something about data science — MP Kao Hung-An. And she analyzed, along with the OpenStreetMap community, the map of mask availability and concluded that even though it looks fair on a kind of satellite GPS map, it's actually unfair if you take into account the time spent on public transportation in rural places. So if people take, like, four hours to get to a pharmacy, and the pharmacy is closed, then even though it looks very close on the map, it's not actually very close. So the government's distribution algorithm is actually biased, while it looks fair initially. When she brought this up in a parliamentary interpellation, because it was evidence-based, the minister simply said, "Legislator, teach us," right? The minister doesn't have to defend any policy, because it's obviously a very good insight.
And the very next day, we started pre-ordering through 24-hour convenience stores. We reworked our supply-and-demand algorithm in conjunction with the OpenStreetMap community, and things like that. And so, yeah, exactly as you said: a data scientist, as a legislator or as a social-sector participant, has as much agenda-setting power as Minister Chen Shih-Chung. And the MP said the very next day that yesterday's interpellation becomes tomorrow's co-creation.
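The critique above — that an allocation can look fair by straight-line distance yet be unfair once public-transport travel time is considered — can be sketched in a few lines. The pharmacy names and numbers here are invented for illustration; this is not MP Kao's actual analysis, which was done against the real open data with the OpenStreetMap community.

```python
# Illustrative sketch: the "nearest" pharmacy depends on the metric.
# By straight-line distance the rural pharmacy looks closest; by
# transit time the city pharmacy is. All data here is invented.

def nearest(pharmacies, key):
    """Return the name of the pharmacy minimizing the given metric."""
    return min(pharmacies, key=key)["name"]

pharmacies = [
    {"name": "City Pharmacy", "km": 5.0, "transit_minutes": 15},
    {"name": "Mountain Pharmacy", "km": 2.0, "transit_minutes": 240},
]

print(nearest(pharmacies, lambda p: p["km"]))               # Mountain Pharmacy
print(nearest(pharmacies, lambda p: p["transit_minutes"]))  # City Pharmacy
```

Swapping the metric flips the answer, which is exactly why a distribution algorithm tuned on map distance can be biased against rural residents.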
Chris Beall: You know, in previous work, I've worked a little on open government issues and was involved with some of the open government work here in Canada and in other places. And that just sounds like a dream for some of my colleagues to hear. It's really interesting, because I think you're absolutely right that freedom of information by request is the norm. But even more, from having sat on that side of the desk in government, the tendency is to try to release as little as possible, or else to flood with so much information as to make it unusable, rather than to think, just as you said: how do we actually solve problems together? And then, how do we provide the particular information that's going to be needed? I think the solution of having it be an open API is exactly the way forward. And it does open up a question for me. One of the things we wanted to talk about is how to take the lessons you've learned through this experience and through your other experiences, and then apply them to both foreign state interference and disinformation — and, for some of our other network members, disinformation campaigns by extremists or others — and learn from these examples. One of the conversations that happens a lot, and I'm sure you've been privy to these kinds of sessions, is that some people argue that the antidote to disinformation is more and better information. But it doesn't sound like that's all that you've got in Taiwan — you've got the rapid response, the humor, some other pieces in there. I'm curious: what would you say are the key building blocks that governments need to put in place to help build that societal resilience, just as you talked about the community inoculation, against fake, manipulated, misleading information? Is there anything more you can add on that?
Audrey Tang: Certainly. So, one part is a clear distinction between disinformation, which is intentional, and misinformation, which may not be. While misinformation can be countered quite successfully using cute spokesdogs, humour over rumour, and things like that, disinformation — when it's intentional and backed by a lot of money, sometimes by foreign states — is not as easy to counter using humour. One case in point: in the 2018 local election, we actually saw a lot of money poured into, say, Facebook and other social media, with precision-targeted advertisements that were opaque and could not be attributed back to whoever gave the money. Compare that with campaign donations and expenses, which must be filed with the national auditing office and are always restricted to domestic donors only. So it's almost like, through those social media platforms, they found a way to bypass the democratic oversight of campaign donations and expenditures. Now, interestingly, the solution did not originally come from the government. It came, again, from the g0v community. There's a project that asked volunteers to go to the national auditing office, because at the time, the office was already publishing the expenditure and donation information, but only on paper. So they took out the paper, scanned it, and did OCR — optical character recognition — with computer vision: people would see, like a CAPTCHA, each single cell of these large spreadsheets, and compete with each other to quickly turn them into digital versions. This almost-civil-disobedience around past elections really put a lot of pressure on the national auditing office, because the office might say, "Hey, you're not absolutely sure this crowdsourced OCR version is the correct one," and civil society could simply say, "That's why you should publish it as open data." So finally, the legislators saw the light and asked the national auditing office to publish it as open data.
So investigative journalists could do the analysis, which they did for the first time in 2018. And, lo and behold, people saw that almost nobody had filed those social media advertisements as campaign donations or expenditures. So it was clearly a bypass. So we turned around and talked to Facebook and friends, saying, look, this is the social norm. It's not like we're passing a law or something against you, but people have already put a lot of pressure on the national auditing office, and now they're turning that attention to you. So it's like a trade negotiation, right? We could say to Facebook: this is not a state request; this is rather an analysis of the societal temperature. And if they did not publish their advertisement library — at least as the same kind of open data, in real time — when it pertains to political or social issues, for independent analysis and for calling out dark patterns in the next election, they might face social sanction. And this threat of social sanction was not initiated by the state, but rather by the people, right? And that's why Facebook, in 2019, made Taiwan, I think, the first jurisdiction where they published the entire advertisement library as open data — actually as an open API — so that by our presidential election, no such dark patterns happened on Facebook and so on. So that's a successful negotiation.
Chris Beall: Yeah. And, I mean, as you say, it's a very different experience than other countries have had in dealing with the social media companies. But it sounds like part of it, just as you said, is that when it's driven from the community, from that community-based conversation, it's less about governments coming in and saying, we're going to restrict what you do, we're going to drive you towards a certain end. It's more: let's engage, and let's come up with a common set of solutions. It was difficult, coming into this, to find areas where things had not gone well. Looking at not just the pandemic but a lot of these other challenges, it feels like things are on a really good track in Taiwan. But as we're looking at our centres of government and thinking about these things, one of the things we've realized is that governments tend to be afraid to take risks. There's a huge avoidance factor that comes in, and so there can be a real fear of trying new things. One of the challenges I've experienced in this space is that, at the same time as we're afraid to do new things, we learn the most when things don't go well. It's those failures that actually teach us, right? It's good to have those experiences. And so, justifiably, there's been a ton of talk about your success in the public response to COVID, and of course in these other areas as well. But are there areas where you think some of your experiments didn't work? I have a little bit of a sense that, in the way criticism just gets built into the solution, it's a constant iterative development — and so there are no failures in that sense, or rather, you're sort of constantly failing.
Audrey Tang: Definitely. So I'll go back to the mask map example, because internationally it is reported as a huge success, but it was actually a huge failure on the first day of launch. On February 6, 2020, when the mask availability map was first produced in the pharmacy version, a lot of pharmacists concurrently invented something new: the take-a-number system. This social innovation meant that instead of handing out the medical masks in return for customers swiping their national health cards, they took the health cards from the customers and asked them to come back. They processed the IC cards in the pharmacy during the lunch break and asked the customers to return, for example, at 7 or 8 PM, to collect their masks and their IC cards in exchange for the numbered tickets they had been given. So this pharmacists' social innovation was actually at odds with the mask availability map, because if you analyze it — if you're an ex-Foxconn data analytics person — you will see that a particular pharmacy didn't sell anything until noon, and then, at noon, in a very short time span, like 10 minutes, sold everything in its stock. So this kind of real-time API is actually misleading — not useful at all. And one of my nearby pharmacies even went to the lengths of putting very large banners, posters, on their front door, on the glass window, saying, "Don't trust the app," exclamation mark. And so that, of course, is a spectacular failure. But the way to recover from these failures is simply to say: "We didn't anticipate it. We apologize, and we're running a weekly iteration. We're using agile development." So by the next delivery — that's next Thursday; it's always next Thursday — anyone who comes up with a way to fix it, we will implement that fix.
And so a pharmacist said, okay, why don't you publish in your open API two different time slots instead of one: one for collecting the cards and one for collecting the masks. Why not, right? So the pharmacies that used this take-a-number system could just register, like, 7 to 9:00 AM and 7 to 9:00 PM, in two time slots. So we did that the very next week, but the nearby pharmacy still didn't take that poster down. And so I took a deep breath, walked in, and asked, "So, why?" And it turned out their number tickets ran out by, for example, 8:00 AM. So there was one whole hour where they were on the map, but the number was inaccurate. And so I asked, "Okay, so if you were the minister, what would you do?" That's my favourite question to ask people. They consulted their pharmacy group, and the very next day they told me that if they used the back-end system to record that they had sold more masks than their stock, then they got into negative availability, and that made them disappear from the map. So it's a hack — almost like a white-hat hacker thing, because we didn't check the sign of that field. But that's a workaround, and it enabled them to take down the banners. So I went back, talked to the national health insurance agency about this innovation, and they formalized it and invented a button that any pharmacy can just push to disappear from the map, like a cloaking device. So that took more than three weeks — more than three sprints. I think the most important thing here is that each failure can be turned into an opportunity for co-creation. But if we say we'll fix it and then actually fix it only a month or two later, people don't have that kind of patience. It really has to be a very short iteration cycle — like one week, or at most two weeks — for this kind of co-creation to work.
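The workaround and its later formalization can be sketched as a simple visibility rule on the map data: the negative-stock hack and the official "cloaking" button both amount to filtering an entry out of the map. The field names below are illustrative, not the actual API schema.

```python
# Sketch of the pharmacists' workaround and its formalization:
# entries with non-positive stock (the negative-availability hack)
# or an explicit "hidden" flag (the later official button) are
# dropped from the map. Field names are invented for illustration.

def visible_on_map(pharmacies):
    """Return names of pharmacies that should appear on the map."""
    return [p["name"] for p in pharmacies
            if p["stock"] > 0 and not p.get("hidden", False)]

pharmacies = [
    {"name": "Open Pharmacy", "stock": 80},
    {"name": "Sold-Out Pharmacy", "stock": -120},               # negative-stock hack
    {"name": "Cloaked Pharmacy", "stock": 40, "hidden": True},  # formal cloaking button
]
print(visible_on_map(pharmacies))  # ['Open Pharmacy']
```

The formal flag is the better design: it expresses the pharmacy's intent directly instead of overloading the stock field with an out-of-range value.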
Chris Beall: Wow. That's amazing. It really makes me think — earlier this year, we had a conversation about how to maintain horizontal efforts given traditional vertical accountability structures in government, right? So often in a government structure, you have an analyst reporting to a manager reporting to a director, it goes up to a minister and then down, and it can be very difficult to work across those vertical lines. And what you're describing is the perfect example of working across those areas: of bringing in the public, of thinking through the different roles, and, each time there's a mistake or a problem or a challenge, learning and fixing it. If you don't mind me asking, as the minister coming at this from a cross-cutting perspective, do you find this a challenge? How does one get past the challenge of those vertical accountabilities and responsibilities — of people saying, but I have to report to this person, I have to work on this issue?
Audrey Tang: Definitely. I think a diverse and inclusive culture is the most important thing. My team, the Public Digital Innovation Space, is around 20 people, more than half of them career public servants. We very intentionally allow only one secondment from each ministry. So the more than 12 ministries that have since sent people into my office understand that they cannot send two people at the same time; they have to rotate. And the reason is that we want a fresh perspective every time a new person comes to our team. Everybody can learn from them, because they are the only one with that particular perspective. And once you have such a cross-cutting, maximally horizontal, diverse team, then people are willing to share what they have learned with the community — with working out loud as a central ethos — because there's no subordination in any particular sense: everyone belongs to a different ministry, right? So no matter what their ranks are, people work as peers in my team. But, of course, they report to their own ministers. They take whatever they learned from their peers back to their ministries, and they themselves go back to their ministry after a year or two; then someone else rotates in. The trick here is that, by working out loud, this culture, even after they go back to their ministry, still permeates, almost by osmosis, into their ministry — showing how horizontal structures, instead of fighting vertical structures, actually reinforce them, like a bridge connecting the vertical pillars, reinforcing the idea of resilience. Whenever any particular pillar makes a mistake, that may actually open a very good learning opportunity for the other pillars — the same idea as biodiversity, by the way. And because of that, we have a collaboration-led culture that's willing to take risks, because what's risky for one member is actually not risky for other members. And that sort of HR policy, I think, is also very important.
Chris Beall: Thank you so much. It's a brilliant answer. And, in fact, it's exciting to see how you're hacking government. Rather than trying to deal with their vertical structures from your horizontal structure, it's more like encouraging them to see the benefit of the horizontal by passing out the seeds of your efforts into the biosphere of government work. This has been so wonderful of you, and I really appreciate all of your time. Do you have any final thoughts or questions you want to share with us as we wrap up? Certainly no need to — just, if there's anything else on your mind that you wanted to share, I'd love to hear it.
Audrey Tang: Yeah, certainly. Since the theme today seems to be turning mistakes into co-creation opportunities, I'll just conclude by quoting my favourite poet, singer-songwriter, and Canadian, Leonard Cohen, and the verse in "Anthem" that says: "Ring the bells that still can ring. Forget your perfect offering. There is a crack, a crack in everything, and that's how the light gets in." So live long and prosper.