Geoffrey Cain On China’s Dystopian Surveillance State


China has created a technology-driven Orwellian surveillance state, where AI pre-crime systems and social credit scores can decide your fate.

S3E18 / July 22, 2021


Episode Description

In this episode of Big Tech, Taylor Owen speaks with Geoffrey Cain, author of The Perfect Police State, about the technologies deployed in China’s Xinjiang region to oppress its Uighur population. Through a network of surveillance systems, social credit scores, algorithm-driven pre-crime software and a society where people are now fearful of their neighbours, China has built a chillingly real Orwellian police state. In their conversation, Taylor and Geoffrey discuss how these technologies are used to identify, detain and brainwash the Uighur people in what may be the first genocide in history driven by big data and artificial intelligence (AI).

However, one of the most powerful aspects of this surveillance system has nothing to do with the advanced technology. “It’s a crude system that is designed to keep people on their toes, to, you know, turn them against each other. … If my good friend might be snitching on me, well, I’m going to snitch on him first, and hopefully he’ll be taken to a camp. And then my ranking with the government rises and maybe the AI and the computer systems won’t take me away,” Cain explains. With no transparency in the system — no one knows how it functions or decides who’s a threat, or what computer code might determine their fate — an entire population lives in constant uncertainty and fear.

Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.

Geoffrey Cain: This is not just a story about one minority in China. This is a story about humanity. This is a story about what happens when we put bad technologies in the hands of bad people who have bad intentions.

Taylor Owen: Hi, I'm Taylor Owen, and this is Big Tech.

The plight of the Uighurs is one of those stories that's never really left the news cycle, but still somehow hasn't fully captured the attention it warrants. Maybe this is because it's notoriously difficult to do investigative journalism in China, or maybe it's because Western business interests are so entrenched there, or maybe it's because, unlike past atrocities, images of violence aren't filling our screens. Whatever the case, we're certainly not paying enough attention to what the people of Xinjiang call 'the situation': the largest internment of an ethnic minority since the Holocaust, something the US State Department and a number of international human rights organizations have called a genocide. If you're looking for a primer on 'the situation', look no further than Geoffrey Cain. Geoffrey spent three years interviewing Uighur refugees, Chinese tech workers, and government officials. The resulting book, The Perfect Police State, is a window into an Orwellian dystopia. His depictions of the Chinese police state are eerily similar to descriptions of Nazi Germany, or Stalin's Soviet Union, with one obvious difference: this might be the first genocide in history that's been enabled by big data and by artificial intelligence. Like my previous episode with Hong Shen, this conversation should be seen as a component of the wider, complex political economy of Chinese technology. Hong made the argument that the Chinese firewall allowed Chinese tech giants to become the economic powerhouses they are today. Geoffrey focuses on the other side of these companies' growth: he shows how they helped build a sophisticated surveillance state capable of monitoring and shaping the behaviour of hundreds of millions of Chinese citizens. This model is now being exported to illiberal-leaning countries around the world, which means that the dystopian reality the Uighurs are living in is not just a human rights atrocity; it also presents real challenges to democracy itself. Here's Geoffrey Cain.

In Xinjiang, they talk about the surveillance state that's been built there as 'the situation.' How would you just very generally and broadly characterize this situation?

Geoffrey Cain: The situation arose out of what you could call China's war on terror. Throughout the 2000s and the early 2010s, there was a span of about 15 years when there was both growing inequality and discontent within the region of Xinjiang. It was this frontier of China where the Han Chinese, the dominant group, would be bused out with the encouragement of the government. They would develop the region, build high-rises, look for new jobs, find oil, build the railways; it really did have that gold rush sense to it, that this was going to be the future. But in the process, many local Uighur and Kazakh Muslim peoples were displaced from their historical homeland. They couldn't get jobs. There was discrimination. This two-class society emerged in which either you are the majority Han Chinese, or you are the minority Muslim Uighurs or another ethnic group there. I mean, this is not necessarily unique to China. This has echoes of what happened in the American West with Manifest Destiny, the removal of Native Americans from their lands, or South Africa with apartheid. But what makes Xinjiang unique and different from these examples is that China has innovated a 21st-century form of apartheid based on novel technologies, such as AI and facial recognition, big data gathering, surveillance cameras. So the Uighur people in Xinjiang are being monitored 24/7. All of this data is being scooped up on them, sent back to the police, sent back to the intelligence and party headquarters in the region. And there is an artificial intelligence system that processes all this data; it's called the IJOP, or the Integrated Joint Operations Platform. It is a system that, through a mixture of the data gathered on everyone, determines if they are at risk of committing a crime in the future. So this is pre-crime. If this AI system determines them to be a future threat, the police are sent in and a Uighur person will be taken away to a concentration camp for a future crime that she or he will supposedly commit. The latest estimates put the number of camps somewhere between maybe 300 and 800. There are at least 350 documented camps up in the mountains of the region; they're often repurposed high schools or government buildings, gymnasiums, that sort of thing. Every single day, from early morning to late at night, the Uighur people in these camps undergo a never-ending series of brainwashing and indoctrination rituals in which they're psychologically tortured, forced to deny their own realities. The goal of all this is erasing the identity of a people, just simply doing an identity wipe. They forget who they are, they deny their own history and heritage and culture. They're no longer Muslim, they're no longer Uighurs; they're simply going to take in the propaganda of the state and become minions of the state. That's the end goal of this. It is surprisingly effective. Almost all the Uighur refugees I interviewed who had either been in the camps themselves or had family members or friends in them remarked that the people who came out were like car crash patients who woke up with amnesia: they didn't know who they were or where they came from. It was just simply a blank slate.

Taylor Owen: I want to get into some of the details of that experience, but also how it's being operationalized. But you do write a bit about some of the rationale for this, so just to give that context: it seems like it was put in place after 9/11, perhaps opportunistically, thinking that there was a moment where counter-terrorism could be used as a justification for all sorts of things. And not just by China; other governments have done that too, obviously. But to what degree is this a counter-terrorism narrative? Is there any truth to that?

Geoffrey Cain: Yeah. It is a counter-terrorism narrative actually. I think that there is a great truth to that. Just to give some historical background: under the Communist Party in particular, the Chinese authorities always looked at the Uighurs and Xinjiang with suspicion. It was a potential breakaway state. It had its own unique culture, language and history that were separate from other parts of China. So the authorities were always concerned about separatism. But it wasn't really until 9/11, with the twin towers, that China started kicking up the rhetoric and the intense repression on the basis of counter-terrorism operations. Actually, the US government has some culpability in helping create the situation there, because at Guantanamo Bay there were 22 Uighur captives; they're called the Uighur 22. The US had captured them in Afghanistan and Pakistan. They were combatants to some extent, but there was really not much evidence that these individuals were setting up an actual jihad that they were going to wage against liberal democracy and its forces. But these people were held in Guantanamo with no charges brought against them, like all Guantanamo prisoners. This was a huge boon for what China was trying to do, because now it had the evidence, like, look, they're in Guantanamo-

Taylor Owen: They must be terrorists.

Geoffrey Cain: They must be terrorists, and see, America supports us too. We in China have this Uighur threat and we have to take action. This laid the groundwork for the justification and the rhetoric of what would later become 'the situation' in Xinjiang.

Taylor Owen: Yeah. Wow. I mean, you think of the long-term repercussions of that global war on terrorism framing, and man, this is a visceral one, isn't it? For most of the book you follow a single person, a woman named Maysem. Can you tell us about her? What was she like? How did she experience this dystopic future that you're talking about?

Geoffrey Cain: Yeah. Maysem was a young woman; when I met her, she was in her late 20s, living in Ankara, which is the capital of Turkey. She was a refugee. She had just arrived in Turkey when I met her. She was overcoming all kinds of psychological effects, depression, anxiety, that the Chinese state had inflicted on her before she left. Maysem was just brilliantly smart, a literary intellectual who had read vast amounts of books and poetry. Truly just a cosmopolitan woman of the world, who had these dreams as a teenager of becoming a diplomat to represent China. She came from an elite family, a deeply patriotic family. She had graduated near the top of her class, had attended a very good university in Beijing, the capital, and really had things set out for her. But due to the sheer extent of her intelligence and curiosity, the fact that she liked to read books, the fact that she was travelling (she had been studying for a master's degree in Turkey, lived in Turkey and returned to China every summer to be with her family), just those facts were enough, as the surveillance state was set up, for the AI system to locate her, to start watching her and to determine that she was someone who was not trustworthy, who might be harbouring terrorist thoughts and who needed to be acted against and taken first to indoctrination lessons.

Taylor Owen: So the AI identifies her. We'll talk about how they did that in a moment, but what started happening once she was flagged?

Geoffrey Cain: A government minder started showing up at her doorway every day, and sometimes twice a day. She would knock on the door, take a look around the home, ask her questions: "Where were you? What did you do today? Did you see anything suspicious?" And then there was this little barcode on the door, and the minder would scan it to show that she had done her daily inspection and could go. One day this minder showed up and told her that there was an order from the police station, that someone had either reported her or found suspicious activity related to what she was doing with her life. She didn't skip details, and handed her the instructions to install, in her living room, a government camera that would connect to the local police station. It was recording 24/7, watching them. This was devastating to Maysem, because it essentially meant that all her private conversations were monitored, and all the books that she was reading, she could no longer read them. She could no longer do what she loved until she went back to Turkey. So this camera was there for a little while, I believe maybe four or five weeks or so. And then finally the local police said, "You need to report here every few days; you're going to attend an indoctrination lesson." They call it civics class. She would have to go for interrogation. She would sit there in front of three police officers, who sat at this one desk in front and just drilled her with repetitive questions over and over again. Like, "Why do you go to Turkey? Do you know any terrorists in Turkey? Do you read the Quran? Are you Muslim? What is your favourite Quran verse?" And they would ask the same questions over and over in different forms, just to throw her off and to make her paranoid and nervous. And it's like, how do I answer? What if my answer changes? And then finally, after these gruelling interrogation classes, she was ordered to go to a series of concentration camps, two concentration camps actually. First to one that was a lower-level re-education centre, where she could leave at 6:00 PM every day. But then within a few hours, she was kicked up to a top-security compound, which is called a detention centre, and she was forced to stay there and to undergo psychological and physical torture to erase her identity.

Taylor Owen: What did she go through there?

Geoffrey Cain: On her first day she was required to get in a tiger chair; it's this chair that contorts your legs and arms. It's very uncomfortable to sit in for more than a few minutes. The guards just put her out in the courtyard and left her there until her skin burned in the sun. And then they would do things like take her out of the chair and make her stand still, say, on one leg, or lift up her arms, like this, and just stand still for 30 minutes to an hour. An officer would stand behind her with a baton. If she moved, got uncomfortable and stopped, or put her leg down so she could stand on two legs, the officer would just start beating her until she got back in place. And they would say, "Oh, if we hit you, that means you have to start over." So they'd start the clock over; you've got to do another hour, standing perfectly still in a weird position. I mean, this would go on and on until finally she could do it. That's the physical side, but there was also a lot of gaslighting, I guess what we would call in the West narcissistic gaslighting, and the denial of reality. This is one of the more bizarre tests that I've heard of, but they would put out a table for students. One side of the table would have a home, like a model of a home and a yard, and you're supposed to rearrange the car and the house and everything according to what it would actually look like. And then next to that would be another table that would have an AK-47 and a grenade and a pistol and a rocket launcher. They would tell you, you have to reorganize the rocket launcher and the assault rifle so it looks like it's in the correct arrangement on the table.

Taylor Owen: Little toy versions, right?

Geoffrey Cain: It's like, well, what is the correct arrangement of an assault rifle on a table, a toy assault rifle? The secret answer to the test is that if you even touch them and try to move them, that means you're comfortable around weapons and therefore you're a terrorist, and they would put you in solitary confinement and torture you. This is one example of the extreme psychological torture tactics that they use. For Maysem it was very much a riddle: we don't know where the line is drawn, what is permissible and what's not. Her story in the camp is going through this riddle and trying to figure out, where is she crossing the line and doing something unacceptable? There's never really a clear answer, and that's how they drain you. That's how they indoctrinate you.

Taylor Owen: How was she able to get through this and then ultimately escape China?

Geoffrey Cain: It turned out that there was a bureaucratic loophole. One of the things you've got to understand: I call the book The Perfect Police State, but the irony of this perfect police state is that it has a lot of flaws and loopholes and imperfections, bureaucracy, that ironically make it perfect. Because the system is imperfect, people are unsure of what to do. They don't know what's going to get them in trouble. They don't understand how even the AI system comes to its conclusions about people. But in Maysem's case, being the intelligent woman she is, she was able to locate some of these loopholes and exploit them, with the help of her mom, to get out. I mean, you've got to understand just how lucky she was with that timing. If she had stayed any longer, she probably would still be there. But it was really good luck, good timing, being intelligent and knowing the right people that got her out when she did. Just to finish off here: in late 2016, right after she had left, the government began ordering Uighurs to report to local police stations and surrender their passports. So it became almost impossible to get out. She's very lucky.

Taylor Owen: You mentioned the crudeness, almost, of the infrastructure, and it really struck me reading through this. This narrative of it being a perfect police state that can target with high specificity and accuracy actually seems fundamentally off. It isn't that, but it might be perfect for a genocide.

Geoffrey Cain: Exactly.

Taylor Owen: It's very good at doing what it does, but it is not targeted. I mean, if it were actually looking for terrorists, she would not have been included in this targeting. It shows maybe that the intent is different from the characterization.

Geoffrey Cain: Exactly. That's exactly what it is. It's a crude system that is designed to keep people on their toes, to turn them against each other. There's a divide and conquer here: if my good friend might be snitching on me, well, I'm going to snitch on him first, and hopefully he'll be taken to a camp. And then my ranking with the government rises, and maybe the AI, the computer systems, won't take me away. Maybe I'll be okay in the end. But that's rarely how it works, because even if you snitch on your friends, the AI is just going to find you guilty anyway. Even if you're a member of the party and you have good standing with the government, none of that matters anymore. Anyone can go for any reason. So it works.

Taylor Owen: It works at that objective, which makes me wonder, or be even more concerned, about our adoption of similar technologies for supposedly much more precise policing operations. If they're actually designed for much broader, cruder psychological warfare purposes, can they even be calibrated for the purposes for which we are using them, or claim to use them, now?

Geoffrey Cain: Yeah. Our law enforcement agencies in the West are deploying the same technologies. I mean, we're using facial recognition; we're using AI to predict who's going to commit a crime, which neighbourhoods crime might occur in. But the problem is that there's not much transparency around how these systems work. Police departments by their nature are not transparent, and we don't know if they're gathering good data. I mean, is the data even that worthwhile? Is this garbage in, garbage out? Is the software that good? The software is proprietary. We can't lift up the hood and check whether it's actually doing its job. There are lots and lots of questions over whether this technology is actually useful in the US.

Taylor Owen: I mean, I guess broadly you described three components of this perfect surveillance state. Can you run through what's needed to do this on the tech end?

Geoffrey Cain: Yes. This is something that, I guess, I didn't analyze or think of myself. This is actually something that a Uighur friend told me, who is still in Xinjiang, and I really hope that he's been okay since I last saw him in December 2017. There's no way to know. But he told me upfront what was going on. There were three stages that he observed. The first stage is that you need to find a way to, I guess you could say, divide and conquer your people. You need to first find a way to surveil them in a way that breaks down trust. That could mean putting out fake news, social media anger. Just look for cracks in the population and drive wedges into those cracks, so that people no longer want to work together, and they're angry and hostile. They're worried about what's happening. Then you want to find ways to exploit private companies that are worried about their profits: they're having hard times, they're in debt, they're investing a lot of money in future tech that might not come to fruition. Find ways to subsidize those companies or work with them or help them. And then on the flip side, if there are companies that are doing well, they're going to be complacent; they don't want to mess up the success they've built. They'll work with the authorities if the authorities come knocking and say, "Look, we want backdoors into your iPhones or your smartphones," or that sort of thing. So co-opting private companies, making them essentially arms of the state, is another popular tactic. And then the third stage is when the government creates what this Uighur person called the panopticon. The idea is that everyone knows that they're being watched, but they don't know when or how or why or with what methods. Since it's a panopticon, nobody knows for sure-

Taylor Owen: If it's working or not. I mean-

Geoffrey Cain: Yeah, if it's working or not. You never know. For example, there's this talk of the human behind the AI curtain. Like the Wizard of Oz: you open the curtain, and actually it doesn't work at all; there's just a human there running it all. It could be just that a police officer is sitting there and says, "Well, today I'm going to look at Mr. Wong and what he's doing." And he's just following your internet connection and looking at your messages, and that's that, and then you're going to jail.

Taylor Owen: That's classic Orwellian surveillance-

Geoffrey Cain: It is.

Taylor Owen: Nobody knows if they're being watched, if everybody thinks they are. In order to do this though, it seems like a number of technologies needed to come to development. There's a technological capacity that wouldn't have existed 10 years ago to do what's possible now. I mean, you describe it as having this data component that comes from the control of social platforms and all the communication that happens, and information that's shared on social platforms, as well as the data that can be captured through surveillance cameras and networks of surveillance cameras. You also needed technology to make sense of all those data, because this is obviously, if you're capturing all conversations of all people in a country of over a billion people, that's a lot of data. So you need the AI and in particular the facial recognition technology, audio recognition technology built on AI to make sense of it. And then you need this whole layer of interpreting it, that was this whole social credit system, this scoring system that actually judged people based on these data. What's really struck me about that characterization of this infrastructure, was that the state couldn't do it alone. They needed innovation. I was fascinated by your descriptions between Huawei, for example, and the state, or the emergence of WeChat as a network, and the relationship it had with the state. Can you describe a bit how these companies that are now massive global companies, often trying to go public on foreign exchanges even, were part of building this apparatus and actually benefited from building this surveillance apparatus?

Geoffrey Cain: Yeah. These companies started out on their own. I mean, they were startups, they were scrappy. A lot of these companies would just copy technologies from America or the European Union. There wasn't really much to look at 20 years ago when it came to Chinese technology. But it was with the support of the state and the party that China built a massive technological industry, and within that, in particular, a colossal and sophisticated software infrastructure that existed only within China, that was unique to China. That was really how China was able to set up this surveillance. I would pinpoint the beginning of this project to 2005, which is when China set up an umbrella system that it calls Skynet, or Tianwang in Chinese, which is the same name as the Terminator computer system that starts a nuclear war and kills everyone. The idea was to eventually bring together all these Chinese companies that were innovating in AI and smartphones and systems networks, Huawei for example. Also, finding incentives to bring in Chinese researchers who had graduated from places like Stanford and MIT, or had gone through Microsoft Research Asia, and to bring them together under a government-supervised umbrella, because the government could not do this by itself. It was the Chinese state that led efforts to give out venture capital and fund these companies. This was a long project, but I think the big moment the Chinese executives tell me about was in 2017, when the AI system AlphaGo, which was made by DeepMind, which is owned by Google, defeated the Chinese Go world champion Ke Jie. Once he lost, that was the moment when China's leaders realized that they were far behind, that they had a Sputnik moment and that they had to catch up to American technologies. A lot of Chinese tech executives told me that that was really the turning point, when the money started coming in, the military connections, the interest in what they were doing. That was when China passed legislation, too, that allowed all these software ecosystems to come together into one. WeChat would provide data on people's purchases and messages, Weibo had their searches, Megvii had their facial recognition, Huawei had their smartphones and the usage data tapped from them. These all came together, and these all helped create this surveillance state in Xinjiang, where everyone just had all their data scooped up into this AI system. That was what was necessary to create this.
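To make that aggregation concrete, here is a minimal sketch of the kind of multi-source data fusion Cain describes: separate feeds collapsed into a single per-person record that a downstream scoring system could act on. Every name in it (the PersonRecord structure, the feed labels, the merge_feeds helper) is a hypothetical illustration; the actual platform's design has never been made public.

```python
# A hypothetical sketch of the data fusion described above: separate feeds
# (chat messages, searches, camera sightings, phone usage) merged into one
# per-person record. All names are illustrative, not the real system's.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class PersonRecord:
    """Everything the system holds about one person, in one place."""
    person_id: str
    chat_messages: List[str] = field(default_factory=list)     # messaging-app feed
    searches: List[str] = field(default_factory=list)          # search-history feed
    camera_sightings: List[str] = field(default_factory=list)  # facial-recognition feed
    phone_events: List[str] = field(default_factory=list)      # handset-usage feed


def merge_feeds(person_id: str, feeds: Dict[str, List[str]]) -> PersonRecord:
    """Fuse whatever streams are available for a person into a single record."""
    return PersonRecord(
        person_id=person_id,
        chat_messages=feeds.get("chat", []),
        searches=feeds.get("search", []),
        camera_sightings=feeds.get("cameras", []),
        phone_events=feeds.get("phone", []),
    )


# Example: three unrelated data streams collapse into one profile.
record = merge_feeds("person-001", {
    "chat": ["see you at the market"],
    "cameras": ["2017-06-01 08:14 gate 3"],
    "phone": ["app installed: messaging"],
})
print(record.person_id, len(record.chat_messages), len(record.camera_sightings))
```

The point of the sketch is structural: once every feed keys on the same person identifier, correlating them is trivial, which is why the unification Cain describes mattered more than any single technology.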

Taylor Owen: Just to push on that piece. I mean, I personally find this a very difficult topic to learn about, because, one, there are all sorts of barriers to understanding the system, but there also isn't a lot of research done in an open way about how the system works, and little journalism on it. One of the real ambiguities you consistently hear about is, what happens to all this data? Is it the case that all data collected by all Chinese tech companies ends up somewhere on a Chinese server, or flowing through a Chinese state server of some sort? Is it that visceral, or is it more just coordination between these entities at various moments?

Geoffrey Cain: I think it's more coordination between these various entities. I did interview a Uighur technology worker who was directly involved in setting up these exact surveillance apparatuses in Xinjiang until 2015. He talked about how the requirement for making this work was building an algorithm, an AI system, that could pull all these strands together: look at the facial recognition, look at the voice recognition, look at the WeChat messages, gather as much data as possible, scoop it up and find those correlations to predict the pre-crimes. But the thing is, he talked about how simplistic and terrible the system was at the beginning. He said that for a long time there wasn't really a government effort to use every company together; they would target one app, for example. One app that they targeted was WeChat. Starting in 2013, they had obtained every single message sent between anyone in the Xinjiang region, and it would be stored on government databases for two years at that point. And then they experimented with new AI developments, and they would tell the AI to look for words like bomb and gun and Quran, religion, Muslim. And then the AI would just determine that someone was a terrorist threat based on some words that they had used. I think that for a long time the system was more slipshod and helter-skelter than was anticipated. I didn't see much evidence from him that the unifying umbrella really came until later, with the AlphaGo victory and all that. It was really just arbitrary: "Today we're going to do this, tomorrow we're going to do that. Let's just patch it together and see what works."
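For a sense of just how crude the keyword flagging Cain's source describes would be, here is a minimal sketch under stated assumptions: the word list, the weights and the threshold are all invented for illustration, since the actual system has never been published.

```python
# A minimal, hypothetical sketch of the crude keyword flagging described
# above. The word list, weights and threshold are invented for illustration;
# nothing here reflects the real (unpublished) system.

FLAGGED_TERMS = {"bomb": 5, "gun": 5, "quran": 2, "religion": 1, "muslim": 1}
FLAG_THRESHOLD = 5  # arbitrary cut-off, for illustration only


def score_message(text: str) -> int:
    """Sum the weights of every flagged term that appears in one message."""
    return sum(FLAGGED_TERMS.get(word.strip(".,?!"), 0)
               for word in text.lower().split())


def is_flagged(messages: list) -> bool:
    """Mark a person as a 'threat' once their total score crosses the threshold."""
    return sum(score_message(m) for m in messages) >= FLAG_THRESHOLD


# The core flaw: an innocuous conversation trips the same wires as a threat.
chat = ["did you see the Quran verse grandma sent?",
        "the news said a gun was found downtown"]
print(is_flagged(chat))  # True -- a false positive, by design
```

Nothing in a matcher like this understands context; any mention of religion or weapons counts against a person, which is exactly the arbitrariness the interview describes.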

Taylor Owen: In this patchwork, it's not just Chinese technologies being used; American and Western technologies are being deployed in a number of ways as well. It seems to me you describe elements of complicity here. One is the building of the technologies themselves, some of which Western companies have built and which are being used in this way in China. There are companies that acquiesced to Chinese state demands in order to get access to the market in China; in some way that breeds a complicity. But there's also where all this technology is built: it's often built on the backs of forced labour coming from Uighur populations moved out into forced labour camps. I mean, how would you characterize the culpability of Western technology companies in all of this?

Geoffrey Cain: This is a problem of globalization. This is the underbelly of the narrative that I think we were all sold: that globalization is going to open societies and open governments, and the new middle class is going to demand liberal reforms. All these broad statements have been discredited for many years now; they're essentially the work of people who stand to profit from selling these narratives. It's fundamentally a problem that many American companies, Apple included, various garment companies, some German auto companies, were caught with their pants down in Xinjiang. They enthusiastically went into China thinking that this was going to be the future, the economic centre of gravity. But they did it fully knowing that there was a colossal risk that they were going to be directly involved in the trading and supplying and purchase of goods built with slave labour, or built under severely repressive conditions and human rights abuses. What happened is that the world and China became tethered together, and now they're all joined at the hip economically. And then, after that happened, China set up many of these Uighur slave labour programs, in which people would go to camps and be discharged after a little while. The government would say, "All right, we're going to teach you a vocation," and put them in what they call a vocational training centre: you can spend a few months to a few years just doing free slave labour. One of the terrible realities of slavery is that it's extremely lucrative, because you have an entire unpaid workforce; your profits are going to go up. It's certainly not sustainable. And now all these companies from America and elsewhere that had opened their manufacturing there have been caught with their pants down, and they're saying, "Oh shoot." I've actually had conversations with many of them where their public relations departments will say that they didn't realize that there was slave labour, or they didn't know that that could happen. They essentially say, "This is not our problem, and this is not something we knew about." But the thing is that they did know. When they opened manufacturing centres in China, they did know what was going on. They did have due diligence that suggested: we might have slave labour; it could be anything, not just slave labour but intellectual property theft, major risks; people are going to counterfeit our goods. And now it's happening. This is just simply how the Chinese government operates, and this is not something that they're going to change that easily. Now these companies are having to grapple with poor decisions that they made many years ago.

Taylor Owen: I mean, you talk a lot in the book about this cold war that's emerging. I think that's maybe one narrative. But it seems to me that there's something else going on in terms of both the Belt and Road Initiative and the Digital Silk Road as a part of that, where some of this capacity to either control citizens or surveil citizens is being exported, often to illiberal-leaning regimes who may find that set of tools quite attractive. Do you draw that connection? Are you concerned about the use of these technologies in illiberal-leaning regimes or democracies that are backsliding? If that's the case, this is a much bigger story. This is not just about the persecution of a minority inside China; this could be about the state of democracy globally. Are you concerned about that broader context?

Geoffrey Cain: Yes, I am. I'm very concerned, and that's actually my intention in writing the book. I realized early on that this is not just a story about one minority in China, 12 million people living in one small part of a country of more than 1 billion people. This is a story about humanity. This is a story about what happens when we put bad technologies in the hands of bad people who have bad intentions, without oversight, without laws, without democratic checks and balances. It's really a story about the worst that can happen in an authoritarian regime that wants to wipe out a population, not necessarily through old-school genocide tactics, not through the gas chambers and the mass graves, but through the subtle and slow burn of erasing people's thoughts and who they are, and forcibly sterilizing the women so they can't have babies. All these sinister tactics are now being deployed in Xinjiang. So, yes, it's a book about humanity. One of the things that deeply concerns me, which I also wrote about in the book, is the export of Chinese surveillance technologies to regimes around the world. They see huge potential in using Chinese technologies to direct traffic in their cities and solve crimes: the usual smart cities, safe cities pitch. This is the story that they've been sold. But the other side of that, as we're now seeing, is that these technologies are extremely useful for dictators and tyrants, especially in parts of Africa, South America and the Middle East, just to make it easier to control their people, to know what their people are doing and to spy on them.

Taylor Owen: Yeah, I mean, God, I agree. I share that concern. That's the slide we're seeing here, a slide that is obviously caused by many things. But a reversal of certain democratic progress that's been made globally over the last 30 years is, I think, a real reality right now, and it's in part enabled by these technologies. You mentioned the Orwellian nature of this, and you end the book arguing, look, we can't say "what if" anymore; we have to acknowledge that this is what's happening in some very real ways, and now ask what we should do about it instead. So I guess I'll just end with that: what do you think we should be doing about it?

Geoffrey Cain: There's no easy answer, because we still don't know a ton about how these technologies work. There's still so much opacity around them. I think this is one of the side effects of the pace of technological change that we've seen: we are now at a phase in human history when, for the foreseeable future, we're going to be playing catch-up so that our social norms and our laws and the way we govern ourselves catch up with the pace of technological change of the past 20, 30 years. One thing we could do: I think so many of these problems come from a lack of transparency, and I would be in favour of a new system of copyright or trademark or trade secrets that maybe requires the public to have some level of access to anything that would have enormous consequences for the public square. These companies cannot be treated as completely private if our society and our democracy are to continue flourishing. Another solution, and this will be a little more radical and I think would get opposition, would be some kind of partial nationalization of big social media companies: not just breaking them up, but ensuring that maybe local governments or local authorities have a partial stake in these companies and are able to vote in shareholder resolutions and push things in the interest of the people. In America, and in Canada too, we already have pension funds that are major activist investors in companies, trying to reform their activities, but it's not enough yet, and there needs to be more corporate governance overseeing them.

Taylor Owen: I'm really glad you raise the things we can do in our own societies to make our own technology more accountable as part of a solution to this broader problem that you've identified in the Chinese use of these technologies. Because one of the most common narratives you hear from Silicon Valley now is, "Look, regulation will limit our ability to compete with these unencumbered Chinese companies." It seems to me that that argument is precisely the wrong one; really, if we care about making these technologies more democratic, we should do that here, and maybe show that these companies and these technologies can be governed in a way that preserves human rights and democratic principles. Maybe that's the way into this broader narrative that often bifurcates Chinese and Western technologies: we figure out how to govern our own, and then show that that's possible.

Geoffrey Cain: Yes, I agree completely. I think you just hit it right on the head. I think there is misplaced hype over Chinese technology in America. In North America, we have this perception that, oh, the Chinese, they have great schools and their kids are mastering math and becoming world-class scientists. They're catching up quickly. They're going to overtake America. They're going to beat America. Obviously, this has inklings of the Cold War mindset with the Soviet Union. China is, I think, a much more technologically sophisticated power today than the USSR ever was, but still, I didn't find a ton of evidence from my own reporting and research that China is some kind of dominant master of technology that's going to take over the world. We should be concerned, of course, but America and the West have a strong lead already. I mean, the West has companies like Ericsson and Nokia that are building major 5G systems. We already have a lead. But if we give in to these authoritarian tendencies, where companies have the final say over the future of our democracy, the companies are going to be making technology in their interests and not in the interests of society. That's going to end up hampering our society at large. I mean, that's when technology is actually a burden on us. We need to make sure that technology is uplifting us and helping us improve in ways that we should be improving.

Taylor Owen: That was my conversation with Geoffrey Cain.

Big Tech is presented by the Centre for International Governance Innovation and produced by Antica Productions. Please consider subscribing on Apple Podcasts, Spotify or wherever you get your podcasts. We release new episodes on Thursdays, every other week.

For media inquiries, usage rights or other questions please contact CIGI.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.