The Value of Controlled Anonymity

February 5, 2020
Controlled anonymity could be a valuable means of balancing privacy, trust and accountability. (Shutterstock)

As China’s influence and role in the development of technology continues to grow, democratic societies must ask whether the power of technology that’s developed in China — and often used to surveil or track — can be appropriated for democratic purposes.

That question infuses the debate around the use of Huawei 5G telecom equipment: how can states take advantage of affordable next-generation wireless networks, while also limiting or preventing the potential surveillance and espionage applications? Similarly, China’s embrace of blockchain technology, and the government’s plans to issue a digital currency, offer observers an opportunity to learn and reflect upon how that technology can and should be used in a democracy.

Of particular interest, the Chinese government has introduced a concept called “controlled anonymity,” which presents a method of protecting people’s privacy that could be adopted and modified for use in other countries. Controlled anonymity is intended to translate the benefits of using cash (anonymous purchasing, for example) into the world of digital currencies and transactions.

“We know the demand from the general public is to keep anonymity by using paper money and coins…we will give those people who demand it anonymity in their transactions,” Mu Changchun, head of the People’s Bank of China’s digital currency research institute, said at a conference in Singapore in 2019, as reported by Reuters.

While controlled anonymity may seem like a contradiction, it does provide a valuable means of balancing privacy, trust and accountability.

The Chinese government has not released full details on how controlled anonymity would work, but the general premise is that the individual using the digital currency would have anonymity when dealing with the marketplace, although not necessarily with the government. The vendor selling you something would not be able to identify you, or collect personal information about you, but the state could.

While its application in China might not offer the full anonymity many people desire, the method could be adapted by a democratic society to much greater effect.

However, not everyone in a democratic society recognizes the value of anonymity. Anonymity is often used as a scapegoat or excuse for uncivil behaviour and discourse online. Yet, in the context of certain financial transactions, anonymity is incredibly important. For example, the federal Office of the Privacy Commissioner recommends that Canadians buy their cannabis with cash. When a person is buying a controlled substance that is illegal elsewhere, a data trail could make that person’s future border crossings difficult or impossible. This is a good example of controlled anonymity’s value; it anticipates the need to be able to make legal purchases using digital currency while also exercising the right to privacy and security.

Controlled anonymity, however, is not just about financial transactions; it can also be applied to identity, expression and participation in online communities and networks.

The ongoing debate around the “real names” policies upheld by platforms such as Facebook emphasizes the importance of people’s right to be anonymous or to use pseudonyms. The policies aren’t all bad; they were written, at least in part, to encourage the use of legal names in order to reduce fraud, limit the influence of bots and encourage civil and accountable behaviour.

However, in 2011, internet researcher danah boyd argued that insisting on “real names” is an abuse of power: “The people who most heavily rely on pseudonyms in online spaces are those who are most marginalized by systems of power. ‘Real names’ policies aren’t empowering; they’re an authoritarian assertion of power over vulnerable people. These ideas and issues aren’t new…but what is new is that marginalized people are banding together and speaking out loudly.”

In 2015, the “Nameless Coalition” brought together more than 75 human rights, digital rights, LGBTQ and women’s rights organizations from around the world, all asking Facebook to fix its “authentic identity” or “real name” policies. The coalition argues, as danah and others have, that forcing people to use their legal names is dangerous and can place already marginalized groups into situations of increased harm and vulnerability.

As the surveillance economy expands and grows, more and more people are recognizing the value of limiting the amount of personal information they disclose. If the ability to be anonymous were more readily available, more people would choose it as an option to protect themselves and their privacy.

How, then, might controlled anonymity work, and how could it benefit a democratic society? “Know your customer” (KYC) laws provide a possible model.

KYC laws are largely designed to prevent businesses from becoming complicit in criminal activities, while also mitigating fraud. The idea is that if a business knows who it is dealing with, it can ensure that it and its customers are not breaking any laws.

Controlled anonymity could be considered a sort of digital extension of KYC laws, except the business would not need to know the customer, as long as a “trusted party” did. Of course, some questions remain: who would that trusted party be, and would the trust be sufficient for all transactions and interactions?

On a technical level, distributed ledger technologies, such as blockchain and Hyperledger, can enable the independent verification of identity. This emerging infrastructure enables distributed identity and trust services that make the concept of controlled anonymity possible.
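To make the mechanism concrete, here is a minimal sketch in Python of how such an attestation could work: a hypothetical trusted party performs the identity check once and signs a pseudonymous claim, and a vendor verifies the signature without ever learning who the customer is. The roles, field names and flow are illustrative assumptions, not a description of any deployed system, and the example uses the third-party `cryptography` package for the signatures.

```python
# A minimal sketch of a pseudonymous attestation, assuming a hypothetical
# "trusted party" that has already verified the customer's identity.
# Requires the third-party `cryptography` package (pip install cryptography).
import json
import secrets

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Trusted party: knows the customer, issues a signed pseudonymous token ---
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()    # published for vendors to use

def issue_attestation(customer_legal_name: str) -> bytes:
    """Identity verification happens here; only a random pseudonym leaves."""
    pseudonym = secrets.token_hex(16)           # no link to the legal name is revealed
    claim = json.dumps({"pseudonym": pseudonym, "kyc_passed": True}).encode()
    signature = issuer_key.sign(claim)
    return claim + b"." + signature.hex().encode()

# --- Vendor: checks the issuer's signature, never sees the legal name ---
def vendor_accepts(attestation: bytes) -> bool:
    claim, _, sig_hex = attestation.rpartition(b".")
    try:
        issuer_public_key.verify(bytes.fromhex(sig_hex.decode()), claim)
    except InvalidSignature:
        return False
    return json.loads(claim).get("kyc_passed", False)

token = issue_attestation("Jane Doe")           # identity stays with the issuer
print(vendor_accepts(token))                    # True: trusted, yet anonymous to the vendor
```

The point of the sketch is simply that trust and identity can be separated: the vendor relies on the issuer’s signature, not on knowing the customer.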

The model that the Chinese government seems to be proposing is one in which the state acts as the trusted intermediary. An individual can make financial transactions anonymously, because (ostensibly) the individual and the vendor both trust the state to “know the customer” and provide the trust for the transaction.

However, in a democratic society, the state may not (or should not) be the trusted intermediary. This need not eliminate the possibility of controlled anonymity: a different trusted intermediary could be used, or legal restrictions could be placed upon how the state would operate as a trusted intermediary.

For instance, rather than the government being the intermediary in control of anonymity, the intermediary could be a government agency that operates independently of the government. There could be provisions in place for the circumstances or conditions under which that anonymity could be rescinded, such as in instances of illegal activity or fraud.
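A rough sketch of that arrangement might look like the hypothetical intermediary below: it issues pseudonymous tokens, vouches for them when vendors ask, and releases the identity behind a token only on grounds defined in advance. The class, method names and permitted grounds are assumptions made for illustration, not a proposal for any particular agency’s system.

```python
# A rough sketch of an independent intermediary that controls anonymity:
# it issues pseudonymous tokens and can rescind anonymity only under
# explicitly defined conditions. Names and conditions are illustrative.
import secrets


class IndependentIntermediary:
    ALLOWED_GROUNDS = {"fraud_investigation", "court_order"}   # assumed legal provisions

    def __init__(self):
        self._registry = {}            # pseudonymous token -> verified identity

    def register(self, legal_identity: str) -> str:
        """Verify identity once; hand back only a pseudonymous token."""
        token = secrets.token_urlsafe(16)
        self._registry[token] = legal_identity
        return token

    def vouch(self, token: str) -> bool:
        """Vendors ask only: is this token backed by a verified person?"""
        return token in self._registry

    def rescind_anonymity(self, token: str, grounds: str) -> str:
        """Release the identity only on grounds set out in advance."""
        if grounds not in self.ALLOWED_GROUNDS:
            raise PermissionError(f"'{grounds}' is not a recognized ground for disclosure")
        return self._registry[token]


intermediary = IndependentIntermediary()
token = intermediary.register("Jane Doe")

print(intermediary.vouch(token))                               # True: the vendor can trust the token
print(intermediary.rescind_anonymity(token, "court_order"))    # identity released only under a defined condition
```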

In many situations, a government agency that is distinct from the government might not be trusted enough to satisfy some elements of a democratic society. In such cases, a civil society organization could take on the role.

And finally, there is potential for a free market version of controlled anonymity. This version would involve establishing an industry in which a number of companies provide anonymity as a service, subject to oversight and regulation from governments. These companies’ presence would allow people greater choice when deciding which bodies they trust to protect their identity.

As a concept, controlled anonymity is a responsible and legitimate means of providing the right to anonymity while also preserving the legitimate needs of a society based on the rule of law. It is a powerful example of how democratic societies can benefit from the technology being developed by an authoritarian regime — as long as there is a willingness to put in the effort to democratize that technology. 

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Jesse Hirsh is a researcher, artist and public speaker based in Lanark County, Ontario. His research interests focus largely on the intersection of technology and politics, in particular artificial intelligence and democracy. He recently completed an M.A. at Ryerson University on algorithmic media.