“Privacy Is Like Yoga” — and Other Myths

Some technologies are inherently intrusive and repressive, and we should not be neutral about them.

February 8, 2023
(Huawei via REUTERS)

On January 24, 2022, the then-president and CEO of the Canadian Marketing Association opined that data privacy laws have a lot in common with yoga: “The practice of yoga is all about finding the perfect balance between meaningful effort and not overstretching. When you find that point, everything seems to flow.” He went on to assert that “data privacy laws have a lot in common with the concept behind yoga. They too require balance. Consumers need to know that their data is protected, and not mishandled or misused. … But if the law goes too far, it stymies innovation, reduces personalization, contributes to ‘consent fatigue’ and disadvantages the very consumers it is intended to benefit.”

The metaphor about finding the “perfect balance” between privacy and business interests has reappeared regularly over the years. It is also frequently referenced in the narrative surrounding the modernization of the Canadian law regulating private sector data processing. Bill C-27 — the Digital Charter Implementation Act, 2022, which incorporates the Consumer Privacy Protection Act — is still going through its second reading.

As the lobbying for and against this bill continues, it is worth unpacking and questioning the assumptions that have been driving a consistent business narrative about the need for a Canada-made solution to privacy protection that is flexible, neutral, principles-based and, indeed, balanced.

The word “balance” appears a lot in this narrative. But there is no perfect balance to be struck. It was not a helpful metaphor when Canada first regulated corporate personal data processing through the Personal Information Protection and Electronic Documents Act (PIPEDA) in 2000. It is even less helpful now.

Let’s go back to that time, before widespread electronic commerce, before contemporary personalized advertising, and before the widespread development of a global digital services sector where wealth is based on the capture, processing and monetizing of personal information.

Suppose, back then, that the media had reported that companies were not only monitoring what people were buying but what they were thinking of buying, or even predicted to be thinking about buying, or feeling like buying. Suppose companies were actively following people around shopping malls and recording not only what they purchased but also what they were looking at in shop windows, and for how long. There would have been outrage. Nobody would have concluded that there was a need to “balance” the legitimate needs of business with the rights of individuals.

And yet exactly that is happening on the internet today, largely in secret and involving an opaque network of global actors and technologies that no one regulator can completely understand, let alone hold to account.

Business models and practices that were once considered unacceptable get normalized over time. They morph from the unacceptable to the acceptable. And rhetoric about balance, or finding the perfect balance, permits this. We are now expected to balance privacy rights with practices that would once have been considered off limits. When personal data is monetized, the appetite for that data grows, while citizens continue to be appeased with soothing bromides that “your privacy is important to us.”

A related, and equally problematic, assumption is that any privacy protection policy needs to be technologically neutral. For instance, the government’s original white paper introducing the Digital Charter extolled the virtues of a “principles-based and technology-neutral approach.” But what does it mean to say that privacy protection policy should be technologically neutral? Technologies are not neutral. They embody hidden biases, intended or not, that shift human behaviours in certain directions.

Public policy shapes outcomes. Transportation policy can be made to encourage cycling, driving or public transportation. Energy policy can be made to shift or nudge our incentives toward sustainable and renewable sources of energy and away from fossil fuels. Few would argue that transportation or energy policy should be technologically neutral. Yet, for some reason, policy making in the privacy realm still sticks to this one-liner, as if it is a simple and well-understood principle.

Some technologies are inherently intrusive and repressive, and we should not be neutral about them. Just as energy policy can nudge us toward sustainability, our privacy policy should nudge us toward stronger privacy protection rather than more surveillance.

So-called technological neutrality is normally taken to mean that it is the effects or the uses of technology that must be regulated, rather than the tools. Thus, there is no need for the law to change as technology changes.

Relatedly, the claim of technological neutrality also implies that law makers should not try to pick winners and losers, because that would stifle “innovation.” But why not? The argument rests on the assumption that innovation is always good. It is not. The history of our digital economy is littered with examples of innovations that foundered because they turned out to be overly intrusive, repressive or plain silly.

Another, related assumption is that privacy law should be based on a set of flexible principles. Here again is the Canadian Marketing Association, which has argued that privacy protection law should be “rooted in an administratively workable, principles-based legislative framework that promotes a technology- and sector-neutral approach to privacy, helping to ensure flexibility in the face of rapidly evolving technologies, business models and consumer expectations.”

This distinction between principles-based legislation and other presumably more detailed and prescriptive frameworks is another simplistic article of faith in Canada. Of course, PIPEDA is based on a set of enumerated principles, initially negotiated and embodied in the Canadian Standards Association Model Code for the Protection of Personal Information.

But some laws do not enumerate a set of “principles” as PIPEDA does, and yet are equally principles-based. Their principles are simply couched in more abstract language, and are therefore exportable to different jurisdictions. Agreement on a core set of privacy principles has allowed for a common dialogue and a remarkable spread of these laws globally. Currently, that spread is inspired by the European General Data Protection Regulation (GDPR), the so-called gold standard for international data protection.

And the GDPR is also principles-based. The GDPR’s Article 5 even stipulates the “principles relating to processing of personal data.” There is no useful distinction to be drawn between data protection laws based on principles, and those that are arguably more detailed and prescriptive.

So, why the continued insistence by government and corporate actors on a principles-based law, as if there were a clear distinction between a Canadian-made law and other, more prescriptive (read: European) approaches? Again, it is a rhetorical device, designed to draw an inaccurate distinction between different privacy regimes and to cast the supposedly more prescriptive, top-down, less principled and one-size-fits-all European approach as something to be avoided in Canada. This misunderstanding leads to assertions that the GDPR has created a “staggering regulatory burden,” according to the Canadian Marketing Association. It supposedly hampers the ability of organizations to innovate. It disproportionately impacts small and medium-sized businesses. It creates complexity for consumers. It creates a host of unintended consequences. It suppresses emerging technologies. It obstructs cross-border business.

Yet, the GDPR was, and remains, a compromise whose provisions were fought over for many years between different interests. It is not overly prescriptive or one-size-fits-all. It gives businesses a number of options for legally processing personal data, including where processing is necessary for the purposes of the “legitimate interests” pursued by the controller — language that has, regrettably in less rigorous form, been inserted into the latest draft of Bill C-27.

Further, the GDPR contains all kinds of provisions imported from outside Europe — privacy impact assessments, privacy by default and by design, codes of practice, privacy seals and certification schemes. These instruments were once viewed with skepticism by European data protection officials and experts, because of their association with the more “flexible” or “self-regulatory” approaches in other countries. They are now embraced.

What is more, the GDPR is already forcing many businesses operating in Canada to comply with its terms. Some big international companies such as Apple and Microsoft have simply declared an intention to level up their global operations to be compliant with GDPR standards. Others are owned by larger European multinationals. Still other companies try to distinguish in their privacy policies between the privileges they grant to European nationals and those they apply to Canadians. Moreover, as the Office of the Information and Privacy Commissioner for British Columbia has argued, compliance with what is regarded as the gold standard of international data protection does offer competitive advantages.

In this context, why do we hear so much talk of a “made-in-Canada” approach to the problem? Does the digital economy change its character when it hits our borders? Are the privacy rights and interests of Europeans somehow more important than those of Canadians? There might have been a discernible Canadian approach back in 2000, when PIPEDA was passed. But there certainly isn’t now. The world has moved on. More than 140 countries now have data privacy laws. Many are influenced by the GDPR.

Thus, we get nowhere by setting up a false dichotomy between a balanced, flexible, “neutral” and principles-based Canadian approach and a supposedly more rigid and inflexible European regime. The discourse about the modernization of privacy law should not be rooted in dated rhetoric, false dichotomies, incomplete understandings of overseas practices and a poor appreciation of the complex debates in the privacy literature.

The assertion that privacy is like yoga is one of the more simplistic one-liners about privacy policy. But it is not the only one.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Colin J. Bennett is a professor in the Department of Political Science at the University of Victoria.