The Charter and Human Rights in the Digital Age

August 16, 2018
Individuals are often faced with a "take it or leave it" decision when it comes to personal data storage, control and privacy. (Shutterstock)
On May 16, Prime Minister Justin Trudeau announced that the federal government would release a digital charter outlining Canada's approach to hate speech, misinformation and election interference on the internet. The announcement responds to significant gaps in existing, pre-internet policy.

When the Canadian Charter of Rights and Freedoms became part of the Constitution Act, 1982, it created an authoritative set of rules that circumscribed state authority and embedded certain rights into the very fabric of Canadian society. In doing so, the Charter also created a powerful moral imperative that reflected the core of Canadian values.

This includes fundamental freedoms around expression and religion; democratic rights; mobility rights, such as the right to leave and re-enter the country; legal rights, including the right not to be arbitrarily detained, and the rights to life, liberty and security of the person; and equality rights, including the right to equal protection and equal benefit of the law without discrimination.

The Charter has long been considered a “living tree,” meaning that it must be understood within the context of an ever-changing society, allowing for a “progressive interpretation [that] accommodates and addresses the realities of modern life,” as the Supreme Court noted in its 2004 judgment on same-sex marriage. However, the drafters of the Charter could scarcely have foreseen the extent to which technology would transform the Canadian economy, political system and society over the next 35 years.

Today, while the moral imperative around these rights remains clear, the role of private actors in the policy space has fundamentally changed the nature and quality of certain rights, including privacy and free expression. Constitutional protections remain in place, but they apply only to entities that are part of government, substantially controlled by government or performing governmental activities. They do not apply to the private sector or to technology companies, which play increasingly pervasive roles in the everyday lives of Canadians.

Outside the Constitution, a statutory framework applies to both government and the private sector. For example, access to data held by the government, and the corresponding privacy protections, fall under the Privacy Act or the Access to Information Act, or their provincial counterparts, depending on the context. Unfortunately, these laws have long been criticized for being outdated and failing to keep pace with technological advancement.

Canada’s private sector data protection laws are equally problematic. The federal Personal Information Protection and Electronic Documents Act (PIPEDA) requires organizations subject to the act to obtain an individual’s consent before or at the time their personal information is collected, and whenever a new use of that information is identified. In reality, however, consumers usually scroll past the lengthy agreements that describe how their personal information will be used. The current system is thus premised on informed consent, but the practical effect is that most people have no idea what they are agreeing to, or how their data will be used. Nor is there a meaningful alternative in many instances; individuals face a “take it or leave it” proposition.

PIPEDA also includes inadequate penalties for data breaches, particularly in comparison with Europe’s newly implemented General Data Protection Regulation (GDPR). Organizations in breach of the GDPR may be fined up to four percent of annual global revenue or €20 million, whichever is greater. There are no similar monetary penalty provisions in PIPEDA; instead, the act allows the privacy commissioner to apply to the Federal Court, which may award damages to the complainant. Stronger and more consistent penalties for privacy breaches are needed to better reflect both global trends and the true value of the personal information being amassed.  
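To make the “whichever is greater” rule concrete, the short sketch below computes the GDPR fine ceiling for hypothetical firms. The function name and revenue figures are illustrative assumptions, not drawn from the regulation itself.

```python
# Minimal sketch of the GDPR administrative-fine ceiling described above:
# the greater of 4% of annual global revenue or EUR 20 million.
# The function name and example revenue figures are hypothetical.

def max_gdpr_fine_eur(annual_global_revenue_eur: float) -> float:
    """Return the upper bound of a GDPR fine, in euros."""
    return max(0.04 * annual_global_revenue_eur, 20_000_000.0)

# A firm with EUR 2 billion in revenue faces a ceiling of EUR 80 million,
# since 4% of its revenue exceeds the EUR 20 million floor.
print(max_gdpr_fine_eur(2_000_000_000))  # 80000000.0

# A small firm with EUR 10 million in revenue still faces the full
# EUR 20 million ceiling, because the flat amount is the greater of the two.
print(max_gdpr_fine_eur(10_000_000))  # 20000000.0
```

As the second case illustrates, even modest organizations are exposed to the €20 million floor; it is precisely this kind of deterrent that PIPEDA currently lacks.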

Data Collection and Control

In the context of a flawed statutory scheme, and with Charter protections limited to governmental activities, smart cities expose the tension between the public and private sectors and blur the lines between the two in the collection, use, ownership and control of data. This data can include “hard” information about an individual, such as their physical location and facial features, as well as “soft” information, such as their behaviour. Another concern is the use of closed-circuit cameras for surveillance in smart cities, where the lines between the public and private sectors are equally hazy. As facial recognition technology rapidly improves, the information that can be extracted from this footage raises growing privacy concerns.

Data Residency

The use of cloud computing and web-based email services often requires that information and data leave Canada. Data crossing borders not only implicates its owner’s personal privacy but also carries legal consequences, since the information could become subject to laws outside of Canada. In the public sector, a federal directive mitigates these concerns by requiring that data at certain security classifications remain within Canada’s geographic boundaries unless it can be stored in a secure, approved computing facility (such as a consular mission). Individuals’ data, however, is largely free to leave Canadian territory and control.

Artificial Intelligence

Artificial intelligence is being used in a number of public sector domains, including health care, justice and the allocation of social goods. Private sector companies have been developing machine learning and deep learning algorithms to help these public sector entities make decisions, predict outcomes and assess risks. In many of these scenarios, neither the officials deploying an algorithm nor the individuals it ultimately affects know the underlying reasons for its decisions, because the algorithm and its training data are either protected by trade-secret law or function as “black box” systems. Examples of discrimination and exacerbated bias have proliferated over the last few years and clearly demand attention from Canadian policy makers.

As originally conceived, the Charter was meant as a check on state power in favour of the individual. But given that Charter protections apply only to governmental authority, and that current legislative protections need an overhaul, the time is right to think about elevating protections and questioning the adequacy of existing rights.

Do Canadians enjoy sufficient privacy protections and have suitable avenues of redress in the event of a breach? Do they understand the rules of ownership related to their personal data? Do they know what they are agreeing to when they provide consent for digital services? If the answer to these questions is no, then it is time to design a Charter for the digital age.  

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Aaron Shull is the managing director and general counsel at CIGI. He is a senior legal executive and is recognized as a leading expert on complex issues at the intersection of public policy, emerging technology, cybersecurity, privacy and data protection.