Canada's Minister of Innovation, Science and Industry, Navdeep Bains, speaks during Question Period in the House of Commons. REUTERS/Chris Wattie

Canada’s private sector data protection law is due for a major overhaul — and Bill C-11, tabled in Parliament on November 17, 2020, promises exactly that. If this bill — also known as the Digital Charter Implementation Act, 2020 — is passed, the Personal Information Protection and Electronic Documents Act (PIPEDA) will be replaced by two new statutes. The Consumer Privacy Protection Act (CPPA) will replace the normative core of PIPEDA and the Data Protection Tribunal Act (DPTA) will establish a new administrative tribunal to oversee the CPPA’s enforcement regime.

Bill C-11 was made necessary by two distinct but related factors. First, PIPEDA has fallen sadly out of step with a burgeoning data economy. Recent egregious data security breaches and high-profile misuses of personal information have highlighted PIPEDA’s weak compliance incentives. In addition, PIPEDA is not well adapted to the demands of the data economy. Privacy advocates have complained about the unrealistic burden of consent placed on individuals, and about the need for more and stronger rights to control personal data. Businesses have bemoaned the lack of flexibility in PIPEDA to accommodate uses of data for purposes other than those for which consent was originally obtained, the inability to exempt de-identified information from the application of the statute, and insufficient procedural fairness in the complaints process. Bill C‑11 responds, at least in part, to these and other concerns, although it remains to be seen to what extent anyone will be fully satisfied.

The second factor driving PIPEDA reform is the looming adequacy assessment that Canada faces under the European Union’s General Data Protection Regulation (GDPR). The GDPR, which took effect in 2018, has set a new global standard for data protection. It bars the flow of personal data outside EU borders unless a substantially similar level of protection is available in the country of destination. There are a number of means by which this requirement can be met. The most expedient of these is a determination that a country’s data protection regime provides an adequate level of protection. PIPEDA was unlikely to pass a GDPR adequacy assessment; Bill C-11 should come closer to the mark.

Bill C-11, while an important proposal for data protection law, presents a number of issues that may prove contentious as it moves through Parliament and the Senate.

The Rewrite

An important feature of Bill C-11 is that it does not just amend PIPEDA but rewrites it altogether. PIPEDA was Canada’s first national private sector data protection law, and was introduced at a time when there was considerable private sector resistance to legislating data protection. As a compromise, the Canadian Standards Association’s Model Code for the Protection of Personal Information was adopted as the normative core of the legislation. This code was drafted with diverse stakeholder input and was seen to represent a business and consumer-advocate consensus on data protection. Because it was adopted wholesale, it was included in a schedule at the end of PIPEDA and retained its open-ended drafting style. PIPEDA consists mostly of a series of modifications to the normative principles in the code, as well as some exceptions to them, with the completely unrelated Electronic Documents Act grafted on for good measure. The result is an awkward and inaccessible data protection law.

Bill C-11 dispenses with the Model Code (although the same fair information principles remain at its normative core). And, reflecting the growing maturity of data protection law in Canada, Bill C-11 presents a cohesive set of obligations drafted for technological neutrality, readability and internal coherence. These are big improvements.

A downside, of course, is that the wholesale rewriting of PIPEDA makes it difficult to see at a glance what is new, what is the same, what has been slightly altered and where everything can be found. This is a substantial challenge for stakeholders who will be seeking to comment on the bill as it moves through Parliament, because it makes the full scope of the changes and their implications hard to grasp quickly and comprehensively. The fact that so much has been changed — sometimes in ways that may stir controversy — may also slow the bill’s movement through Parliament and the Senate. For those who have been anxious to see an upgrade to Canada’s data protection laws, this bill may not offer a quick solution.

Consent

PIPEDA is a consent-based data protection regime. Consent is the default for the collection, use or disclosure of personal data, although the law also contains an ever-growing list of exceptions to consent. In 2015, section 6.1 was added to set out clear criteria for valid consent. Under that provision, consent is valid only “if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting” (italics mine). This means that if a product or service is targeted at youth or children, for example, consent might have to be specifically tailored to their level of understanding.

Bill C-11 retains consent at its core, although it subtly alters what is required for valid consent. A new section 15 provides that consent is valid only if the organization, at or before the time consent is sought, provides the individual with prescribed information in plain language. The implications of this shift, from a standard based on the targeted consumer’s ability to understand to a plain-language list of prescribed details, should be carefully considered. Note that the privacy of children and youth is not specifically addressed in Bill C-11; arguably, this change to the nature of valid consent will make the law even less responsive to their data protection needs.

Exceptions to Consent

Organizations were keen to gain some new exceptions to the consent requirement, particularly for what they considered to be “legitimate business purposes.” From their perspective, there were instances in which new business purposes might arise for which consent had not been obtained at the time of collection — and where retroactively obtaining consent would be cumbersome and complicated. The government was also sensitive to concerns about the consent burden on individuals who were confronted by privacy policies packed with an overwhelming amount of detail about organizations’ collection, use and disclosure of information. The result of this pair of preoccupations is a new set of exceptions to consent found in sections 18 to 21 of Bill C-11.

Businesses will no doubt be pleased to see these new exceptions. They absolve organizations of the requirements of knowledge and consent for activities that are necessary to provide or deliver a product or service that the individual has requested from the organization; related to reducing the organization’s commercial risk; necessary for information, system or network security; necessary for the safety of products or services provided by the organization; or for which obtaining knowledge and consent would be impracticable because the organization has no direct relationship with the individual. Section 18 also provides that the list of business activities for which knowledge and consent are not required is an open one: it can be expanded by regulation. The exceptions in section 18 are subject to two important limits designed to preserve privacy: the business activity must be one for which a reasonable person would expect such a collection or use, and the personal information must not be collected or used for the purpose of influencing the individual’s behaviour or decisions.

Sections 19 to 21 would also allow organizations, without an individual’s knowledge or consent, to transfer personal information to a service provider for processing and to de-identify personal information. Organizations may also use de-identified personal information for internal research and development purposes.

Although organizations may, on balance, be content with these new exceptions, there are reasons for individuals to be wary. These are exceptions not just to consent but to knowledge and consent, meaning that an organization need not even provide notice of these practices. While these provisions may achieve the goal of shortening privacy policies and reducing the consent burden on individuals, they also make these practices entirely non-transparent. And, while the limitations placed on organizations in section 18 are important ones, they are useless if individuals cannot know what they need to know (what information is being collected and how it is being used) in order to hold organizations to account. A new requirement in section 62 to “make readily available” a “general account” of how the organization uses personal information, “including how the organization applies the exceptions to the requirement to obtain consent under this Act,” seems unlikely to be satisfactory, both because of the generality of the account that must be provided and because it need only be “readily available” rather than serving a notice function.

The breadth of some of these new exceptions is also a concern. Paragraph 18(2)(e) allows data collection without notice or consent for “an activity in the course of which obtaining the individual’s consent would be impracticable because the organization does not have a direct relationship with the individual.” It is difficult to know what exactly was intended by this provision, but it could potentially extend to a very broad range of data collection — including the harvesting of data from social media sites. Paragraph 18(2)(b) permits collection and use without knowledge or consent for “an activity that is carried out in the exercise of due diligence to prevent or reduce the organization’s commercial risk.” While this might seem innocuous at first, a second glance reveals the enormous potential for the collection and use of data to assess an individual's ability to make payments in a range of contexts, from financial services to the purchase of products and the rental of apartments.

New Rights of Control: Portability, Erasure and Artificial Intelligence

Bill C-11, if passed, will introduce new data rights for individuals, giving them greater control over their personal information in some contexts. One such right, somewhat limited in Bill C-11 (where it is styled “data mobility”), is data portability. Under the GDPR, data portability is a more general right to port one’s data from one service provider to another in a machine-readable format.

Data portability is not a privacy right per se. It is more closely linked to competition and consumer rights, although it is tied to individual control over personal data. Services that depend on understanding their clients, for example, have a competitive advantage over newcomers in the field, because they have had the opportunity to collect the data that allows them to tailor their services to each customer’s needs or interests. Data portability allows consumers to take their data with them to a new service provider, ensuring — in theory, at least — an equivalent level of service. In other contexts, data portability might allow for new services to emerge that could provide analytics based on the ported data.

Bill C-11 does not include a broad-based data mobility right. Instead, data portability will be sector-specific, rolled out carefully through regulations, complete with the necessary standards, security safeguards and infrastructure. Expect the first data portability experiment to be with open banking (also known as consumer-directed banking). Open banking has been in the works for some time, and is currently being considered by the Department of Finance. For constitutional reasons, data portability under this model will likely be limited to areas of federal jurisdiction. The telecom sector might be another candidate for future development.

Another new right is the right of erasure, found in section 55. This provision requires an organization to dispose of personal information at an individual’s request, and to require any service provider to which it has transferred the information to do the same. This right of erasure is not the same as a right to be forgotten, which is absent from this statute (though it might still figure in any eventual platform governance initiative by the federal government). The right of erasure requires organizations to dispose of personal information; it does not require search engines to de-index sites containing personal information.

Bill C-11 also contains some new rights regarding the use of personal data in automated decision-making processes. These are significant, because automated decision making is likely to have a growing impact on individuals. Paragraph 62(2)(c) would require organizations to provide individuals with an account of their use of automated systems “to make predictions, recommendations or decisions about individuals that could have significant impacts on them.” Section 63 provides individuals with a right to an explanation of automated decision making, as well as an explanation of how the personal data used in the process was obtained. Notably, this right applies not only to actual decisions but also to predictions and recommendations, giving it much greater scope.

Enforcement

PIPEDA has been widely criticized for its lack of adequate enforcement, and enforcement is clearly an area of attention in Bill C-11. The bill provides Canada’s privacy commissioner with new order-making powers. The commissioner will also have the authority to recommend that potentially substantial fines be imposed on organizations that have breached specific obligations. However, these powers are checked by the new data protection tribunal, which will hear appeals of the commissioner’s orders, consider any recommendation by the commissioner to impose a fine on an organization, and set the amount of any fine. This separation of the power to fine from the commissioner is not unreasonable if the goal is to preserve some of the commissioner’s ombuds role. However, questions have already been raised regarding the composition of the new tribunal: the Data Protection Tribunal Act provides, somewhat surprisingly, that only one of its six members need have expertise in privacy.

Bill C-11 also proposes a new private right of action for breaches of the CPPA. The right of action would be available for any contravention of the act, although it would depend on first having gone through the complaint and appeal process. Recourse may also be affected by the commissioner’s new power to decline to deal with some complaints, and even where a complaint is investigated, the overall process could be quite lengthy, raising questions about how timely any such right of action would be. And, while the right of action does seem like a measure designed to provide individuals with additional recourse for breach of privacy, it raises the issue of whether, if passed, the new statute will constitute a “complete code.” If so, it might adversely affect the availability of other legal recourse for breaches of privacy obligations by private sector actors. This aspect requires closer scrutiny, as its overall impact on access to justice might be quite different from what was intended or desired.

Data for Good

In the wake of the debates over data governance in the Sidewalk Toronto project, it is interesting to consider section 39 of Bill C-11, which allows de-identified data to be used, without the knowledge or consent of the individual, for “socially beneficial purposes.” Those who may use the data for these purposes include federal or provincial government institutions, health care or post-secondary institutions, and public libraries. They also include “any organization that is mandated, under a federal or provincial law or by contract with a government institution or part of a government institution in Canada, to carry out a socially beneficial purpose,” which could be a so-called data trust. A socially beneficial purpose is defined in the legislation as “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.” While section 39 has interesting potential, by enabling some kinds of data sharing in the public interest it may, by implication, limit sharing that does not fit its narrow definition of “socially beneficial purposes.” It also does not provide for or require contractual or other measures to govern the use of the de-identified data.

What Remains Untouched

While it is important to consider all that is new in Bill C-11, it is also important to note that a great many provisions of PIPEDA remain substantially untouched — and that there are areas where, perhaps, more attention was warranted. For example, Bill C-11 carries over the exclusion of the law’s application to the collection, use or disclosure of personal information for “journalistic, artistic or literary purposes,” even though these categories have proven challenging in the era of new media. The exception to consent for the collection, use and disclosure of “publicly available personal information” remains subject to articulation through regulation, despite considerable pressure to declare information on social media sites publicly available, a move that could have significant consequences for individual privacy. Bill C-11 also leaves untouched the exceptions to the requirement of consent for law enforcement access to data in the hands of the private sector. In 2015, Industry Canada produced voluntary transparency reporting guidelines to provide greater transparency over this gaping data conduit between the private sector and law enforcement, but Bill C-11 does nothing to formalize transparency reporting requirements that would give the public a better sense of just how often law enforcement officials turn to private sector organizations for access to the data they hold about individuals. Bill C-11 also does little to change or bolster how cross-border data flows are addressed under PIPEDA.

Another notable gap is, of course, the failure of Bill C-11 to bring federal political parties into its ambit. The government clearly has no stomach to require political parties to play by the same rules it imposes on private sector actors when it comes to collecting, using, disclosing or securing personal data.

Bill C-11 is a substantial reform of PIPEDA, raising issues that include, and go beyond, those addressed in this comment. The bill is remarkable for its scope and for the extent to which it has transformed PIPEDA to meet the need for data protection in the evolving data economy. However, change on this scale inevitably raises a host of questions and issues. Some of these will be about entirely new provisions that may or may not hit the right mark. Others will be about the rewording or reframing of existing provisions, and still others will relate to what was left out of this round of reform. The upcoming study of this bill in committee will likely be the occasion for one of the most important conversations Canadians will have on the future of data protection in this country.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
  • Teresa Scassa is a CIGI senior fellow. She is also the Canada Research Chair in Information Law and Policy and a full professor at the University of Ottawa’s Law Faculty, where her groundbreaking research explores issues of data ownership and control.