Imagine that a team of researchers comes up with a brilliant new idea for predicting the likelihood that a patient will die from an opioid overdose. These researchers identify a set of indicators that, when combined, make it more likely that someone’s drug use will turn deadly — factors such as their health records, whether they have previously been admitted to the emergency room for an overdose, or whether they associate with known drug dealers. By working with data scientists, the researchers develop an algorithm that can identify the highest-risk patients for treatment and intervention.

This technology could potentially save hundreds of lives, and the information needed to feed the algorithm already exists in various databases — such as medical files, hospital records, police records or even social media accounts. But this information is also highly sensitive, and while combining it in innovative ways could lead to a public health breakthrough, privacy laws would impede the researchers from cross-referencing these separate databases.

Is there a way for these researchers to securely access the various data sets in a way that serves the public interest while also protecting patients’ rights to privacy? How could they do this while maintaining the trust of the agencies and individuals involved? And what consequences would they face if they failed to uphold that trust?


This is a hypothetical scenario, but this kind of technology already exists, so these questions have real implications for Canada. Massive amounts of data about all aspects of our lives are being collected and stored, and they have the potential both to benefit society and to create innovative new businesses that contribute to our economy. But in order to make the most of this potential, Canada must build on its existing policy and legal framework to make it easier for organizations to prove they can be trusted to keep this sensitive information private and secure, and for the public to evaluate their trustworthiness and hold them to account.


 

The Global Data Race

Data is a lot like capital — it flows in and out of a country. If we cannot find a way to attract data investments, Canada risks becoming a client state with regard to data. But if Canada becomes the best environment to maximize the economic and social power of data, more data will flow in. To do that, it needs to prove it is better than its global rivals, such as the United States, the European Union and China, in three areas: trust, investment opportunities and data integration.

Trust is the most important factor because it underpins the other two. When a country establishes a climate of trust in its data environment, individuals both at home and abroad have faith that their personal information will not be compromised by security breaches or unscrupulous data practices. China is widely seen as a low-trust environment because of its pervasive digital surveillance practices. The United States has historically avoided comprehensive privacy regulation, an approach its citizens, wary of government overreach, have traditionally trusted; that trust has begun to erode with revelations about social media company practices, such as the Cambridge Analytica scandal. The European Union, by contrast, has built a comprehensive privacy framework of its own. This framework was recently strengthened by the General Data Protection Regulation (GDPR), which creates even more stringent standards for how companies doing business in the European Union manage users’ data and gain their consent to do so.

Privacy regulations in the European Union and Canada are built on principles of openness, transparency and accountability with respect to the personal information companies collect. These principles require consent: individuals must be informed at the time of collection about what personal information is collected about them, how it will be used and the legitimate grounds that the company has for that collection. (Photo: pixinoo / Shutterstock.com)

The GDPR also introduces massive penalties for companies that fail to comply with its standards, and that has the potential to harm the second area of competition: investment opportunities. To attract investment, a country must make businesses feel confident that the investments they make there will not bankrupt them. The threat of losing up to four percent of annual global turnover in the event of a GDPR violation may create a chill among companies thinking of setting up data-related businesses in the European Union, or simply doing business there (as the GDPR has extraterritorial application). There are also serious concerns that well-established businesses — corporate giants with significant legal and human resources — will be in a much better position to navigate the new system, to the detriment of small start-ups that do not have the same resources. This is where the US system, with its lighter regulatory environment, may look like a better option for companies wanting to set up data-related businesses. China, on the other hand, is considered to have a strong environment for domestic investment, but less so for international investment.

The final aspect of building an attractive data environment is data integration, or the ability to draw from existing data sources, possibly in combination, to create innovative opportunities. China has no shortage of options for achieving this, but it comes at the cost of genuine openness and transparency. By contrast, privacy regulations in the European Union and Canada, for example, are built on principles of openness, transparency and accountability with respect to the personal information companies collect. These principles require consent — that is, individuals must be notified at the time of collection about what personal information is collected about them, how it will be used and the legitimate grounds that the company has for that collection. Under EU laws, using data that individuals have already disclosed, but for purposes other than those for which it was collected, is not permitted, unless the company can demonstrate that the new purpose is compatible with the original purpose.


Linking personally identifiable information to other sensitive information about users — for example, their health data, locations they have visited or even their consumer purchasing history — poses additional privacy risks. And the possibility that such sensitive information could fall into the wrong hands as a result of a data breach or cyberattack would surely be cause for concern.

Improving Canada’s standing in all three areas is achievable, but will require additional policies that create a more secure, predictable environment for data management. And at the heart of that is changing how we think about and measure trust.

Measures of Trust

Some people may think that the purpose of regulation is to prohibit certain dangerous behaviours in order to protect the public interest — for instance, nuclear operators are not allowed to dump nuclear waste into the water supply or use the technology at their disposal to build weapons. But another way of looking at regulation is as a mechanism to permit certain dangerous behaviours that have a public benefit under certain conditions. Nuclear power can be dangerous, but if it is managed correctly, it can also benefit society as an energy source. So rather than outlawing nuclear power, the government imposes certain standards that nuclear facilities must meet in order to operate.

Data activities are essentially the same. Yes, there are inherent risks to collecting and managing individuals’ personal information, including risks of privacy overreach or having the data exposed in a security breach. But there are also potential benefits to combining various data sources as discussed above in order to develop new and innovative uses of this data in the public interest. A change in approach would put more emphasis on the ways that individuals or entities could be verified as trustworthy to carry out these kinds of activities. There are several policy options we could enact individually or in combination to achieve this.

Standards and Certification
Activities that can benefit the public, but also potentially cause harm — such as operating a nuclear plant, practising medicine or even driving a motor vehicle — are generally accompanied by clear standards that set out what requirements a person or organization must meet to carry out those activities. You can tell by looking at someone’s driver’s licence whether they have been deemed capable of safely operating a tractor-trailer or if they are a new driver who must be accompanied by a fully licensed driver when they get behind the wheel. And, if they fail to meet those standards — or any other rules of the road — they risk losing their licence.

There are always risks to sharing your personal information, but a system of certification could signal that a person or organization has met the standards of care necessary to be a trustworthy custodian of your data. Canadian law, such as the Privacy Act and the Personal Information Protection and Electronic Documents Act (PIPEDA), already mandates certain privacy standards for private sector organizations. But introducing a tiered rating system, similar to driver’s licences, could be an efficient way of letting customers know what measures a company would use to manage and protect their data in accordance with PIPEDA, without having to read a complicated terms of service agreement. For example, to receive a top rating of A, a company would have to guarantee an audit or supervision capability for the data, meaning it would have a record of exactly which employees have viewed the data and for what purpose. For a lower rating of B or C, the data would be protected by password authorization, but not have an audit trail. If a company chose not to complete the certification process, its lack of a rating would itself send a strong signal to customers. By making clear exactly how a company would manage and protect personal data, a rating would make customers more likely to trust it with their data.
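As a rough illustration, the tiered criteria described above could be encoded as simple rules over a company's declared practices. The tier definitions below are a hypothetical sketch, not drawn from PIPEDA or any existing certification scheme:

```python
from dataclasses import dataclass


@dataclass
class DataPractices:
    """Security measures a company attests to during certification."""
    password_protected: bool = False  # access requires authentication
    audit_trail: bool = False         # records who viewed data, and why


def certification_tier(p: DataPractices) -> str:
    """Map declared practices to an illustrative A/B/C rating.

    A: authenticated access plus a full audit/supervision capability.
    B: authenticated access, but no audit trail.
    C: no baseline protections declared.
    """
    if p.password_protected and p.audit_trail:
        return "A"
    if p.password_protected:
        return "B"
    return "C"


# A company with both authentication and an audit trail earns the top tier.
print(certification_tier(DataPractices(password_protected=True, audit_trail=True)))  # A
```

A real scheme would involve far more criteria (encryption at rest, breach response, staff training), but the point is that the rating collapses all of it into a single signal a customer can read at a glance.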

A tiered rating system could be an efficient way of letting customers know what measures a company would take to manage and protect their data in accordance with PIPEDA, without having to read the complicated terms of a service agreement. (Photo: Hadrian / Shutterstock.com)

Introducing new certification systems that are compliant with the GDPR, PIPEDA or other privacy regulations could also pave the way for different organizations to share their data for innovative new programs or services, such as the opioid example from the introduction. Organizations that meet a certain standard of trust could be permitted to undertake data integration activities using multiple data sets, with consent mechanisms built into the process to make this work. And because of the challenges and risks associated with these kinds of initiatives, there would still need to be some kind of framework that regulates which data integration programs would be permitted.

If it works, the Canadian certification system could become a global benchmark for data management. Companies that secure a Canadian A rating could use it to market themselves worldwide as secure and responsible guardians of data.

Trust Standards for Individuals
Trusting a company does not always come naturally; in real life, it is often easier to put your trust in individual people than in entities. For example, within the Government of Canada, different employees have different levels of security clearance that permit them to access different kinds of information. Just because the Government of Canada has access to your data does not mean that everybody who works for the government has access to it. You might not trust the groundskeeper at a national park with your personal financial information, for instance, as much as you would trust an accountant with the Canada Revenue Agency.

A set of trust standards centred on individuals could work like the Nexus program for trusted travellers, allowing someone who has been screened and passed tests for reliability to have greater access to certain kinds of data. This could be another way of permitting the integration of data from multiple sources, if the person handling the data has been verified as someone who can be trusted not to misuse it. Individuals could also be approved to access multiple spheres of trust, so that if two or more organizations wanted to collaborate on a data project, rather than making the relevant data available to both groups, they could instead share it with a select number of employees who have been deemed trustworthy by both organizations involved.
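The "multiple spheres of trust" idea reduces to a simple rule: a joint project can only be staffed by people whom every participating organization has independently cleared, which is a set intersection. A minimal sketch (the organization and employee names are invented for illustration):

```python
# Each organization maintains its own roster of individuals it has vetted,
# analogous to Nexus-style trusted-traveller screening.
trusted_by = {
    "health_agency": {"alice", "bob", "carol"},
    "police_service": {"bob", "dave"},
}


def cleared_for_project(orgs: list[str]) -> set[str]:
    """People deemed trustworthy by every organization in the project."""
    rosters = [trusted_by[org] for org in orgs]
    return set.intersection(*rosters)


# Only individuals cleared by BOTH organizations may handle the joint data set.
print(cleared_for_project(["health_agency", "police_service"]))  # {'bob'}
```

The data itself never has to be made available to either organization wholesale; it is exposed only to the (typically small) intersection of mutually trusted individuals.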

Other Innovations
Companies and organizations are getting better about asking users for consent before they collect or store their personal information, but a side effect of this is that many people click on consent forms without giving them any thought. Does anyone really remember which information they have agreed to share with which organizations? If there were a data breach affecting a digital service you used, would you know what information about yourself was vulnerable?

While some people are diligent and thorough when reading consent notices, research suggests the vast majority click through without reading much, if any, of the text. In other cases, users may have given their consent long ago to share something with a company that seemed insignificant at the time, but their concerns have changed over the years, or the company later shared it with another party without their knowledge. The GDPR is strict about requiring companies to include a privacy notice specifying the third parties with whom they share personal information.

Data trusts, like fiduciary trusts, would maintain data sets and manage the conditions under which the data could be used and shared.

A national data consent registry that included these third parties could be a one-stop reference for people to keep track of all the permissions they have agreed to. When users click on the “I agree” button for a company or organization, the company would then have to record what information the users have agreed to share on a searchable registry. This could easily be implemented on a blockchain to ensure that service users have a record of all the times they have consented to share their data. By tracking which types of information different organizations collect, a consent registry would also allow anyone who wished to build data integration programs to understand which organizations they would need to approach to obtain which data sets.
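At its core, such a registry is an append-only, tamper-evident log of consent events that users can query. The sketch below shows one way that blockchain-style property could work, with each entry hashing the previous one; the field names and structure are assumptions, and a real registry would also need identity verification, revocation and third-party disclosure records:

```python
import hashlib
import json
import time


class ConsentRegistry:
    """Append-only log of consent events. Each entry includes a hash of
    the previous entry, so any tampering with history is detectable."""

    def __init__(self):
        self.entries = []

    def record(self, user: str, org: str, data_types: list[str]) -> dict:
        """Log that `user` agreed to share `data_types` with `org`."""
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"user": user, "org": org, "data_types": data_types,
                 "ts": time.time(), "prev": prev}
        # Hash the entry's contents (including the previous hash) to chain it.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)
        return entry

    def permissions_for(self, user: str) -> list[dict]:
        """One-stop lookup: everything this user has agreed to share."""
        return [e for e in self.entries if e["user"] == user]


reg = ConsentRegistry()
reg.record("alice", "MapleHealth", ["email", "fitness_data"])
reg.record("alice", "NorthBank", ["email"])
print(len(reg.permissions_for("alice")))  # 2
```

The same log, queried by organization and data type rather than by user, is what would let a would-be data integration program discover which organizations hold which data sets.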

Another innovative policy option could be to set up data trusts for certain kinds of information that could be put to a socially beneficial use. Data trusts, like fiduciary trusts, would maintain data sets and manage the conditions under which the data could be used and shared. The idea is to make it easier for new start-ups, or anyone who has an idea for using data in the public interest, to access the data they need to make their ideas work, as long as they meet the necessary standards for safeguarding the data.


 

Conclusion

Canada has the opportunity not just to compete with other global markets on data innovation, but to strengthen its leadership role in setting standards for data privacy and security. The benefits could be huge in terms of drawing investment to Canada, building a strong ecosystem for homegrown data companies and developing innovative new ways to use data for the public good. But this can only happen with trust: trust from Canadians that organizations will not misuse or expose their personal information, and trust from companies that want to use data in innovative ways that the investments they make in privacy and security will pay off. There are several ways Canada can foster this environment of trust, but fresh thinking and enhanced outreach will be required to get buy-in from the public and the corporate world.
The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
Paul Vallée

Paul Vallée co-founded Pythian in 1997 and became CEO of the company in 2005. His passion and foresight for using data and technology to drive business success have helped Pythian become a high-growth global company, with more than 400 employees and offices in North America, Europe and Asia. Paul is a strong proponent of technical excellence as well as diversity in the workplace. Prior to founding Pythian, Paul worked as a data scientist. He holds a bachelor of commerce degree in management information systems from the University of Ottawa, and in 2011 he was named one of the Ottawa Business Journal's "Top 40 under 40" in recognition of Pythian's growth to that time.
