
As part of its anti-racism legislation, the Ontario government committed to implementing “race data collection and anti-racism impact assessment tools, to help identify, remedy or prevent inequitable racial impacts of policies and programs.” Its objective is a noble one: “to help identify and monitor systemic racism and racial disparities within the public sector.”

And on the face of it, the Anti-Racism Directorate’s recently released policies on data collection appear to encompass all the principles one would hope to see. The policy seems to offer careful consideration of how state agencies should collect race-based data about specific community interactions and experiences with public institutions.

While the data won’t be collected for some time yet and the obligations will be phased in over five years, a fundamental question remains — who owns the disaggregated data that is collected about various communities?

Members of the Black community continue to call for the destruction of data collected through the controversial practice known as carding, in which police randomly stop people — disproportionately individuals in the Black community — and ask for identification. Some politicians have already agreed the data should be destroyed.

But on the flipside, that very same data is crucial to the Ontario Human Rights Commission’s effort to shine a light on racial profiling. Its Chief Commissioner, Renu Mandhane, recently spoke about the need for data to investigate the disparate treatment that certain communities experience. Pointing to a three-year study of Ottawa police traffic stops, for instance, Mandhane explained that the disproportionate rates at which Middle Eastern men and women, and Black men, were stopped reinforced the qualitative experiences shared by members of those communities.

“The data spoke to the experiences that people were reporting,” she shared during a panel discussion at RightsCon, a global conference on human rights in the digital world, held earlier this month in Toronto. “The data can help demonstrate the lived experiences of communities.”

And yet, there is a general lack of understanding among government institutions of the critical nature of data and of data ownership, argued Bianca Wylie, a senior fellow at the Centre for International Governance Innovation and head of the Open Data Institute of Toronto, during the same panel with Mandhane.

“Citizens need to be more actively engaged and advocating for public ownership of data,” Wylie said.

She expressed concern that the same data collected to address systemic inequalities could be turned against the very communities experiencing them.

“We need to talk about data justice, not just the rights of ownership, but the right to not have data collected at all,” explained Wylie, pointing out that both individuals and communities should have more say.  

Indigenous communities — historically harmed by the data gathering done by state institutions — aren’t waiting for governments to create the necessary frameworks, shared Justin Wiebe, a capacity-building specialist with the Trillium Foundation and the panel’s moderator. In fact, there is already movement toward Indigenous data sovereignty, as outlined in a 2017 paper released by Open North in collaboration with the British Columbia First Nations Data Governance Initiative, said Wiebe. As part of a project called Decolonizing Data, First Nations communities provided input into the creation of 10 key principles around data sovereignty that could serve as a model for other communities.

“We’re not going to advise you on what you should be doing with your data. We’re going to tell you what we’re going to do with our data,” reads a quote in the report from Gwen Phillips, a citizen of the Ktunaxa Nation.

The tensions around data usage are made clear in the preface of the 2016 book, Indigenous Data Sovereignty: Toward an Agenda. “The emergence of the global data revolution and associated new technologies can be a double-edged sword for indigenous peoples,” it reads. If communities “lose control” over their data, “discrimination will persist.”

Questions over ownership will continue to emerge in smart cities, added Wylie, pointing out that citizens in these new urban spaces still lack clarity on how either personal or disaggregated data may be used. “Is this the new horizon of racial profiling?”

It’s a valid question in an age where data is featuring more and more prominently in how our institutions function. Consider that several Canadian cities, including Ottawa, Vancouver, and Edmonton, are already involved in predictive policing, which allows police to redistribute services based on data they gather about various communities.

A recent article in The Walrus by John Lorinc explores how the data collected through carding, for example, may be used by police services to further profile communities. Lorinc quotes Andrew Ferguson, author of The Rise of Big Data Policing, who observed: “Initial predictive-policing projects have raised the question of whether this data-driven focus serves merely to enable, or even justify, a high-tech version of racial profiling.”
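Ferguson’s concern can be sketched with a toy simulation — all numbers and neighbourhood labels here are hypothetical, not drawn from any real police dataset. If patrols are allocated wherever past data shows the most recorded stops, a historical skew in the data perpetuates itself, even when underlying behaviour is identical everywhere:

```python
# Toy model of a predictive-policing feedback loop (hypothetical numbers).
# Patrols are allocated in proportion to past recorded stops, so a historical
# skew in the data reproduces itself even though actual behaviour is
# identical in both neighbourhoods.

# Neighbourhood A starts with three times the recorded stops of B, purely as
# an artifact of where police were previously deployed.
recorded_stops = {"A": 300.0, "B": 100.0}

STOPS_PER_PATROL = 0.2  # identical everywhere: behaviour does not differ
TOTAL_PATROLS = 1000

for year in range(5):
    total = sum(recorded_stops.values())
    # "Predictive" allocation: patrol wherever the data says stops happen.
    patrols = {n: TOTAL_PATROLS * recorded_stops[n] / total
               for n in recorded_stops}
    # More patrols produce more recorded stops, regardless of behaviour.
    for n in recorded_stops:
        recorded_stops[n] += patrols[n] * STOPS_PER_PATROL

share_a = recorded_stops["A"] / sum(recorded_stops.values())
print(f"Neighbourhood A's share of recorded stops after 5 years: {share_a:.0%}")
```

In this sketch the initial 3:1 skew never corrects itself: each round of “data-driven” deployment regenerates exactly the disparity it was trained on, which is the feedback loop critics of predictive policing describe.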

As an increasing number of data points are collected and used for machine learning — a form of artificial intelligence (AI) that relies on datasets to make inferences — the potential for human rights violations grows in tandem.

This was the impetus behind the drafting of “The Toronto Declaration: Protecting the Rights to Equality and Non-discrimination in Machine Learning Systems,” released at RightsCon by Access Now and Amnesty International.

“We acknowledge the potential for these technologies to be used for good and to promote human rights but also the potential to intentionally or inadvertently discriminate against individuals or groups of people,” reads the preamble. “We must keep our focus on how these technologies will affect individual human beings and human rights. In a world of machine learning systems, who will bear accountability for harming human rights?”

Canada’s federal government is also tackling these issues. The Digital Inclusion Lab, which operates out of Global Affairs Canada, shared a working paper at RightsCon titled “Artificial Intelligence and Human Rights: Towards a Canadian Foreign Policy.” The document, which has yet to be formally released, tackles the issue of equality and bias within machine learning, among other relevant issues. The paper recommends funding for “de-biasing research,” the establishment of an independent observatory to “evaluate AI’s effects on human rights and on gender equality,” and the promotion of digital literacy.

These discussions, however, can’t only be held at the federal level.

Indigenous communities in Canada seem to be far ahead of other groups in staking a claim to their own data. It’s time to follow their lead and begin the important work of ensuring that every community participates in these crucial conversations, in order to safeguard both individual and collective human rights. But data governance is a complex issue that remains out of reach for many; public and private institutions need to create opportunities for these necessary discussions.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Amira Elghawaby is a journalist and human rights advocate. She convened the panel “Whose Story Is It Anyway? Decoding the Collection & Sharing of Information” at RightsCon.