Businessman on blurred background using law protection right 3D rendering. (Shutterstock)

Late last year, Amsterdam, Barcelona and New York launched the Cities Coalition for Digital Rights, a “joint initiative to promote and track progress in protecting residents’ and visitors’ digital rights in cities.”

What are digital rights? This coalition considers them a range of protections regarding access to the Internet, privacy, transparency regarding how data is used, control over how data is used, democratic participation in municipal technology decisions and more.

Viewed through this lens, Canada’s largest city wasn’t ready for the political mess that came with Google’s Sidewalk Toronto. People connected to the project, led by Waterfront Toronto, continue to recklessly opine that defining digital rights policy with house-on-fire urgency, created by a vendor relationship no less, is a good idea. It isn’t. The issues related to human rights in digital spaces are complex.

Digital rights protections are one way that residents, globally, are getting engaged in the design of the spaces they live in. Toronto needn't go beyond its national borders for advice; Montreal is well down this track already. Basic protections and democratically informed policy are critical to put city governments and residents in the driver's seat with tech companies. Through these policies, cities (and their long-term maintenance plans) are shaped by city and resident needs and wants, not corporate whims.

The Cities Coalition for Digital Rights “marks the first time that cities have come together to protect digital rights on a global level.” But the human rights issues related to how people’s data can be used, and is being used, are far from new. People, racialized people in particular, have been profiled and discriminated against through the use of data for centuries.

Big and small tech companies alike continue to impact public and private spaces by introducing a never-ending flow of surveillance-based products and services. Smart everything, from cars to cities to homes and toys, creates a persistent stream of data about our behaviours, some of which can be used for policy creation and some for targeted marketing and advertising.

Our data is being used and bought by a range of actors, from law enforcement to government, from banks to insurers to potential employers. It's not all good or bad, but it raises a range of thorny issues, including large ethical and moral questions about keeping options available for an untracked life.

There is a heightened need to be intentional about maintaining elements of human existence that are unknown, unsellable and unable to be used commercially or prejudicially by anyone. Given this context, data related to humans and human behaviour, such as health data, data about usage of urban spaces and facilities, education data, and the rights that should be protected around its usage need policy attention. Not everything about us is for sale or commodification, nor should it be.

Taking control of data that captures information about our behaviour is not only about privacy, though privacy is no small thing. Data is also a critical input to public policy and public service provision. The rules we need to update around the collection and use of data are about power and freedom, both as residents and as nations.

While “data rights” are a helpful way to think about adapting and expanding existing rights frameworks regarding how data can be used, it’s also time to bring consumer protection to the fore in these conversations. How and where should the idea of consumer protection sit in the multi-variable policy issue that is the governance of technology and our data?

Over the last three decades, government and many tech corporations have not done the work necessary to attain social licence for their actions. They skipped a fundamental part of consumer protection: that people require education and information to make informed choices.

This is not a suggestion to make all new or potentially problematic things illegal or to destroy creative innovation. It’s a call to return to the necessary interrogation of consequences and impacts on our lives as consumers of these data products, as residents and as cities. Regulation does not kill innovation. It enables it by constructing guardrails for all of us to operate in safely, entrepreneurs included.

Control of our data is a public health and safety issue — it impacts our health and well-being. We are stumbling into the completion of a surveillance state in the name of innovation and efficiency. The window to revert and reassert our rights to freedom, control and choice regarding the use of our data is small and shrinking, but for now, it’s still there.

This article originally appeared in the Financial Post.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
Bianca Wylie is a CIGI senior fellow. Her main areas of interest are procurement and public sector technology. Beyond her role at CIGI, Bianca leads work on public sector technology policy for Canada at Dgen Network and is the co-founder of Tech Reset Canada. Her work at CIGI focuses on examining Canadian data and technology policy decisions and their alignment with democratically informed policy and consumer protection.
