Canada Is Gathering Public Input on Copyright Implications of AI, Internet of Things

August 13, 2021

In the Byzantine world of copyright law, Barry Sookman is arguably Canada’s top expert in tracking developments and forecasting changes needed to keep its legislation abreast of evolving technology.

He welcomed Parliament’s modernization of its Copyright Act in 2012 to finally bring it into the digital age, with amendments to, in part, “update the rights and protections of copyright owners to better address the challenges and opportunities of the Internet, so as to be in line with international standards; clarify Internet service providers’ liability and make the enabling of online copyright infringement itself an infringement of copyright; permit businesses, educators and libraries to make greater use of copyright material in digital form; [and] ensure that it remains technologically neutral.”

Almost a decade later, the federal government is in the midst of another update of Canada’s copyright law so as to reflect the proliferation of works created by artificial intelligence (AI), such as computer-generated musical compositions or news stories, and of software-enabled devices, such as cars and home appliances, connected to the Internet of Things (IoT).

Last month, the government launched a public consultation to gather comments on such issues as text and data mining; authorship and ownership of AI-created works; infringement and liability regarding AI; and repair and interoperability issues surrounding technological protection measures (TPMs — also called digital locks or digital rights management).

“One of the issues around artificial intelligence is that it takes a tremendous amount of data to train algorithms to copy and process works of all kinds. This has raised questions about whether the permission of rights-holders is needed,” Sookman, a senior counsel with McCarthy Tétrault LLP in Toronto who blogs regularly about copyright issues, said in an interview.
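
To illustrate the kind of copying Sookman is describing, the short sketch below is a hypothetical, simplified example of text and data mining; the folder layout, class name and word-frequency analysis are illustrative assumptions, not anything drawn from the consultation. Even this trivial analysis begins by loading whole works into memory, which is the step that raises the permission question.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Hypothetical sketch: mine a folder of text documents for word frequencies.
// Even this minimal analysis copies each full work into memory before it can
// extract any statistical patterns from it.
public class CorpusMiner {
    public static void main(String[] args) throws IOException {
        Path corpusDir = Paths.get(args.length > 0 ? args[0] : "corpus");
        Map<String, Long> wordCounts = new HashMap<>();

        try (Stream<Path> files = Files.walk(corpusDir)) {
            for (Path file : files.filter(Files::isRegularFile).collect(Collectors.toList())) {
                String text = Files.readString(file);            // copy the work into memory
                for (String token : text.toLowerCase().split("\\W+")) {
                    if (!token.isEmpty()) {
                        wordCounts.merge(token, 1L, Long::sum);  // accumulate language statistics
                    }
                }
            }
        }

        // Print the 20 most common words across the corpus.
        wordCounts.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .limit(20)
                .forEach(e -> System.out.println(e.getKey() + "\t" + e.getValue()));
    }
}
```

Production machine-learning systems perform the same basic operation at vastly greater scale, which is why the permission of rights-holders becomes the central question.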

He explained that Google, for instance — whose Google Translate service provides real-time translation through a neural machine translation engine that learns the patterns of language from millions of documents online — has sought greater certainty about its rights to access copyrighted material. However, Sookman highlighted that Google and other companies using AI might already be covered by the fair-dealing provision in the Copyright Act, which allows use of copyrighted materials for purposes such as research or private study.

The Supreme Court of Canada also addressed the fair-dealing exemption in its ruling in CCH Canadian Ltd. v. Law Society of Upper Canada, a case in which Sookman served as co-counsel to two interveners (the Canadian Publishers’ Council and the Association of Canadian Publishers).

However, the questions surrounding who owns AI-generated works are complicated.

In its 2019 Statutory Review of the Copyright Act, the House of Commons Standing Committee on Industry, Science and Technology (INDU) recommended adapting the legislation “to distinguish works made by humans with the help of AI-software from works created by AI without human intervention.”

But making these distinctions can be difficult, as Sookman pointed out: what if an AI program “without human intervention” gathered, say, Bruce Springsteen’s body of work to create a new recording? Could Springsteen, or his record label or music publisher, claim that the AI-produced work constituted copyright infringement?

As the federal government outlined in its consultation paper released on July 16, witnesses participating in the parliamentary review of the copyright law proposed different approaches the government could take to AI authorship. Some suggested “that works created autonomously by AI, without exercise of skill and judgment on the part of a human, should not qualify for copyright protection and should fall in the public domain”; others described “a need for the law to better recognize the variety of ways that AI is used to create works or other subject matter” so that it can be judged whether the human employing the technology exercised “sufficient skill and judgment to produce an original work using AI” and should therefore be considered its author.

In a blog post he wrote on AI and copyright four years ago, Michael Geist, who holds the Canada Research Chair in Internet and E-commerce Law at the University of Ottawa’s Faculty of Law, noted that restrictive copyright rules “may limit the data sets that can be used for machine learning purposes, resulting in fewer pictures to scan, videos to watch or text to analyze. Given the absence of a clear rule to permit machine learning in Canadian copyright law (often called a text and data mining exception), our legal framework trails behind other countries that have reduced risks associated with using data sets in AI activities.”

In the world of IoT, a significant copyright issue involves TPMs, which are used to control access to content in a digital format.

The 2012 amendments to Canada’s copyright law — the Copyright Modernization Act — included legal protection for TPMs, which, for example, allow the creative industry to put these digital locks on DVDs or video games to prevent piracy.

But in their 2019 statutory review of the Copyright Act, INDU committee members heard from stakeholders regarding IoT that TPMs are, as the government’s recent consultation paper summarized, “too restrictive and prohibit legitimate non-infringing activities,” such as the “right to repair,” as advanced by the Consumer Technology Association (CTA).

In a 2018 appearance before the committee, Michael Petricone, senior vice-president of government affairs with the CTA, said that “lawful user exceptions are particularly necessary in the diagnosis, maintenance and repair of modern cars, farm equipment and other devices, because embedded software has replaced analog circuitry in mechanical parts.”

Sookman explained that section 41.21(1) of the Copyright Act, added through the 2012 amendments, allows the government to make regulations overriding the prohibition on circumventing a TPM if the prohibition, as the law states, “would unduly restrict competition in the aftermarket sector in which the technological protection measure is used.”

He explained that the divide over TPMs is between those who believe there should be broader exceptions to foster innovation by allowing interfacing between computers or software programs, and companies that spend considerable time and money developing those interfaces and that wish to maintain licensing control over them.

In April, the United States Supreme Court released a decision that addressed whether copyright protects application programming interfaces, or APIs, “which could have implications for the legal protection of these interfaces used in the Internet of Things — if followed in Canada,” explained Sookman.

In Google LLC v. Oracle America Inc., the US high court held that Google’s copying of Oracle’s proprietary Java SE API, “which included only those lines of code that were needed to allow programmers to put their accrued talents to work in a new and transformative program, was a fair use of that material as a matter of law.”

“The inquiry into the ‘purpose and character’ of the use turns in large measure on whether the copying at issue was ‘transformative,’ i.e., whether it ‘adds something new, with a further purpose or different character,’” wrote Justice Stephen Breyer in delivering the court’s opinion, in which he also referred to the 1994 US Supreme Court copyright ruling Campbell v. Acuff-Rose Music, Inc.

Google’s limited copying of the API is a transformative use [and] copied only what was needed to allow programmers to work in a different computing environment without discarding a portion of a familiar programming language. Google’s purpose was to create a different task-related system for a different computing environment (smartphones) and to create a platform — the Android platform — that would help achieve and popularize that objective. The record demonstrates numerous ways in which reimplementing an interface can further the development of computer programs. Google’s purpose was therefore consistent with that creative progress that is the basic constitutional objective of copyright itself.
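
To make concrete what the court meant by “reimplementing an interface,” the fragment below is a schematic illustration only: it does not reproduce any of Oracle’s Java SE code, and the class and method names are invented for the example. The declarations are the part programmers already know how to call; the bodies underneath can be written fresh for a new platform.

```java
// Illustrative only; the names are invented and none of this is Oracle's code.
public final class MathUtils {

    // "Declaring code": the method name, parameter types and organization
    // that programmers have already learned to call.
    public static int max(int a, int b) {
        // "Implementing code": the body, which a reimplementation writes
        // from scratch for the new computing environment.
        return (a >= b) ? a : b;
    }

    public static int abs(int value) {
        return (value < 0) ? -value : value;
    }
}
```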

Canada’s top court has yet to weigh in on the legal protection for interfaces.

Beyond their function of protecting copyright, TPMs also provide a security role in the world of IoT, where vehicles and household appliances are connected over the internet or other digital networks, as Sookman explained.

“Think of a control system in a modern automobile, like a Tesla. There are good reasons to have measures to prevent hacking, not only of cars, but also of home-security systems, fridges, washing machines — anything connected to the Internet of Things,” he said.

“If there is an exception to be able to hack a TPM and it’s not replaced with something that provides protection for security devices, we’re creating an issue that could put the health and safety of Canadians at risk and one in which Public Safety Canada should be involved.

“It used to be that you could solve an issue with one department. But now we’re getting into areas that span multiple interests,” explained Sookman, also the former co-chair of the technology law group and past head of the internet and electronic-commerce group at McCarthy Tétrault.

“In the Internet of Things, automated cars are part of a network that can talk to one another and with devices that control them. That involves Transport Canada and, with TPMs, Public Safety Canada. Copyright comes in because a lot of that [communication] is done via processing [through] computer programs that are part of [proprietary] on-board navigation systems.”

However, he said that amending the Copyright Act to specifically address IoT-related issues might not be necessary, since they could be dealt with under the fair-dealing exception and guided by jurisprudence from the Supreme Court of Canada, which has “espoused” user rights — most recently in its unanimous ruling in York University v. Canadian Copyright Licensing Agency (Access Copyright). In that decision, the court held that the copyright collective cannot enforce its tariff setting royalty payments for published works used on campus against York after the university opted out of a licence with Access Copyright.

“Governments around the world have not uniformly moved on exceptions to circumvent TPMs on IoT, so there is no common standard on how this is dealt with,” explained Sookman, a former member of the CIGI advisory board. “It’s a little bit risky for a small country like Canada to impose new standards, because we could end up going too far in imposing restrictions that hamper innovation in IoT. So I think it’s a wait-and-see until there is more of a consensus among our trading partners about how things should work.”

Retired Toronto technology lawyer and active innovation law-and-policy blogger Richard Owens said in an interview that he believes Canada’s copyright law has “the conceptual and most of the legal tools necessary to resolve any issues that come up.

“I think the IoT issues are trivial and non-existent, and the AI issues are marginally more interesting. But I’m not sure they require legislative attention — and if they do, it won’t take much.”

Owens said that IoT essentially involves accessing data from devices connected on a network — “and there is no copyright in data.”

He explained that any copyright issues involved are not directly related to IoT, and are associated with authorship of the software programs used and, perhaps, the arrangement of the data — both of which are already addressed under the Copyright Act.

However, authorship regarding AI is a “big issue,” according to Owens, who taught courses on technology and intellectual property at the University of Toronto’s law faculty and served as executive director of its Centre for Innovation Law and Policy.

“We know copyright can subsist in the computer software that programs the artificial intelligence, but what happens for things that are created by an artificial intelligence engine — such as music and even poetry accepted for publication?

“So we might want to amend the Copyright Act so that a work created by artificial intelligence is owned by the maker, who is the person who employed the AI engine and configured it to create the work — which is a common principle under the law,” said Owens, who is himself a poet and a photographer.

But, as he added, that does not necessarily mean recognizing AI-related software as the “author” of a work — a designation that Owens believes reflects the “dignity of creation by the human spirit.”

“In fact, AI works that are sold should be clearly disclosed as machine-created and not in competition with works created by human beings,” he said.

For Owens, though, the more pressing matter is the one addressed by the Supreme Court of Canada in the York University-Access Copyright case, which he says involves “stealing works by educational institutions.”

“There is an exception in the Copyright Act that was added among the 2012 amendments for educational use that has resulted in hundreds of millions of dollars of stolen work in Canada by the educational sector. And it is a crying need that has to be addressed because writers across the country are going bankrupt and publishers are closing shop because they can’t get fairly remunerated for the work,” said Owens.

He has written about the issue and said that he has spent years “begging” the federal government to address this imbalance.

“The government has basically said to Access Copyright members that ‘we don’t care about you; we don’t respect creation in Canada; and we’re going to let the educational sector continue to run roughshod over your rights,’” said Owens, a Munk Senior Fellow at the Ottawa-based Macdonald-Laurier Institute.

In a July 30 news release from Access Copyright, issued on the day the country’s top court released its ruling, the collective says that its case against York University “was about remedying the significant and sustained economic harm to creators and publishers caused by the mass, systemic and systematic copying of their works without compensation by the education sector under self-defined fair dealing guidelines….

“After almost 10 years of litigation and economic harm to the writing, visual arts and publishing sector, creators are still left fighting for fair compensation for the use of their works by educational institutions.”

But, for the federal government, the focus is on the future of copyright through the lenses of AI and IoT, with a relatively short window for gathering ideas.

Innovation, Science and Economic Development Canada (ISED) has set September 17 as the deadline for the submission of comments on the AI-IoT consultation, a date expected to fall within the federal election campaign.

Regardless of the outcome of that national vote, Sookman expects that bureaucrats at both ISED and Canadian Heritage — the other federal department involved in copyright issues, on the creative side — will use the public feedback regarding AI and IoT to determine whether the Copyright Act needs to be amended.

As he explained: “The government has tended to enact laws that are technologically neutral to avoid having to amend them all the time.”

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Ottawa-based journalist Christopher Guly is a member of the Canadian Parliamentary Press Gallery and has reported on copyright issues for Cartt.ca.