Fake News Threatens Canada's Federal Election

March 13, 2019
The October election will be the first one following Facebook’s move last year to ensure all advertisements from political entities be labelled with “paid for by” disclosures from the advertiser. (Shutterstock)

Last month, the Canadian Broadcasting Corporation issued an analysis of 9.6 million tweets from Twitter accounts that have since been deleted. The tweets discussed oil pipelines, immigration and refugees, and were suspected to have originated in Russia, Iran and Venezuela.  

Michael Pal, associate professor at the University of Ottawa, contends that the tweets targeted hot-button issues in an apparent effort to influence Canadian opinion, and appeared to preview what Canadians can expect in their next federal election, set for October 21.

“This is happening now,” Pal said. “It appeared to be a test run for what they are trying to do for the election.”

According to academics and other observers, foreign influence campaigns, potentially led by Russia, China, Venezuela and Saudi Arabia, could influence the election’s outcome, especially given that Canadian Prime Minister Justin Trudeau could face a tight race.

Foreign influence campaigns could take to social media to foment discord on hot-button Canadian policy issues. These topics could include immigration, refugees, pipelines and other controversial subjects, any of which could be used to wreak havoc on the election process.  

Foreign influence campaigns aren’t new, but so far they haven’t impacted Canadian elections in a big way. Even so, there are a number of reasons why Canada is likely to be a major target in the lead-up to October. Consider that it will be the first federal election since the Trudeau administration approved the Justice for Victims of Corrupt Foreign Officials Act (often referred to as Canada’s Magnitsky Act) in 2017, which imposes sanctions on Russian and other foreign officials for alleged human rights abuses. In August 2018, tensions between Canada and Saudi Arabia escalated after Canadian Foreign Affairs Minister Chrystia Freeland criticized the country over its human rights record. Then, in December, Canada arrested Meng Wanzhou, chief financial officer of Chinese telecommunications giant Huawei, at the request of the US government, a move that set off a political firestorm.

“There are a number of reasons why the Canadian federal elections are likely to be a target,” said Pal. “Canada’s Magnitsky Act is one for sure, but also Trudeau has a global brand, which brings good and bad attention.”

Pal, who researches the law of democracy, comparative constitutional law and election law, notes that elections historically have been targeted by foreign powers, although before the rise of the Internet and social media platforms, their impact was muted.

“The thing that has changed is the internet,” Pal said. “At the same time voters are more polarized and vulnerable than they used to be.”

Pal contends that political parties, journalists and individuals are all potential targets. Political parties may not have the financial resources to provide adequate cyber protection against hacking or other interference, at least not at the level of big banks or the federal government.

“[Cyber protection] is a big expense for political parties,” he said.

Election interference could emerge on Twitter, Facebook or other platforms, and could take many different forms, including so-called “deep fake” videos, which manipulate footage to make it look like an individual said or did something they did not.

“[Deep fake videos] weren’t big in the last election, but they could be this time around. You could eventually disprove the video, but for a while it could influence voters, especially if done in a sophisticated manner to target individuals who don’t spend a lot of time paying attention to every detail of the news,” Pal said.

Americans are very familiar with foreign influence and misinformation campaigns on social media, following their 2016 presidential election. Pal suggests that the situation could get worse. He pointed to one high-profile example that emerged in advance of that election: the fake news phenomenon that became known as “Pizzagate,” a conspiracy theory that trended on social media sites, claiming that a Washington-area pizzeria was hiding a child trafficking ring led by Democratic candidate Hillary Clinton and her campaign chairman John Podesta. Pal argues that the hoax could have been bolstered had there been a sophisticated deep fake video of Clinton entering the pizza parlour.

In addition, so-called bot or cyborg accounts on Twitter could play a role. As Pal describes it, bot accounts might amplify a message and make it look more important than others by retweeting the content multiple times. The goal? To artificially push a subject or an idea past the noise of social media to gain the attention of politicians and the news media.

“If there are nine million tweets about an issue, journalists may want to cover it because it appears to be a big issue,” he said.

Pal contends that human users known as “cyborgs,” who set up the bots, will also engage in online conversation to amplify the automated message and expand a particular disinformation campaign.

The October election will be the first one since Facebook moved last year to ensure all advertisements from political entities are labelled with “paid for by” disclosures from the advertiser. However, Pal questions whether Facebook will devote the resources, and really has the capacity, to make sure all advertisements are clearly labelled and included in its ad repositories. And, although foreign nationals (non-Canadians) aren’t permitted to produce advertisements targeting Canadian elections, social media sites haven’t faced the same level of regulatory oversight.

“How effective will Facebook be in ensuring that a non-Canadian entity isn’t making the advertisement?” Pal asked. “Social media platforms can’t take money placed by foreign advertisers for the Canadian election. Facebook needs to monitor that situation.”

It is clear that social media sites like Facebook and Twitter and the content they provide are having a major impact on the democratic process. As such, Colin Koopman, associate professor of philosophy and director of new media and culture at the University of Oregon, argues that ethical sensibilities with a focus on democracy need to be built into how the next generation develops and uses new technologies. Specifically, education systems should make major changes to how computer science, entrepreneurship and social media courses are taught, by incorporating the examination of ethics and democracy into existing and new programs.

“Facebook came out of a ‘build it first and ask questions later’ approach,” Koopman said. “Education leaders are starting to understand now that you must build questions of democracy into the technology design process. It’s not just about getting the code to run and then taking it upstairs to the business team. We need to ensure that the next generation of basement tech wizards consider ethical issues when they design the next Facebooks.”

Jim Balsillie, the founder and the chair of the board at CIGI, suggested in November that the International Monetary Fund (IMF) and its director, Christine Lagarde, should “catalyze” a new Bretton Woods gathering, with the goal of writing rules to address unprecedented digital challenges facing the world.

Koopman said it makes a lot of sense to have an organization like the IMF lead a global discussion about digital challenges. Ethics and technology education should play a fundamental part in that review, he added.

“The discussion needs to happen and an institution of the IMF’s stature getting involved can only be a plus,” he said. “Developing projects in light of IMF expertise would be a great way to further the research and inquiry we need for these issues.”

But, for now, an IMF Bretton Woods gathering – and any efforts to infuse ethics into technology development – would be just the beginning of a longer debate about how to address the challenges that big data, machine learning and artificial intelligence pose to the democratic process.

In the near term, Canadians face a more pressing and immediate challenge as they prepare to cast their votes in an election just seven months away, amid expectations of a significant rise in deep fake videos, cyborgs, bots and other inventive disinformation campaigns.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Ronald Orol is a senior editor at The Deal and writes about hedge funds and bank and securities regulation for The Street. He is the author of the book Extreme Value Hedging: How Activist Hedge Fund Managers Are Taking on the World, and holds a master's degree in business and economics journalism from Boston University.