Canadian Elections Can’t Side-step Social Media Influence

November 20, 2018
Facebook (among other media giants) is under scrutiny for keeping the public in the dark about the algorithms and systems in place to influence advertising, content and even elections. (Shutterstock).

The next Canadian federal election is set to be historic and significant, albeit not for the best of reasons. The technologies and the tone of election campaigning are changing rapidly, and a growing chorus of researchers is warning that democracy itself is at stake.

Prime Minister Justin Trudeau recently said he thinks “we are now looking at perhaps what will be the most divisive and negative and nasty political campaign in Canada’s history.”

The likely divisiveness won’t simply be a product of animosity between political parties, however.

In an interview with the Canadian Press, Defence Minister Harjit Sajjan expressed the firm belief that the next federal election would be targeted by Russian intelligence agencies, and that Canadians would be subject to cyber attacks and fake news. Sajjan indicated that the Ministry of Defence was regarding the protection of Canada’s election(s) as a serious and pressing issue.

As it should — the changing nature of the political arena (online and off) and the active participation of third parties and foreign actors will play significant roles in Canada’s next election.

The Shifting Shape of Fake News

Robert Lewis, retired journalist and author of Power, Prime Ministers and the Press: The Battle for Truth on Parliament Hill, argues that the use of fake news in Canada is as old as Canada itself. What's different today, for Lewis, is the size, scope and potential impact of fake news.

At the heart of this shift is the increasing refinement of targeted advertising and the algorithms that enable both personalized advertising and the dispersion of fake news with remarkable accuracy and customization.

In an email interview, Sara Bannerman, the Canada Research Chair in Communication Policy and Governance at McMaster University, described the consequences of these innovations in advertising.

“On one hand, targeted messaging is similar to the practice of advertising to particular segments of the population in community publications. On the other hand, targeted messaging is completely different because it takes place in ‘the dark’; targeted ads are called ‘dark ads’ because they’re visible only to specific selected people and not to a broader public,” she said.

Experts such as Bannerman can provide a sense of what is possible, but the reality of what is happening is less clear. The increased role of third parties and foreign actors calls for improved oversight.

Elizabeth Dubois, an assistant professor at the University of Ottawa, highlights the importance of proper oversight. “First, it is not just political parties that can engage in targeted advertising, which can make it harder to track. We need to be able to track it so that we can be sure things like spending limits are respected and enforced. Second, targeted advertising can be based on data which is unfairly collected and could be a breach of privacy,” she said in an email.

This notion — that targeted advertising could be based on data harvested through a privacy breach — is not theoretical, as the Facebook-Cambridge Analytica data scandal has taught us. Governments are still in the process of investigating the facts of that event, and committee chairs from the UK, Canadian, Irish, Australian and Argentinian parliaments held a joint hearing in hopes that Facebook CEO Mark Zuckerberg would appear. It should come as no surprise that he declined.

Targeted ads alone, of course, aren’t the problem. Their role in shaping discourse, however, limits democracy.

“This takes democratic discourse out of the public sphere and into private secluded spaces where messages can’t be countered, debated, or challenged,” Bannerman said. “This opens possibilities for manipulation, including the ability to send contradictory messages to different groups, to misinform, or to attempt to dissuade people from voting. It permits psychometric, geographic, and behavioural targeting to take place without oversight.”

Why Keep Society in the Dark?

The primary reason that governments and societies are still in the dark about the algorithms shaping election advertising, content and overall influence is the active effort by Facebook (and other social media giants) to avoid oversight, scrutiny and, ultimately, government regulation.

A recent investigation by The New York Times revealed not only that Facebook knew about Russian interference in the US election far earlier than previously disclosed, but that the company later took extensive measures to suppress criticism, discredit critics and avoid any responsibility for what is happening on its platform, whether Russian interference or the proliferation of fake news. Ironically, these measures even included hiring a firm to create fake news about organizations critical of Facebook.

Government regulation of Facebook is becoming increasingly necessary, yet a larger issue looms: the limited digital competence of policy makers.

In his book Prototype Politics, Daniel Kreiss, an associate professor at the University of North Carolina School of Media and Journalism, writes that “candidates no longer understand, let alone control, the technical systems that help get them elected.” In an email interview, Kreiss elaborated: “This has a number of implications from ethical concerns (candidates no longer know the data their campaigns possess, the targeting they engage in, or the security systems they have in place), to infrastructural issues (i.e.: as companies such as Facebook are taking on an increasingly important role in politics, what campaigns do is being outsourced to private, commercial firms).”

Kreiss and Shannon McGregor elaborate on this point in their recent paper examining how technology firms shape political communication by actively collaborating with political campaigns and embedding staff within campaign offices, directly shaping what campaigns do and say. In a drive to increase revenue and gain political influence, technology platforms are embracing their role as political platforms, while still shirking their responsibility for hosting the public commons.

Eventually, however, as policy makers and the public come to better understand these platforms’ influence and its impacts, the desire for oversight and rules will grow.

An Educated Public Will Call for Improved Regulation

In an attempt to anticipate and potentially circumvent such government regulation, Facebook has enacted a program of political ad transparency. It discloses who paid for each ad, provides access to an archive of all ads run on Facebook, and is making increased efforts to verify the identity and location of those purchasing political ads.

The company’s goal was to have this program fully operational in time for the recent US mid-term elections. Unsurprisingly, there were significant glitches, including the ability, as demonstrated by Vice News, to buy ads bearing false “paid for by” disclosures that Facebook failed to verify.

Facebook’s clumsy attempts at transparency may not be enough to avoid legislation that mandates and regulates similar oversight. The Honest Ads Act was introduced in the US Senate as a means of mandating that digital platforms maintain an archive of all political ads. It has faced an uphill battle, including significant resistance by Facebook, and while it was originally bipartisan, the Republican sponsor, Senator John McCain, has since died.

The Canadian equivalent, Bill C-76, the Elections Modernization Act, also addresses increased transparency, but does not go far enough when it comes to compelling platforms to archive political communications.

Bannerman argues that the targeting strategy of the ad purchaser should be made available as part of the database. “If so, this would permit public oversight of targeting strategies, which is important in understanding the ads’ messages in light of their intended audiences, and in opening up the possibilities of counter-messaging. Canada’s Bill C-76 would require an ad archive, but would not require information about targeting to be included.”

Dubois goes further in arguing that “we need to ensure that platforms are compelled to build in tools and techniques for identifying when things like voter suppression or illegal advertisements are happening on their platforms. We need to ensure Elections Canada has the resources it needs to enforce regulations. Political parties need to be compelled to respect the privacy of Canadians when it comes to using their personal data, which likely means adjustments to PIPEDA [Personal Information Protection and Electronic Documents Act]. We also need media and digital literacy to be built into not just children’s education but our everyday lives through things like changes to the very design of platforms and journalistic coverage.”

More Data, Less Laissez-Faire

To their credit, the Canadian government and its agencies are not sitting idle but are taking on some of these challenges. For example, Elections Canada has issued a request for proposals for a tool to help it monitor social media posts and watch for nefarious activity. It is also seeking specialists to comb through all that data.

Canadian intelligence agencies such as the Canadian Security Intelligence Service and the Canadian Centre for Cyber Security are actively investigating and defending against disinformation and threats to Canadian elections, and the Canadian Armed Forces are actively focusing on countering the weaponization of information and its threat to Western democracies.

The threats are real, and the regulations currently in place are not adequate to meet them. There remains a lack of knowledge of what is happening or what could happen. What is clear, however, as CIGI board member Taylor Owen has argued, is that democracy itself is under threat.

When asked what’s required to prevent such a catastrophe, Owen said that, as “in the lead-up to the 2008 financial crisis, a laissez-faire governance approach emboldened private companies to seek economic growth while passing the negative costs on to society. We are in a similar moment. This time, social media companies have built tremendous wealth from the attention economy while passing the costs onto our public sphere and democracy. It is time for democratically elected government to govern, through a new data rights regime, modernized competition policy, oversight of micro-targeted advertising, auditing of artificial intelligence, and better support for the reliable information needed in a democracy.”

Dubois also noted the cross-disciplinary approach required to address the current challenge posed by innovations in targeted advertising: “We need more access to data about how this targeted advertising works, who uses it and why. The issues are also mixed up with concerns over disinformation, automation and even artificial intelligence, which each have their own set of issues, concerns and research needs. In Canada we tend to rely on data from other Western democracies because there is a lack of data about Canadian media habits and preferences, and more research is needed into how Canadians actually feel about their data being used in political contexts.”

A larger question is at hand: does the world — or, in this case, Canada — require a crisis to embrace change, or can changes be made before the crisis arrives? Has targeted advertising permanently eroded the public sphere, or can it be excised from the new media environment in which Canada now finds itself?

We’re likely to find out in the weeks and months leading up to the next federal election.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Jesse Hirsh is a researcher, artist and public speaker based in Lanark County, Ontario. His research interests focus largely on the intersection of technology and politics, in particular artificial intelligence and democracy. He recently completed an M.A. at Ryerson University on algorithmic media.