There’s a Fix for Disinformation: Make Social Media Algorithms Transparent

In the era of newspaper and broadcast dominance, anyone could know what news their neighbours were consuming just by opening the paper or turning on the TV.

March 15, 2022
The charred remains of a Russian tank in Ukraine’s Sumy region, March 7, 2022. (Irina Rybakova/Ukrainian Ground Forces Handout via REUTERS)

On February 27, 2022, when Vladimir Putin told the world that he had placed Russia’s nuclear forces on high alert, Valery Gerasimov was seated at the opposite end of a long table from Putin, his face unreadable.

Gerasimov, who had been a commander in the Chechen war that solidified Putin’s grip on power, has been Russia’s top general since 2012. He successfully led Russia’s Syria campaign, propping up the blood-soaked dictator Bashar al-Assad, and has presided over small border wars in which Russia creates chaos and then sends in “peacekeepers” to establish “frozen conflicts” on its periphery.

Western analysts credit him with the “Gerasimov doctrine” — the idea of blurring the lines between military and non-military methods, seeking to “perfect activities in the information space.”

His authorship of the doctrine is disputed, but nobody can doubt that this is what Russia is doing in the current conflict: using disinformation and propaganda to weaken its enemies and seeking advantage by inducing chaos and uncertainty.

In the summer before the 2016 US election, the GRU, Russia’s military intelligence service, hacked the Democratic National Committee’s servers and the email account of Hillary Clinton’s campaign chairman, released the stolen emails and then used sophisticated social media techniques to help Donald Trump beat her that fall.

The Facebook campaigns were aimed largely at suppressing the African American vote, using fake Black Lives Matter groups and employing “dark marketing” techniques: unpublished targeted posts sent directly to voters through Facebook. Facebook denied the problem and then sought to minimize its impact, until American lawmakers forced it to reveal its role.

Last month, Grid reported that the main Facebook groups promoting the Canadian truckers’ “freedom convoy” were managed by a Bangladeshi marketing firm. Who was actually directing that operation? 

It would be wrong to conclude that it was the Russians, but it would be better to know than to guess. After everything that happened in 2016, it is apparently still easy for foreign actors to use sock-puppet accounts to sow discord in our societies.

The Verge has reported that the convoy movement really took off only after it was promoted by American conservative outlets that had their own reasons to want to knock Prime Minister Justin Trudeau down a peg or two. The Russians must have been delighted, because coverage of the Canadian disruption, featured heavily on Fox News and Russia Today, helped distract from Putin’s troop buildup on the border with Ukraine. And when the Russian tanks rolled in, Putin-friendly Americans tried to draw a connection between Canada’s convoy crackdown and Russia’s invasion.

In the era of newspaper and broadcast dominance, anyone could know what news their neighbours were consuming just by opening the paper or turning on the TV. In the social media era, many of those interactions are dark, invisible to everyone but the sender and the target, like the voter-suppression ads that Russians aimed at African American voters. Too many of our neighbours are being fed dangerous conspiracy theories that lead to violence and disruption, and we cannot see who is putting this content in their feeds.

For Canada, this blindness to the forces at work in our marketplace of ideas poses special problems. A McGill-led study found that the platforms are “saturating information streams with U.S.-based news.” But researchers don’t know whether that’s because “Canadians care deeply about all news coming from the United States or because the platform itself elevates the importance of this conversation.”

The solution is algorithmic transparency. But achieving it is easier said than done, because the algorithms are the platforms’ special sauce, the closely guarded core of their business.

Facebook whistle-blower Frances Haugen has advocated for a new kind of regulator empowered to pierce that veil of secrecy. So far, lawmakers have listened politely but not acted. That leaves us blind to the threat.

Consider Pat King, a convoy organizer in jail in Canada. In his livestreams, King used racist language, discussed white replacement theory, and opined that Justin Trudeau was “going to catch a bullet.” 

Canadian national security expert Stephanie Carvin emailed a senior contact at Facebook at the end of January to raise a concern about comments that King was making about anti-hate campaigners, which she deemed threats. At the time, King had 200,000 Facebook followers.

Her contact replied that Facebook was aware of his page and was balancing freedom of expression and safety. King’s page is still up. He now has 325,000 followers.

Carvin thinks we need to know more about how he got so popular.

“Who was Pat King promoted to and on what grounds?” she asks.

We have no idea, because the algorithms are secret.

Since Russian tanks rolled into Ukraine, it has become clear that many Western countries have neglected their security and must rethink it in a hurry. Given the Gerasimov doctrine, and the platforms’ record in dealing with disinformation and foreign threats, it is time to consider algorithmic transparency as part of our national defence.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Stephen Maher is a Harvard Nieman Fellow and a contributing editor at Maclean’s.