Are Platforms Suppressing Evidence of Social Harms? Corporate History Suggests an Answer

If we can’t define misinformation, as Meta asserts, how can we expect the company to spend money tracking or combatting it?

December 20, 2021
Illustration by Paul Lachine

Andrew Bosworth, who is about to become the chief technology officer of Meta — leading the company’s enormous investment in augmented reality — reminded me of a tobacco executive when I listened to his recent interview with Axios’s Ina Fried.

In response to a series of probing questions about COVID-19 misinformation, Bosworth framed it as an issue of personal responsibility. Facebook (now known as Meta) has provided a “fundamentally democratic technology,” he said, and elites should not determine how people use their freedom of speech.

“Our ability to know what is misinformation is itself in question, and I think reasonably so, so I’m very uncomfortable with the idea that we possess enough fundamental rightness, even in our most scientific centres of study, to exercise that kind of power on a citizen, another human, and what they want to say, who they want to listen to.”

To put his argument in context, it is necessary first to point out that researchers have shown the worldwide COVID-19 pandemic is enduring because of vaccine resistance that is largely the product of misinformation on social media. This “infodemic” is preventing large numbers of people from getting vaccinated, and much of it originates in the United States, mostly from conspiratorial right-wingers and alternative health hucksters.

Bosworth shrugs off any responsibility for that, without even the expressions of concern the company typically offers in more guarded interactions.

But if we cannot define misinformation, as he asserts, how can we expect Meta to spend money tracking it? Note that money spent counteracting COVID-19 misinformation is money the company cannot spend on augmented reality research, the area Mark Zuckerberg is betting on as the future of the company.

In this, Bosworth is echoing two tactics the tobacco industry used successfully for decades to push back against would-be regulators who wanted to reduce the needless deaths and suffering caused by tobacco addiction.

In the 1960s, as a scientific consensus hardened around the view that smoking caused cancer, an unidentified tobacco executive wrote a memo that came to light years later, in which he laid out the industry’s strategy for pushing back:

Doubt is our product since it is the best means of competing with the “body of fact” that exists in the mind of the general public. It is also the means of establishing a controversy. Within the business we recognize that a controversy exists. However, with the general public the consensus is that cigarettes are in some way harmful to the health. If we are successful in establishing a controversy at the public level, then there is an opportunity to put across the real facts about smoking and health. Doubt is also the limit of our “product.” Unfortunately, we cannot take a position directly opposing the anti-cigarette forces and say that cigarettes are a contributor to good health.

The industry pushed back hard for years by funding health research on a grand scale, cherry-picking from the results, suppressing unhelpful findings and promoting those that raised doubts about the link between tobacco and illness. They also used their political and economic power to push back against warning labels and other restrictions, using subtle media manipulation techniques that only came to light later. Researchers believe, for instance, that the industry funded AIDS research to divert attention away from a push for tobacco regulation.

When the doubt finally died in the 1980s and the industry was forced to acknowledge that smoking causes illness and death, the argument shifted to one of personal responsibility and freedom of choice, an argument Bosworth also wielded in his Axios interview. People have the right to consume misinformation, just as they have the right to smoke.

At first blush, it may seem a stretch to compare tobacco and social media executives. But we need to recall that much of what we know about Big Tobacco’s tactics became evident only many years later, after successful lawsuits pried open the companies’ archives.

Tobacco executives suppressed information that would have alerted the public to the health risks of smoking, no doubt because they saw it as their fiduciary duty to their shareholders. The same is true of oil companies, which began building climate-change assumptions into their internal planning processes while publicly casting doubt on the science and funding skeptics and astroturf organizations.

Would social media platforms, which have better information about the impact of their new technologies than governments or academic researchers do, suppress information about the social harms of their networks? The papers leaked by Facebook whistle-blower Frances Haugen suggest they have.

Tobacco, oil and other industrial interests made this suppression a common cause for decades, relying on a coalition of scientific and political actors who were skeptical of research and regulation, many of them ideological cold warriors who saw the regulatory state as a threat to human freedom.

American historians Naomi Oreskes and Erik M. Conway do a good job of describing the long-running struggle in their 2010 book Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. One of the striking things about the book is the ferocity of the attacks on academic and scientific critics, as the merchants of doubt deployed political and economic clout to disparage academics who threatened their industries. 

I thought of that recently while reading Martin Patriquin’s in-depth portrait of Meta executive Kevin Chan for The Logic. Patriquin reveals that Chan, Meta’s senior global director and head of public policy for Canada, had used his position on a McGill University advisory board to attack the work of McGill professor (and CIGI fellow) Taylor Owen, an advocate for platform governance. 

Owen, who was alarmed to learn of Chan’s attack on his work, thinks policy makers should be aware of the influence Meta exerts on the spaces where policy is discussed.

“These things aren’t necessarily explicit but they shape the discourse,” he told me. “We’re seeing that in think tanks, in policy-oriented research centres at universities and in media organizations. There’s no question it’s happening. There’s no smoking gun, but that’s not how soft power works.”

Given the vital questions of public interest at stake, it is wise to at least keep in mind the patterns of behaviour established by the tobacco companies, as we watch Meta and the other tech giants remake our information ecosystems.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Stephen Maher is a Harvard Nieman Fellow and a contributing editor at Maclean’s.