Disinformation Is Undermining Democracy in West Africa

While the problem is not new, the current phase is particularly challenging, given the scope of the manipulation.

July 4, 2022
Demonstrators rallied May 28, 2021, at the Place de l’Indépendance in Bamako, Mali, in support of a coup by Mali’s military and in favour of greater cooperation with Russia. (Nicholas Remene/Cover Images via REUTERS)

Information disorder in West Africa is threatening the social fabric of multi-ethnic societies across this region. While the problem is not new, the current phase is particularly challenging, given the scope of the manipulation, the ease with which information can be shared, the multiplicity of techniques adopted to do so and the proliferation of actors — individuals, state actors, foreign governments and specialist firms.

Adding to this deadly mix is the fact that global tech platforms maintain only a limited regulatory and content moderation presence in West Africa, with priority placed on politically influential countries such as Nigeria, Ghana, Senegal and Côte d’Ivoire. Even in these key markets, the platforms are failing to effectively moderate content or enforce their own standards. For now, Nigeria gets more attention than Cabo Verde, while Brazil receives significantly more attention than Nigeria. This hierarchy of enforcement is a major obstacle that must be addressed.

Understanding Information Flows

In 2021, 17.3 percent of the population of West Africa had direct access to the internet, with countries such as Nigeria and Ghana leading the way at 51 percent and 53 percent, respectively. But these figures don’t tell the complete story of the influence and importance of social media in the wider information ecosystem.

Social media content is not confined to online spaces. Recognizing and understanding the ways in which it spreads to, infiltrates and influences conventional media and word of mouth is vital to an improved understanding of information ecosystems across the subregion. In West Africa, social media can act as a “pavement radio,” with members of a community gathering around a single phone to consume content together. Posts also travel across platforms: a tweet, for example, may resurface in WhatsApp groups, come up in a restaurant conversation or become a topic for discussion on television or radio.

In short, the increasingly blurred boundaries between traditional media, social media and word of mouth mean that content moves between online and offline spaces in ways that are not always easy to track but are critical to recognize, given how these channels influence and reinforce one another. Within that information ecosystem, fake news — often centred on ethnic and religious identity — has been influential in shaping narratives in West Africa. It has been used to delegitimize political opponents, cement authoritarianism and, more recently, discourage vaccine uptake. One effect has been to reduce trust in government and in democracy itself.

In West Africa, a wide range of actors, both online and offline, create and disseminate fake news. These include individuals, the state, foreign actors, diaspora communities, the media, specialist consultancy firms, online influencers and automated bot networks. Some do it for financial gain; some, for political influence; still others act as part of wider efforts to maintain authoritarian systems.

These actors are supported by a wide array of approaches and tools:

  • Computational propaganda, the automated creation and amplification of content, is a growing feature of Africa’s online disinformation industry. Botnets (networks of automated accounts) and coordinated groups of trolls, known as troll farms, are deployed to promote specific narratives, generate online conversations and get stories trending.
  • Astroturfing involves posting coordinated, unsolicited comments on social media networks, often by political consultants, to create a false impression of grassroots support.
  • Micro-targeting is a strategy that uses consumer data to tailor messages to specific audiences based on their location, interests, biases and religious affiliation.
  • Deepfakes are videos or audio recordings that have been digitally altered or fabricated using artificial intelligence technology that is becoming increasingly sophisticated.
  • Shortened URLs are used by disinformation pedlars to trick users into thinking a website link is real and to spread disinformation from fake sites.
  • Doctored chyrons — chyrons being the electronically generated captions or images superimposed on a screen, for example, a TV station’s logo or headline during a news broadcast — can be used to misrepresent pictures and videos.
  • Masking online identity involves the use of proxy and troll accounts to obscure who is behind a post, usually in order to spread disinformation anonymously.
  • Manipulated audio, pictures and videos involve the digital alteration of social media content to fit a specific agenda.
  • Modification of news content is a method whereby material taken from legitimate media outlets is altered to push a particular narrative. For example, headlines may be changed, or they may remain the same while the content beneath them is changed enough to shift the thrust of the argument.

How Information Disorder Affects West African Democracy

Information disorder is enabling authoritarian regimes in West Africa to set and control a narrative that reinforces their governance approach, sows confusion and undermines democratic movements. Promoters and members of authoritarian regimes work to glorify the merits of illiberal governments in key areas of governance and development. Following the recent coups d’état in Mali, Guinea and Burkina Faso, for example, narratives are rife across the region about how democracy is failing to deliver, especially in conflict situations and in bringing development.

Proponents of authoritarianism also push the narrative that the inability of democratic systems to address challenges and achieve consensus — due to their participatory structures — stymies wider development progress. Pushing the narratives of Russia’s and China’s effectiveness, proponents argue that these countries’ models have enabled them to develop militarily and economically.

Information disorder is also being weaponized by authoritarian governments beyond the region to advance regime continuity through the disruption of credible flows of information and the deliberate spread of disinformation. Russian influence operations in West and Central Africa, for instance, have been documented as presenting Russia as a saviour and the West as the enemy. The flooding of social media with messages blaming France and the West for fuelling Islamic insurgency in Africa is just one example. Russia is also funding fake news websites, pro-Russia television stations and online advertisements, and it recently sponsored a beauty pageant in the Central African Republic (CAR). Russia also appears to have made efforts to recruit local political consultants who show up on media programs to speak positively of the role Russia is playing in the region. Russian influence on ordinary residents can be seen in the numbers of people carrying Russian flags and banners or wearing Russia-themed shirts during anti-France protests in Mali and Burkina Faso.

Iran, too, has run targeted disinformation campaigns in West and Central Africa, including in Nigeria, CAR and Senegal. In October 2020, for example, Iran used proxy social media accounts in Nigeria to spread propaganda in support of Iranian Supreme Leader Ayatollah Ali Khamenei and against the West. Other countries, including Saudi Arabia, Turkey and Qatar, are also suspected of conducting influence operations in West Africa.

But it isn’t just authoritarian powers that are exerting digital influence in the subregion. In West Africa, France has also been caught spreading disinformation using social media. In December 2020, Facebook announced the removal of a French network of 84 Facebook accounts, six pages, nine groups and 14 Instagram accounts for “coordinated inauthentic behavior” targeted at Mali and CAR that sought to shape the narrative around French policy and the security situation in those countries.

Threatening Peace and Security

In the area of peace and security, information disorder is pushing citizens into polarized echo chambers that are further breaking down the social fabric and fuelling hostility and violence. False information online in Nigeria, Mali and Burkina Faso has led to outbreaks of violence along religious and ethnic lines. Fake news has significant real-world implications because it builds on existing narratives and social cleavages. In 2018, the BBC described how a riot that led to the death of 11 people — mainly Fulani Muslims in Plateau State, Nigeria — was ignited by Facebook images showing mutilated bodies, burned homes and murdered children. But the images had in fact originated in the Democratic Republic of the Congo in 2012.

Armed non-state actors are also exploiting the disinformation ecosystem to recruit, expand and organize in ways that undermine democracy. In northeast Nigeria, the Islamic State in West Africa Province has created propaganda to promote itself as a credible alternative to the Nigerian state, which it portrays through videos and images as an aggressor. Similarly, al-Qaeda affiliate Jama’at Nasr al-Islam wal Muslimin, which stands opposed to France, uses social media and messaging technologies to frame itself as the “good guy,” fighting off foreign oppression in Mali.

Disrupting Elections

Disinformation is becoming a factor shaping election processes and outcomes in West Africa. Political actors and party supporters use cloned and fake news websites and cyber “warriors” — also referred to as propaganda secretaries — to market party candidates and boost voter turnout in the party’s favour. Disinformation is also used to confuse voters by overwhelming them with vast amounts of conflicting information, and there is evidence of fake news being used to suppress votes by reducing turnout. During Côte d’Ivoire’s 2020 presidential election, for example, fake information circulated that led people, fearing attacks or violence, to return to their villages; many did not vote as a result.

Foreign interference in elections has also been a notable and growing problem. In the run-up to the 2020 Ivorian presidential election, pro-Russia operatives spread disinformation on social media with the intention of defaming political opponents of the ruling party. In October 2019, Facebook removed three networks of accounts, pages and groups for engaging in foreign interference on the platform. But the ability to capture and document these growing threats to a core tenet of democracy remains in its infancy. Key election stakeholders are not yet equipped to tackle disinformation, and observation missions remain largely focused on traditional areas of coverage, such as the conduct of voting on election day. There is insufficient recognition of the ways in which technology and information disorder can affect the campaign and the voting process.

Undermining Gender Equality

Disinformation also impinges on gender equality. Women politicians and candidates increasingly face online campaigns that attack them for their role in public life. It is common to see aspersions cast on their integrity and capacity in an effort to discourage them from participating in politics. In several instances, nude pictures of female politicians have been shared online. In Cabo Verde, for example, a former leader of the opposition party was portrayed in a faked photo montage in a pornographic setting. Even though the image was doctored and was condemned as false by the president, no one was ever held accountable for posting it.

State Responses to the Disinformation Threat

At the same time, there is a strong risk of overreach in African governments’ efforts to counter disinformation. In some cases, governments’ actions are impinging on fundamental human rights. Countries that have promulgated new laws or amended existing ones to address the spread of online falsehoods, such as Nigeria and Guinea, have sent delegations to better understand China’s internet firewall. In 2019, Nigerian legislators sought to pass the Protection from Internet Falsehood and Manipulation Bill, which would have allowed Nigeria’s government to cut off internet access or block specific social media platforms such as WhatsApp, Facebook and Twitter at its own discretion. Nigeria is not alone in using the threat of disinformation to introduce laws that silence online dissent: Guinea, Mali, Niger, Nigeria, Sierra Leone and Togo have all introduced cybercrime or cybersecurity laws since 2015.

Another way of controlling the narrative is to close the space entirely. Internet shutdowns are increasingly used by states to clamp down on alternative narratives, even though evidence suggests they disrupt economies, violate human rights, endanger livelihoods and shield governments from legitimate scrutiny and criticism. Twelve of the 15 Economic Community of West African States countries — the exceptions being Ghana, Cabo Verde and Guinea-Bissau — have in the past decade experienced some form of internet or platform shutdown lasting from hours to months. But this approach offers only a temporary solution and comes with significant economic costs.

Tackling Falsehoods

Disinformation can be used to shrink the civic space, delegitimize institutions and personalities, and suppress democratic voices. The growing difficulty of sorting fact from fiction is eroding citizens’ trust in democracy. But information disorder cannot be disconnected from a wider democratic decline: it takes hold where access to information is lacking, economic circumstances are poor, rights are being clamped down on, and institutions are weak and pliable. This suggests that dealing with the problem should not focus solely on laws, technology solutions and digital literacy; it must also address core tenets of democratic development. As outlined below, a more holistic approach is needed:

  • Greater research: More studies to understand what is happening across information ecosystems can ensure that research keeps up with disinformation techniques, which continue to evolve.
  • Establishing global standards rooted in local nuances: Discussions are already under way about setting global rules for content moderation across social media platforms. However, any moderation standard should not just adhere to local ethos and values but must also be rooted in local nuances.
  • Sorting fact from fiction: Fact-checking must be grounded and interactive to fully engage audiences and to underscore the severity of the harms that can arise from online falsehoods. Communicating findings in local languages and accessible formats is key in this regard.
  • Support for quality journalism: Financial and technical support can reduce media reliance on political benefactors, improve credibility and support the generation of high-quality, well-researched content.
  • Civic literacy: There is an urgent need to empower people to consume content critically, which should entail educating citizens in the basics of fact-checking. Efforts are also required to build greater public awareness of how social interactions and relationships influence decisions about what to share or like, which in turn shapes the circulation and visibility of news in the wider media environment.
  • Regulating big tech: The missing element in West Africa’s response to information disorder so far has been platform regulation.

Most of the global social media platforms do not have a presence in West Africa and, where they do, it is limited, with priority given to larger and more geopolitically “important” countries. States can do more to engage big tech companies. But this needs to be done in a way that creates some degree of impartiality, as governments themselves are extensively involved in the influence industry. Improving responsiveness when users highlight abuse or flag false or hateful content must become a priority for technology platforms, such as Facebook, that have large user bases in West Africa.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Idayat Hassan is a lawyer, development expert and director of the Centre for Democracy and Development in Abuja, Nigeria.