The Dual-Use Challenge of Open-Source Intelligence

May 17, 2021

Years ago, I was at a workshop on society and technology hosted by Microsoft Research in New York City. It was a wonderful group of scholars and civil society leaders working on digital privacy, accountability and transparency. One evening during the event, a representative from Facebook gave a short introduction to a new feature the platform was about to launch. Graph Search, he explained, was based on an algorithm powering a semantic search engine that would allow Facebook users to dive deep into the “social graphs” of their network of friends, enabling them to search for people using detailed attributes collected from their activity on the platform. 

The tool was immensely powerful. By drawing on the networked activity of Facebook’s then more than one billion users, Graph Search brought a new capability to online exploration. The workshop participants were taken aback by the casual manner in which the tool was presented and immediately saw its risks. And sure enough, when it launched soon after, Graph Search came under intense fire almost immediately. It turned out that the same tool that was helpful for finding friends of friends with similar interests was also remarkably useful for phishing scams, online harassment and stalking.

During Graph Search’s short run before Facebook pulled it, one of its keen users was Eliot Higgins. In 2011, Higgins was working an office job and reading a lot of news about the Arab Spring. In particular, he was closely following the flood of social media emerging from the growing conflict in Libya. Frustrated with some of the coverage of the conflict, he began commenting on news sites such as The Guardian, pointing out mistakes, adding context and debating the news. Noticing a need for this kind of detailed analysis of online content coming from citizens living through the conflict, he started a blog under the alias Brown Moses. He would study hundreds of videos, and use online maps and a wide range of internet research to source munitions and track abuses.

Using these new methods, Higgins demonstrated that the Bashar al-Assad regime was using cluster munitions and chemical weapons in Syria. His investigative work was featured in leading global newspapers and led to a profile in The New Yorker in 2013, while he was still working his office job.

Higgins has since led a breathtaking array of online investigations, both as an individual and, starting in 2014, as part of an organization he founded called Bellingcat. The revelations in his work have made global news and shaken intelligence agencies around the world. He has proved that Malaysia Airlines Flight 17, a passenger flight, was shot down by the Russian government, revealed details of the Russian incursion into Ukraine, detailed extrajudicial killings in Mexico and by the Cameroon Armed Forces and the Ethiopian National Defense Force, identified the Russian military intelligence agents who poisoned Sergei and Yulia Skripal in Salisbury, England, and, most recently, detailed the Russian Federal Security Service’s attempted assassination of Alexei Navalny.

In his new book, We Are Bellingcat: An Intelligence Agency for the People, Higgins describes the dual-use nature of tools such as Graph Search and his conflicted feelings about them. He says that Graph Search was “invaluable” in his work but also acknowledges the tension with wider privacy concerns. That tension sits at the core of Bellingcat’s work and runs through the wider community of citizens practising what is known as OSINT, or open-source intelligence. Open-source investigation is, by its very nature, at odds with privacy: the more secure our data is, the harder it is for open-source investigators to do their work.

Speaking to Higgins for this week’s Big Tech podcast, I kept reflecting on what two of my previous guests might say about his work, and about the dual nature of the data he uses and the methods he has pioneered.

When I spoke to Nicole Perlroth, she described in vivid detail the market for zero-day exploits, which democratic and autocratic governments alike can use to hack into phones and collect immensely personal and valuable data about our lives. And when I spoke to Ron Deibert, he explained how hacking and spying tools are often produced in democratic countries and then sold to criminal organizations and governments that use them to oppress activists and hunt down citizens, such as The Washington Post journalist Jamal Khashoggi, who was tracked and killed by the government of Saudi Arabia. Both Perlroth and Deibert make the essential point that these tools are fundamentally dual-use; you cannot absolve their nefarious uses by pointing to examples of their utility.

This raises the question: for all of the incredible work that Bellingcat does, should we be concerned about the use of these same tactics and data for less noble pursuits? Bellingcat and many in the OSINT community have moved on from just searching YouTube and Google Maps, and often access a far wider range of data. For its investigation into the poisoning of the Russian exile Sergei Skripal, Bellingcat downloaded leaked Russian databases containing passport information and residential addresses. With the black market for user data running amok, with zero-day exploits and spyware handing powerful hacking tools to the highest bidders, and with the whole space remaining astoundingly unregulated, it is worth asking who should have access to these tools and to this kind of data.

In We Are Bellingcat, Higgins writes that advancements in AI could make his job a lot easier. He says that “someday, we could ... train a computer to detect people in uniform, or those wearing specific military colours ... and compile the results for review by humans, including a network map of how everyone is connected, identifying high-status individuals whom we should study more closely. With this, we could monitor the actions of military forces suspected of wrongdoing.”

That sounds like a great idea, but it’s not hard to imagine how it might be co-opted by autocrats and used for state surveillance. Is that a trade-off we are willing to make?

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Taylor Owen is a CIGI senior fellow and the host of the Big Tech podcast. He is an expert on the governance of emerging technologies, journalism and media studies, and on the international relations of digital technology.