Societies function on the basis of trust, and Internet users are no different. Users need to trust the Internet. They need to trust it to keep their data secure, protected and private, and they need to trust it to reliably give them the content they want to view and share. If the Internet is governed in an inclusive way, users will continue to place their trust in it. But the reality is that a crisis is looming: the world is losing faith in the Internet.

Simply put, trust is the social bedrock of the Internet ecosystem. The technology matters, of course, but the glue of trust is what keeps the system together and running. So how did we get here, and why does the survival of the Internet depend almost entirely on trust? Survey around 24,000 people across 23 countries each year, as we have, and they will tell you that they are losing trust in the Internet or fear going online. When citizens' trust in the Internet ecosystem starts to wane, they change how they behave online.

The case of government surveillance is a clear example. In 2014, we asked people whether they had ever heard of Edward Snowden and found that about 60 percent of respondents had heard of the now notorious NSA whistleblower. We then asked the respondents who knew of Snowden whether, because of what was revealed, they had taken steps to better protect their online privacy and security.

Here, we found that 39 percent of those who had heard of Snowden reported changing their behavior online. In absolute terms, this means that around 700 million people reported changing their behavior because of the perceived violation of their online privacy that resulted from NSA surveillance. That says quite a bit about the level of trust that users place in the Internet.

Of course, declared behavior changes like those recorded in our surveys might differ from actual alterations to people's actions and habits. It is easy to say one thing and then routinely do another. To compensate for this, we looked at usage statistics for private search engines like DuckDuckGo and anonymity technologies such as Tor. In both cases, when we tracked pre-Snowden usage levels and compared them to what happened after The Guardian began publishing stories based on Snowden's disclosures in June 2013, we found that many people's online routines had changed substantially. Large swaths of users switched from, say, Google search to DuckDuckGo, or from an ordinary browser like Chrome or Internet Explorer to Tor.
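The pre/post comparison described above can be sketched in a few lines. The figures below are invented for illustration only; they are not real DuckDuckGo or Tor statistics, and the June 2013 cut-off simply marks when The Guardian's first Snowden stories ran.

```python
# Hypothetical sketch of a pre/post usage comparison around June 2013.
# All monthly figures are made up for illustration; they are NOT real
# DuckDuckGo or Tor usage data.
from statistics import mean

# Invented monthly usage counts (arbitrary units) for 2013
monthly_usage = {
    "2013-01": 100, "2013-02": 102, "2013-03": 99,
    "2013-04": 101, "2013-05": 103, "2013-06": 140,
    "2013-07": 210, "2013-08": 230, "2013-09": 225,
    "2013-10": 240, "2013-11": 235, "2013-12": 250,
}

CUTOFF = "2013-06"  # first Snowden stories published in June 2013

# Split the series at the cut-off and compare average usage levels
pre = [v for k, v in monthly_usage.items() if k < CUTOFF]
post = [v for k, v in monthly_usage.items() if k >= CUTOFF]

pct_change = 100 * (mean(post) - mean(pre)) / mean(pre)
print(f"Average monthly usage before June 2013: {mean(pre):.1f}")
print(f"Average monthly usage after June 2013:  {mean(post):.1f}")
print(f"Change: {pct_change:+.1f}%")
```

A real analysis would control for trend and seasonality (for instance, with an interrupted time-series model) rather than simply comparing means, but the basic logic is the same: look for a discontinuity in usage at the date of the revelations.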

When it came to users shifting their online activity to Tor, the results were equally compelling. As an anonymity network, Tor is an essential tool in the dissident's toolkit, and people in repressive regimes really do rely upon the technology, even after the effects of education, income and a variety of other factors are taken into account.

This tells us quite clearly that, after Snowden, users' trust in the Internet was so low that millions around the world flocked towards the anonymity of Tor. In doing so, users chose a network best known for its widespread use by criminals who aim to buy and sell illegal goods and services and view or share child abuse imagery.

An interesting point that emerges from this example is that law enforcement can and does effectively police this realm. But in general, their job gets harder the more users there are on the network. So what happens when user trust in the Internet is totally lost and users all run towards the Dark Net? Herein lies one of the many problems that arise when people change their behavior because of a loss of trust.

When people from around the world flooded to Tor after June 2013, contributing to an increase in the monthly usage rate of upwards of 284 percent, the ability of law enforcement to use de-anonymization tactics like traffic correlation became that much harder. Instead (and a growth in users is certainly not the only reason law enforcement has made this turn), you have agencies like the FBI policing Dark Web hidden services through network investigative techniques, such as exploiting vulnerabilities in the Firefox browser to pinpoint the identities of those who used Freedom Hosting services.

But the point, really, is that trust matters. People change their behavior when they think the network is less trustworthy. And the effects are certainly not isolated to government snooping. Through our surveys and a variety of other data sources, we found evidence that as many as a third of people now self-censor what they say online and that upwards of 380 million people use online platforms less because of corporate surveillance practices. Like everyone standing up at once at a hockey game, these en masse collective changes can leave us all far worse off. In some cases, such as the move towards data localization requirements, they can also feed back and undermine trust further.

Fixing the growing trust problem is not an easy task. People don't always react in ways that policymakers (broadly defined) would expect or predict. Sometimes small tweaks to government policy can produce substantial effects. Sometimes small tweaks to the code of an operating system can radically change the perceived reliability or privacy of the system, as well as affect the platforms and applications users gravitate towards.

We began this project by examining the seed of trust and how it fits into the larger Internet ecosystem. Our findings distill into three core ideas that we would like to leave you with:

  1. If the growing trust deficit is not remedied or at least abated, the Internet’s potential as a platform for commerce, innovation, free expression and scientific collaboration will be harmed, potentially irrevocably so.
  2. There is no magic bullet. Trust can be restored, but it will take the carefully planned and executed interaction of changes to norms, technology, policy and institutions.
  3. A single-pronged solution won't work. Simply grabbing one lever, say, technology, and trying to fix everything in one broad stroke will fail and will most likely be counterproductive.

Like all tools, the Internet can be either a sharp or a dull instrument of human creativity and expression. Without trust, however, it can and very likely will end up a broken, useless tool.


This article originally appeared in the Tripwire State of Security blog.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
  • Eric Jardine is a CIGI fellow and an assistant professor of political science at Virginia Tech. Eric researches the uses and abuses of the dark Web, measuring trends in cyber security, how people adapt to changing risk perceptions when using new security technologies, and the politics surrounding anonymity-granting technologies and encryption.