Shutterstock Image

A recent media report from the United Kingdom (UK) highlights how the island nation is set to transform many of its aging rail line signals into a computer-based system. Effectively, as one BBC story notes, some of the busiest trains in the country will be turned over to computers by 2020. The downside of this turn towards computers, as the BBC article points out, is that it creates new vulnerabilities that can lead to previously impossible disasters. The software on which the new system relies can be hacked: a system designed to tell a train to slow down for sharp turns or the end of a line, for example, can be manipulated to tell the train to speed up instead, even as the program registers that everything is fine in the main control centre. This is similar to what happened with the Stuxnet attack on an Iranian nuclear facility. The centrifuges were told to spin faster and faster until they eventually broke, while, at the same time, the control systems were fed false information indicating that everything was operating correctly. These vulnerabilities, once impossible in a pre-digital age, are part of the new reality.

On the other hand, when things do go right, greater reliance on computers and the automation of tasks creates huge efficiency gains, driving up profits and generally making things safer. These profits are good for the operators of a newly computer-reliant system, and the heightened day-to-day safety that computers can provide should make routine accidents far less likely in the future. But companies that turn more of their operations over to computers need to remember that these profits are not simply new monies that can go straight to the bottom line. A portion of these newly freed-up funds needs to go to 1) training personnel in the basics (and the not-so-basics) of IT security and 2) making sure that employees are, within reasonable bounds, happy. Here is why.

Most, if not all, IT security professionals will tell you that the weakest point in any computer system’s security is the individual user. This point was made, for example, at the Global Conference on CyberSpace 2015 by the head of Europol.

Europol head: "individuals are always the weak link in any security chain." #GCCS2015 #europol

— Eric Jardine (@ehljardine) April 16, 2015

Even seasoned employees can be tricked into clicking a link in a phishing e-mail that infects a network with a virus. For more sealed-off (i.e., air-gapped) computer systems, an unwitting employee can plug a compromised USB key into the system and infect an otherwise self-contained network. This is, again, what likely happened with Stuxnet. Stuxnet was a sophisticated virus designed to load itself onto any system it came into contact with. It would then scan the device for the programmable logic controllers that governed the centrifuge systems it was built to break. If it did not find the right systems, it would eventually delete itself, leaving no trace of its presence. This trick allowed the program to spread widely and to go largely undetected until it was introduced, via an infected thumb drive, into the Iranian nuclear plant’s computer system, where it eventually destroyed upwards of one fifth of Iran’s nuclear centrifuges. In other words, computer viruses can be cunningly designed and deliberately targeted. Often, as was the case with Stuxnet, people are the weak link in otherwise sound IT security systems.

In the case of newly computer-reliant systems, such as the UK’s proposed rail plan, all these points lead, as IT security expert Graham Cluley puts it, to the danger “that staff will either be deliberately and clandestinely assisting attackers or — most likely — make poor decisions, such as plugging in a device that is malware-infected that could expose the system's security.”

From a corporate point of view, the dual facts that a) the new systems have new vulnerabilities that are exploitable largely because of a company’s employees and b) the new systems create new profits suggest a rather straightforward solution. First, some of the monies freed up through digitization should be reinvested in training employees in IT security, so that they are less likely to make poor decisions (i.e., clicking the link or plugging in the USB key) with serious consequences. Second, companies operating computer-reliant systems should pay their employees well and provide them with a positive work environment, so that employees are happy and less likely to act maliciously toward their company and, by extension, ordinary citizens.

Obviously, no amount of training, salary bumps or vacation time will make a computer system completely safe. The new risks of computer-reliant systems are here to stay. However, when the turn towards computers touches on pieces of key national infrastructure, spending some of the newly realized economic gains on proper training and employee remuneration can do nothing but help.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.
  • Eric Jardine is a CIGI fellow and an assistant professor of political science at Virginia Tech. Eric researches the uses and abuses of the dark Web, measuring trends in cyber security, how people adapt to changing risk perceptions when using new security technologies, and the politics surrounding anonymity-granting technologies and encryption.