Last week, a few major news outlets ran the story that Stuxnet had infected a Russian nuclear power plant as well as the International Space Station. The reporting was based on a speech given by Eugene Kaspersky, who leads the Russian anti-virus firm Kaspersky Lab. As NBC News points out, Kaspersky did reference Stuxnet infecting a Russian nuclear-facility network, but he did not claim that Stuxnet had infected the International Space Station. Instead, he referred to infected USB drives being used on the Space Station.
As for the impact on the Russian nuclear facility, NBC News notes that Kaspersky offered no details beyond saying that a “friend” of his who works in a “Russian nuclear-power plant, [using a network that] is disconnected from the Internet … [had once] sent a message [noting] their internal network is badly infected by Stuxnet.”
All of this got me thinking about Eric Byres’ speech at The Automation Conference 2013. Byres is chief technology officer and vice president of engineering at Tofino Security. In his speech, he made it clear that industry’s reliance on air gaps (i.e., a lack of outside network connections) to keep facilities safe is an illusion. The fact that even the International Space Station, with all its safety and security protocols, not to mention its extremely limited number of visitors, has been subject to infection via USB devices should clearly illustrate the folly of thinking that any manufacturing facility could be more secure.
See Byres' industrial cybersecurity speech from The Automation Conference 2013.
I reached out to Eric to get his thoughts on Kaspersky’s speech and all the news it created, and the first thing he noted was that this story “makes it clear that the idea of an air gap to isolate critical systems from the rest of the world is a joke. If you can't control what USB sticks are coming onto your space launch vehicle (where every gram of material is accounted for), what hope do you have for controlling them coming into your factory?"
To help industry get a better grasp on the reality of cybersecurity, Byres said that trying to completely isolate a computer or control system is like trying to completely isolate a human being. "In order to live, humans need to breathe, eat and drink. To do that means that all sorts of nasty bugs get into our bodies. Similarly, in order for critical systems like the International Space Station (or the power grid) to operate effectively, they need to exchange data with some of the outside world," he said. "With that, malware will get into those critical systems. We have to accept that reality. Complete isolation is an impossible and dangerous illusion."
Byres pointed out that the ultimate lesson to be learned from USB infections on the International Space Station is that “any system must be able to deal with a ‘pathogen’ if it gets in. For example, in the time it takes you to read this, your body has probably breathed in millions of viruses and bacteria. Yet you don't die a horrible death. Instead, your body quickly detects something bad has entered, quarantines it off and then works to neutralize it (or kill it). Each organ and system in your body is a zone with its own defenses. It is a coordinated multilayer defense-in-depth strategy.”
This same process should apply to the cyber defenses of your critical systems. “Reduce to a minimum the unnecessary and risky data flows in, but accept that bad stuff will get in,” Byres said. “Then when it does, have the technology in place to quickly detect the problem, control its spread and then neutralize it.”
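To make that minimize-detect-contain approach concrete, here is a minimal sketch of a deny-by-default filter on data flows between plant zones. It is illustrative only: the zone names, the `Flow` type and the `ALLOWED_FLOWS` list are hypothetical constructs chosen for this example, not anything from Byres or Tofino Security.

```python
# Hypothetical sketch of a zone-based, deny-by-default flow policy:
# only explicitly approved flows pass (minimize), and anything else
# is logged for quarantine and investigation (detect, contain).

from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    source_zone: str
    dest_zone: str
    protocol: str

# Only the data flows the process actually needs are permitted.
ALLOWED_FLOWS = {
    Flow("historian", "enterprise", "https"),   # one-way reporting out
    Flow("engineering", "control", "modbus"),   # controller programming
}

quarantine_log: list[tuple[Flow, str]] = []

def inspect(flow: Flow) -> bool:
    """Allow approved flows; record everything else as a potential pathogen."""
    if flow in ALLOWED_FLOWS:
        return True
    # An unapproved flow is flagged so it can be investigated and
    # neutralized rather than silently passed through.
    quarantine_log.append((flow, "blocked: not in allowed-flow list"))
    return False

if __name__ == "__main__":
    print(inspect(Flow("historian", "enterprise", "https")))  # True
    print(inspect(Flow("usb_host", "control", "smb")))        # False, quarantined
    for flow, reason in quarantine_log:
        print(flow, reason)
```

The deny-by-default list mirrors Byres’ “reduce to a minimum” advice, while the quarantine log gives operators the detection hook needed to spot, control and neutralize whatever does get through.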
Though the Stuxnet virus likely did not damage the Russian nuclear facility, since it was designed to target specific controllers in Iranian nuclear facilities, its re-emergence should serve as a second wake-up call to industry. The threat of viruses, malware and breaches to our critical systems is an ongoing concern, one we will likely never be able to stop worrying about, even when we think we have all the necessary safeguards in place.