1 September 2002
The myth of obscurity
By Eric Byres
Ever since the tragic events last September, the process control industry has been trying to understand the risks we face from possible attacks on our chemical plants, water systems, and energy infrastructures. Could someone deliberately cause a chemical leak, poison the water supply, or shut down the power grid? If they did, would they launch the assault using physical means, or would a cyberattack be more likely?
While we don't have firm answers to these questions, we can't just ignore them and hope they will go away.
Unfortunately, I repeatedly hear control engineers and plant managers do just that. "Don't worry," they say. "Our DCSs and PLCs are safe because they require special knowledge, and no hacker or terrorist has that information."
Even the IT world has bought into this "security through obscurity" myth. A recent article in CIO Magazine entitled "Debunking the Threat to Water Utilities" stated: "Most public utilities rely on a highly customized Scada [sic] system. No two are the same, so hacking them requires specific knowledge."
"I am both a controls engineer and former commercial pilot, and let me tell you, learning to program a PLC is a hell of a lot easier than learning to fly a 747."
In fact, both the knowledge and the tools are easy to come by. Simply download a free demo of the programming software from a vendor's Web site and try a bit of ladder programming. As Kevin Driscoll, senior staff scientist at the Honeywell Technology Center, stated last month at a security conference, "I am both a controls engineer and former commercial pilot, and let me tell you, learning to program a PLC is a hell of a lot easier than learning to fly a 747."
It is very clear that hackers are learning about PLCs and DCSs. There have been several well-publicized attacks on control systems during the past year, the most famous being the SCADA system attack and subsequent sewage spill in Queensland, Australia. In another case, just described at the ISA Industrial Security Conference in Philadelphia, someone hacked into a PLC in a semiconductor manufacturing plant and shut down a reverse osmosis system.
These attacks should not be surprising. If defacing a Web site is a hacker's badge of honor, imagine how much more exciting it would be to turn off the lights in Los Angeles.
Not that a hacker would even need to learn about industrial controllers to wreak havoc on them. Research at the BCIT Internet Engineering Lab showed that several major brands of controllers can be adversely affected by the tools in an average teenage "script kiddy" tool kit.
For example, scanning for and exploiting security flaws in the Simple Network Management Protocol (SNMP) is standard hacker practice. As an audience member angrily pointed out at the same ISA conference, a leading brand of PLC uses SNMP but won't let users change the passwords or disable the service, making the PLC an easy target for network attack.
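To see how little specialized knowledge such an attack requires, consider the sketch below. It uses the open-source pysnmp package (4.x-style high-level API) to read the system description from a hypothetical SNMP-enabled controller that still answers to the factory-default "public" community string. The target address, port, and OID are placeholders for illustration only and are not tied to any particular vendor's product.

# Illustrative sketch only: query a hypothetical SNMP-enabled controller that
# still accepts the factory-default "public" community string.
# Assumes the open-source pysnmp package (4.x-style hlapi) is installed.
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

TARGET = "192.0.2.10"   # placeholder address of a controller on the plant network
COMMUNITY = "public"    # default community string the user cannot change

# Ask the device to identify itself (sysDescr.0), the same first step an
# automated scanning tool would take before probing further.
error_indication, error_status, error_index, var_binds = next(
    getCmd(
        SnmpEngine(),
        CommunityData(COMMUNITY, mpModel=0),               # SNMPv1, no authentication
        UdpTransportTarget((TARGET, 161)),
        ContextData(),
        ObjectType(ObjectIdentity("1.3.6.1.2.1.1.1.0")),   # sysDescr.0
    )
)

if error_indication or error_status:
    print("No answer:", error_indication or error_status.prettyPrint())
else:
    for var_bind in var_binds:
        # Prints the device model and firmware revision, free for the asking.
        print(" = ".join(x.prettyPrint() for x in var_bind))

No passwords and no exploits are involved, just a default setting the user cannot turn off; with nothing more than this, an attacker can inventory the controllers on a network and pick targets.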
Inside our walls
So are gangs of al Qaeda terrorists huddled in caves studying up on the latest PLC programming methods as you read this? Maybe, but I think that our biggest risks are homegrown. Remember that prior to 11 September, the biggest loss of life from a terrorist attack on U.S. soil was the Oklahoma City bombing, where 168 people died due to the actions of two local nutcases.
Most telling, FBI statistics indicate that 70% of all cyberattacks come from insiders. This is consistent with the BCIT Industrial Security Incident database, which indicates that the majority of security problems are caused by current or former employees.
So if we are going to have a serious cyberincident involving a process control system, most likely it will be from the inside—people who are well trained on the programming and operation of our control systems. They may not fit our storybook image of a hacker, but they can still do just as much harm.
It is clear that the risks of hacking attacks on our process systems are very real, even if they are hard to quantify precisely. We don't need to panic, but we do need to invest time and resources in assessing the cybersecurity of critical process control systems and understanding how attacks might occur.
These risk analyses can follow the same methodologies as the safety analyses that are already common practice in the process industries, except that they must consider not only accidental events but also sequences of events that are deliberate and malicious.
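As a purely illustrative sketch of what extending a safety-style review to deliberate events might look like, the fragment below scores a handful of made-up scenarios on simple likelihood and consequence scales. The scenarios, scales, and acceptance threshold are assumptions invented for this example, not part of any published methodology.

# Illustrative sketch: a toy risk register that scores deliberate attack
# scenarios alongside accidental failures, safety-analysis style.
# All scenarios, scales, and the acceptance threshold are invented for
# illustration only.
from dataclasses import dataclass

@dataclass
class Scenario:
    description: str
    cause: str          # "accidental" or "deliberate"
    likelihood: int     # 1 (rare) to 5 (frequent)
    consequence: int    # 1 (minor) to 5 (catastrophic)

    @property
    def risk(self) -> int:
        return self.likelihood * self.consequence

scenarios = [
    Scenario("Operator enters wrong setpoint", "accidental", 4, 2),
    Scenario("Former employee dials in and alters PLC logic", "deliberate", 2, 5),
    Scenario("Script-kiddy scan crashes controller over SNMP", "deliberate", 3, 4),
]

ACCEPTABLE_RISK = 8  # assumed tolerance threshold

for s in sorted(scenarios, key=lambda s: s.risk, reverse=True):
    action = "mitigate" if s.risk > ACCEPTABLE_RISK else "accept"
    print(f"{s.risk:>2}  {action:<8}  ({s.cause})  {s.description}")

The point is not the arithmetic but the discipline: deliberate, malicious sequences of events get ranked and acted on with the same rigor as accidental ones.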
Once we understand these risks, we can begin to secure our control systems in an engineered manner and reduce our vulnerability to attack to acceptable levels. To do otherwise is to leave our heads in the sand and our factories open to terrorists and teenagers on joyrides.
Eric J. Byres, P.E., is a faculty member and research manager of the Internet Engineering Lab at the British Columbia Institute of Technology. He currently holds the Advanced Systems Institute Fellowship for analysis of industrial network security issues.