1 December 2001
Assessing the e-threat to manufacturing
BY BOB FELTON
Are fears that include power outages and poisoned pharmaceuticals justified?
Though U-boat warfare had bankrupted England and driven the country to the brink of mass starvation, and though much of London had been ruined by nightly bombings, Prime Minister Winston Churchill refused to concede defeat. He instead took to the airwaves, vowing in a worldwide radio address to destroy Nazism. Churchill finished with a plea crafted to appeal to the "can do" spirit of America: "Give us the tools," he promised, "and we will finish the job."
It isn't liberal ideals that secure freedom and safety; it's the tools and resolution at their service.
Now, as we enter an era of constantly looking over our shoulders, the experts' question sounds a little louder: "Could electronic terrorists take down America's production lines, frustrate the nation's ability to defend itself, and halt production of the millions of tools that keep the world's leading economy moving?"
"Tomorrow's terrorist may be able to do more damage with a keyboard than with a bomb," read a 1991 report prepared by the National Research Council. Ten years later, American industry's reliance on an ever-more-complex skein of electronic communications is even greater.
There's no shortage of lurid theories about the sorts of attacks terrorists might launch. One particularly prolific author and speaker of the '90s, Barry Collin of the now-defunct Institute for Security & Intelligence, speculated back then that electronic invaders might surreptitiously fiddle with the controls in a cereal factory, upping the iron content and poisoning America's children. They could kill many more, he suggested, by reprogramming pharmaceutical plants to produce poisons. Today, out of the electronic-security business and working in films, Collin said, "I'm not as worried now as I was then."
But many experts, mindful of the trend toward open communication standards, remain wary. Increasingly, water treatment plants, pipelines, transportation systems, financial networks, and plant floors communicate using the same set of protocols, and the slow disappearance of proprietary systems takes the safety of obscurity with it. It might not pay terrorists, for instance, to study the intricacies of the WhizBango-V control system just to shut down Hometown Widgets; but when the company upgrades to Windows or Unix/Linux and open protocols, its exposure rises sharply, because those systems' weaknesses are widely known.
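The loss of obscurity is easy to picture. Against a proprietary controller, an attacker must first reverse-engineer an undocumented wire format; against an open protocol, even a simple reachability check is informative. Here is a minimal sketch of that check; the host address is a hypothetical placeholder, though 502 really is the registered port for the open Modbus/TCP protocol.

```python
# Minimal sketch: with open protocols, anyone can test whether a
# well-known industrial service answers. The host is hypothetical
# (a TEST-NET-1 address); Modbus/TCP's registered port is 502.
import socket

PLANT_HOST = "192.0.2.10"  # hypothetical address
MODBUS_TCP_PORT = 502      # well-known port for Modbus/TCP

def service_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if service_reachable(PLANT_HOST, MODBUS_TCP_PORT):
        print("Port 502 answered: the control system is no longer obscure.")
    else:
        print("No answer on port 502.")
```

No protocol knowledge is needed to get this far, and that first step is precisely what proprietary systems, whatever their other faults, used to make hard.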
"As these systems are modernized," said Dennis McGrath, a senior research engineer with Dartmouth University's Institute for Security Technology Studies, "you get a reduction in cyberdiversity, which inherently introduces vulnerabilities." Even so, McGrath added, "We're not advocating that people become dinosaur keepers."
It is possible, he said, for a knowledgeable attacker familiar with the workings of a particular system to send false information to a control, though "it's very hard to draw up a scenario, at least in the short term, where somebody breaks into a system and there are mass casualties as a direct consequence."
Though experts seem almost universally agreed that it's theoretically possible in some cases for an attacker to reach a plant floor and start flipping switches via the computer network, few think that's a great danger. Collin's cereal and pharmaceutical scenarios, for example, drew hoots from academia and the FBI. Mark Pollitt, with the FBI Laboratory's Computer Analysis Response Team, wrote a paper characterizing them as having "a number of fallacies."
Regarding a hypothetical cereal-plant attack, Pollitt said, first, the amount of iron that would have to be added to make the cereal toxic would rapidly deplete plant stores, and somebody would notice. Second, routine quality control would detect the problem. Third, the cereal would taste so horrible that nobody would eat it.
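Pollitt's first objection survives a back-of-the-envelope check. Every figure below is an illustrative assumption (a 30-gram serving, fortification near the U.S. Daily Value, a rough acutely toxic dose for a small child, a hypothetical plant throughput), not a number from his paper.

```python
# Rough arithmetic behind the "deplete plant stores" objection.
# All figures are illustrative assumptions, not data from the article.
SERVING_G = 30.0           # assumed cereal serving size
NORMAL_IRON_MG = 18.0      # assumed fortification (~100% U.S. Daily Value)
TOXIC_DOSE_MG = 500.0      # assumed acutely toxic dose for a small child
DAILY_OUTPUT_KG = 100_000  # hypothetical plant throughput

# Mass fraction of iron per serving, scaled to daily output.
normal_use_kg = DAILY_OUTPUT_KG * (NORMAL_IRON_MG / 1000) / SERVING_G
toxic_use_kg = DAILY_OUTPUT_KG * (TOXIC_DOSE_MG / 1000) / SERVING_G

print(f"Normal iron draw: {normal_use_kg:,.0f} kg/day")
print(f"Draw needed for a toxic dose per serving: {toxic_use_kg:,.0f} kg/day")
print(f"That is {toxic_use_kg / normal_use_kg:.0f}x the normal rate")
```

Under these assumptions the plant would burn through iron additive at nearly thirty times its normal rate, emptying hoppers and triggering reorders long before much tainted product shipped.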
Similar objections prevail for most of the other much-discussed nightmare schemes, and all rely on just a handful of commonsense observations: First, virtually all production systems issue warnings when materials fall outside their acceptable range. Second, production systems do not exist apart from the universe of human operators and users.
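The first observation is just the everyday high/low alarm limit. A minimal sketch of the idea, with a hypothetical tag name and limits:

```python
# Minimal sketch of range checking: a measured value outside its
# configured limits raises an alarm for a human operator. The tag
# name and limits are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AlarmLimit:
    tag: str     # process-variable name
    low: float   # low alarm threshold
    high: float  # high alarm threshold

    def check(self, value: float) -> Optional[str]:
        if value < self.low:
            return f"LOW alarm: {self.tag} = {value} (limit {self.low})"
        if value > self.high:
            return f"HIGH alarm: {self.tag} = {value} (limit {self.high})"
        return None

iron_feed = AlarmLimit(tag="iron_feed_kg_per_hr", low=1.5, high=3.5)
for reading in (2.4, 2.6, 69.4):  # the last value mimics tampering
    alarm = iron_feed.check(reading)
    if alarm:
        print(alarm, "- paging operator")  # a person decides what happens next
```

The point is the second observation as much as the first: the alarm's job is not to outwit the intruder but to put a human back in the loop.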
"Computers do not, at present, control sufficient physical processes without human intervention to pose a significant risk of terrorism in the classic sense," Pollitt wrote. Nor should that change, he added. "As we build more and more technology into our civilization, we must ensure that there is sufficient human oversight and intervention to safeguard those whom the technology serves."
Dick Morley, inventor of the programmable logic controller, is similarly dismissive, though for slightly different reasons. Modern production facilities, he said, are too complex to easily reconfigure to create harmful products or permanently disable a manufacturing plant. Change something here, and you've got to change a dozen more things downstream in order to pull it off. "Extremely unlikely," he said.
A report issued in 1999 by the Naval Postgraduate School took up the subject of cyberterrorism and concluded, like Morley, that the entry barriers to serious damage are, just now, too great.
Dorothy Denning, a Georgetown University computer scientist, pointed out an additional problem in testimony she gave before Congress: "The study also determined that hacker groups are psychologically and organizationally ill suited to cyberterrorism and that it would be against their interest to cause mass disruption of the information infrastructure.
"Thus, at this time," she continued, "cyberterrorism does not seem to pose an imminent threat. This could change. . . . Unless people are injured, there is also less drama and emotional appeal. Further, terrorists may be disinclined to try new methods unless they see their old ones as inadequate, particularly when the new methods require considerable knowledge and skill to use effectively. . . . For now, the truck bomb poses a much greater threat than the logic bomb."
BEST DEFENSE: TRANSPARENCY?
Though cyberterrorism may pose little threat of real harm at the level of the plant floor, nobody doubts the potential for havoc one level up: the networks that tie everything together. Here, ironically, the movement toward common operating systems and open standards might work to mitigate some of the dangers.
Though Windows' weaknesses are well known, for instance, and newly discovered entry points are broadcast worldwide within minutes, its widespread use assures a strong, global commitment to plugging its leaks. That might not be true of a proprietary system.
The National Institute of Standards and Technology's Manufacturing Engineering Laboratory is bringing similar reasoning to bear with an initiative that aims to eliminate proprietary systems on the plant floor, replacing them with open standards based on Internet communications protocols. Not only will software applications and organizations find it easier to communicate with one another, but disinfecting a hacked system and restoring it to operating condition should also be simplified by the availability of freely distributed, well-documented systems.
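One way openness eases that recovery, sketched here under stated assumptions: a defender can compare installed files against published known-good checksums, something a well-documented, freely distributed system makes routine. The file path and expected digest below are hypothetical placeholders.

```python
# Minimal sketch of integrity checking against published checksums.
# The path and expected digest below are hypothetical placeholders.
import hashlib

EXPECTED = {
    "/opt/plant/controld": "<published-sha256-hex>",  # placeholder digest
}

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large binaries don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

for path, good in EXPECTED.items():
    try:
        verdict = "intact" if sha256_of(path) == good else "MODIFIED: reinstall from distribution media"
    except FileNotFoundError:
        verdict = "missing"
    print(path, "->", verdict)
```

Against a closed, poorly documented system, a defender often cannot even say what "known good" looks like.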
Behind the byline
Bob Felton is an InTech technical editor.