- By Bill Lydon
- Cover Story
- Cybersecurity threats to manufacturing and process plants are coming from a widening range of attack vectors.
- The digital transformation of manufacturing to increase efficiency and productivity will increase the number of connected IIoT devices and the cybersecurity attack surface.
- The digital transformation requires integrating control and automation, which can create many new vulnerabilities that have to be addressed and mitigated.
Cybersecurity threats to manufacturing and process plants are coming from a wide range of attack vectors, including supply chain, logistics, enterprise computing, remote connections, operator stations, programmable logic controllers, distributed control systems (DCSs), smart sensors, and new smart devices. Many emerging Internet of Things (IoT) and communications technologies offer greater connectivity, but they make the cyber landscape more complex. This article explores aspects of these issues with cybersecurity experts Andy Kling, senior director of cybersecurity and architecture at Schneider Electric, and Marty Edwards, director of strategic initiatives at ISA and managing director of the Automation Federation (AF). I asked these recognized industry experts for their thoughts and opinions on a number of items:
IoT communications technology not only follows traditional information technology (IT) routes, but also connects to processes, machines, material handling, and factory floor devices, closing the divide between IT and operational technology (OT).
How do you characterize new challenges created by direct edge-to-enterprise communications, and do you have any advice for users?
Kling: Without a doubt, the digital transformation will increase the number of connected IIoT [Industrial Internet of Things] devices. Enterprise business is constantly seeking new, better ways to get closer to operations. This is generally a very good thing, because as the speed of business accelerates, you need to be able to control your business variables and risks in real time. The natural result of all this new connectivity is a wider attack surface. To take advantage of the value the IIoT promises, organizations must expand connectivity among people, assets, and systems, which allows them to extract data and make it work for them. To protect these new connections, you first need to understand the risks associated with moving to an IIoT environment: Will all the new information and data from the edge provide business benefits that exceed the risks taken to retrieve and apply it? It is a simple question, and if you cannot answer it, that is likely because you do not yet know and understand the full risk landscape. So get expert advice. Once you determine the value is there, then focus on data integrity. One compromised input device can poison the data repository. Cybersecurity can no longer be an afterthought. There is too much at stake, financially and operationally.
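Kling's point about a single compromised device poisoning the data repository can be made concrete. One common safeguard is to have each device sign its readings so the repository can reject tampered data. The sketch below is illustrative only, assuming a hypothetical per-device pre-shared key; real deployments would use per-device credentials managed by a key infrastructure.

```python
import hashlib
import hmac
import json

SECRET_KEY = b"per-device-shared-secret"  # hypothetical pre-shared key

def sign_reading(reading: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the repository can verify integrity."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "tag": tag}

def verify_reading(message: dict) -> bool:
    """Reject readings whose tag does not match (possible tampering)."""
    payload = json.dumps(message["reading"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_reading({"sensor": "pump-7-temp", "value": 71.4})
assert verify_reading(msg)       # intact reading passes
msg["reading"]["value"] = 99.9   # simulate in-transit tampering
assert not verify_reading(msg)   # tampered reading is rejected
```

Verifying integrity at ingestion keeps one bad edge device from silently corrupting the analytics built on top of the data.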
Edwards: There are a number of issues to consider here, I think. First, given the sheer volume of new devices coming on the market, which may mean working with new manufacturers and vendors, we should expect a raft of new vulnerabilities that will need to be addressed and mitigated. End users will have to contend with that and understand the implications and risks these new vulnerabilities will have on their operations.
It then depends on how you bring in the data. If your application vendor is backhauling all of the device data into the cloud through a service provider network like 5G and all you are getting is the data through the same vendor, then "device security" really becomes a vendor problem. At the opposite end of the spectrum is that if all of these devices are interconnected to your own control networks, then you really need to take a look at bringing the data in via a protected enclave, i.e., a section of an internal network that is subdivided from the rest of the network, much like we do with roaming Wi-Fi-enabled operator-interface solutions, for example.
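The protected-enclave approach Edwards describes comes down to admitting data into the control network only from known, registered devices. A minimal sketch of that gatekeeping idea, with hypothetical device IDs (real enclaves enforce this at the network layer with firewalls and zone boundaries, not just in application code):

```python
# Hypothetical enclave allowlist: only registered devices may push data
# into the protected segment of the control network.
ENCLAVE_ALLOWLIST = {"flow-meter-12", "temp-probe-3"}

def admit(message: dict) -> bool:
    """Drop traffic from devices not registered in the protected enclave."""
    return message.get("device_id") in ENCLAVE_ALLOWLIST

assert admit({"device_id": "flow-meter-12", "value": 3.2})   # registered
assert not admit({"device_id": "rogue-device", "value": 0})  # unregistered
```

The same allowlist logic, pushed down into firewall rules or switch configuration, is what subdivides the enclave from the rest of the network.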
A digital transformation requires increased connectivity and data transfer, and 5G wireless can satisfy this demand. Process automation systems today primarily rely on hardwired networks for communications, particularly Ethernet, but achieving the goals of new digital initiatives, like Industry 4.0 and IIoT, brings increasing bandwidth requirements. In addition to plant automation, 5G wireless capabilities are suited for linking process sensors and instruments to business enterprise systems.
Previous generations of mobile networks predominantly addressed consumers for voice and SMS in 2G, Web browsing in 3G, and higher-speed data and video streaming in 4G. The transition from 4G to 5G will better serve consumers and industries alike. New 5G wireless technologies provide the network characteristics manufacturing requires, including high bandwidth, connection density, low latency, and high reliability to support critical applications. Mobile 5G technology will allow higher flexibility, lower cost, and shorter lead times for factory floor production reconfiguration, layout changes, and alterations.
It is not necessary to wait for commercial wireless carriers to implement 5G before manufacturers can take advantage of these benefits. Production plants are already implementing 5G for in-house communications.
A number of 5G industrial applications were demonstrated at the 2018 Hannover Fair, including an extremely impressive concept of deterministic, high-speed coordinated motion over 5G wireless communications. The 2019 Hannover Fair will have multiple pavilions showcasing 5G in manufacturing applications and educational sessions on the topic.
How do you characterize new cybersecurity challenges created by 5G, and do you have any advice for users?
Kling: When it comes to industrial operations, 5G feeds the IoT beast. In part, the definition of IoT is a connected device. With the increased bandwidth and security of coming 5G networks, there is a promise of many new vertical solutions. As a result, availability (resist jamming), integrity (protect from signal corruption and man-in-the-middle replays), and confidentiality all bear a heightened importance. 5G will be in places not really thought of previously. Yes, the SCADA [supervisory control and data acquisition] pipeline examples already exist, but imagine 5G-enabled drones running continuous thermal imaging of a plant. They could quickly isolate problems that would have been difficult to locate previously.
Uniquely, everybody has access to this transport layer. In 3G/4G technologies, jamming or "smart" jamming was always a concern. 5G has attacked this problem, making the wireless standard more resilient to jamming. The bottom line is that 5G, much like the IoT ramp-up, will enable many new solutions. Like IoT, its use must be tempered with an appropriate understanding of the risks involved. Our challenge will be to use it securely and appropriately.
Edwards: I pretty much agree with what Andy has said. I hadn't thought of a lot of those things, but from a security perspective, I view 5G as just another transport layer. If the 5G vendors "get it right" with security and bake it into the implementations from the beginning, then it will be less of an issue for the end user.
Cloud computing using third-party off-site providers is growing in popularity as a technology beneficial for industrial automation. The origin of the term "cloud computing" is unclear. In some sense, it is descriptive of something off in the distance over the Internet. We are just not sure where or what is storing information and performing computing. Some claim the term was used in internal documents at Compaq Computer in 1996. Others suggest the term was first commercially used in 2006 when Google and Amazon began using "cloud computing" to describe the new approach to access software, computer power, and files over the Web instead of from local servers or a desktop computer.
Whatever the history, cloud computing, or "cloud" for short, is now a common term. Pictures of local computers networked to the image of a cloud in presentations and literature have become popularized. The National Institute of Standards and Technology (NIST) defines cloud computing as "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." NIST also defines essential characteristics of cloud computing:
On-demand self-service
A user can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.
Broad network access
Capabilities are available over the network and accessed through standard mechanisms (i.e., Web services) that promote use by various platforms (e.g., mobile phones, tablets, laptops, and workstations).
Resource pooling
The cloud provider's computing resources are pooled to serve multiple consumers using a multitenant model, with different physical and virtual resources dynamically assigned and reassigned according to user demand. There is a sense of location independence in that the customer generally has no control over or knowledge of the exact location of the provided resources. Examples of resources include storage, processing, memory, and network bandwidth.
Rapid elasticity
Capabilities can be automatically provisioned and scaled to rapidly meet computing and storage needs based on user demand. To the user, the capabilities available often appear to be unlimited and can be appropriated in any quantity at any time.
Measured service
Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, applications, and active user accounts). Resource use can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.
The application of cloud computing has the potential to change industrial automation system architectures, which have traditionally been on-site systems requiring capital investment to add functions. In contrast, the cloud computing model provides significant storage, computing, and application software on demand, with engineers paying only for what they use. Cloud computing and on-demand analytics are also being developed for a much broader range of applications outside of industrial automation, providing powerful and more cost-effective new capabilities for automation engineers.
How do you characterize new cybersecurity challenges created by cloud services used in industrial automation applications, and do you have any advice for users?
Kling: The cloud brings so much promise, it is hard to ignore its potential. Several supporting technologies, including improved communications, virtualization, and greater compute power, make the cloud happen. All of this supporting infrastructure brings, in turn, its own set of cybersecurity challenges. So, it is not enough to just ask about cloud security-related challenges. You must also be aware of all that surrounds the use of the cloud. Each of these technological elements brings new challenges.
Edwards: Love it or hate it, the cloud is here to stay. There has been some fear, I think, in the adoption of cloud techniques in the control system space and rightly so. We need to look at significant change in our systems design very carefully. As far as cybersecurity, you can get the best and the worst. In some ways cloud services let you bundle security services onto something in a very easy way, but like anything, if you don't configure your containers and the like correctly, you can introduce security issues very easily.
Here's an interesting story to prove the point. I saw a wide geographic SCADA application in which the user had moved their entire SCADA environment (front-end servers, communications, and so on) into the cloud, which surprised me. Their rationale, which was very well thought out, was that the uptime of the cloud provider's infrastructure was guaranteed contractually to be much higher than what they could accomplish with their own infrastructure. By the time they looked at maintaining the communications and all the servers, the cloud implementation looked very attractive, and it has performed very well for them.
Fog computing, also known as fog networking or fogging, is a decentralized computing infrastructure in which data, compute, storage, and applications are distributed in the most logical, efficient place between the data source and the cloud. Fog computing essentially extends cloud computing and services to the edge of the network, bringing the advantages and power of the cloud closer to where data is created and acted upon.
How do you characterize new cybersecurity challenges created by fog computing, and do you have any advice for users?
Kling: Similar to cloud, fog brings a unique set of security challenges. Applications become virtualized, which makes them fluid. That means they can move east to west (device to device) and north (to cloud) or south (closer to the cyber/physical edge). As a result, security features must "follow" the application. For example, certificates that might traditionally be stored in hardware will have to find a way to become more fluid as virtualized applications move between platforms.
Fog bridges the gap between edge computing and cloud computing and comes with its own unique challenges. Your first step is to understand those challenges: It demands virtualization, and to run IACS [industrial automation control system] solutions it must often deliver higher performance than traditional cloud technology.
Make sure that, as a bridging technology, your security solution does not fall to just the lowest common denominator between the layers but maintains a robust set of security features unique to large amounts of virtualization.
Edwards: Fog is essentially an "on-premise cloud infrastructure." If that is the case, then end users will have many of the same challenges they confront when enabling a cloud infrastructure.
IoT is creating a flood of new technology and driving communication and computing to the edge of system architectures. The number of connected IoT devices worldwide will increase 12 percent on average annually, from nearly 27 billion in 2017 to 125 billion in 2030, according to new analysis from global information provider IHS Markit. This is particularly the case with smart sensors that have embedded processing and communication to controllers, enterprise, and cloud servers. These IoT devices are uniquely identifiable electronic devices using Internet "data plumbing," including Internet Protocol, Web services, and cloud computing.
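The IHS Markit growth figure can be sanity checked with a quick compound annual growth rate (CAGR) calculation from the stated endpoints, 27 billion devices in 2017 to 125 billion in 2030:

```python
# Implied compound annual growth rate from the article's endpoints
start, end, years = 27e9, 125e9, 2030 - 2017
cagr = (end / start) ** (1 / years) - 1
print(f"Implied average annual growth: {cagr:.1%}")
```

The result comes out close to the 12 percent average annual growth the analysis cites.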
An example of this trend is the Industry 4.0 for Process Automation initiative started by NAMUR. Industry 4.0 and IoT concepts are being applied to process automation to achieve a holistic integration of automation, business information, and manufacturing execution functions to improve all aspects of production and commerce across company boundaries for greater efficiency.
Does edge computing pose any unique new cybersecurity challenges, and do you have any advice for users?
Kling: With more powerful processors, embedded sensing technologies, increased abilities to communicate, lower power consumption, smaller footprints, and mobile applications, we can start to take an application that used to run on a server and run it where it makes the most sense. At the end of the day, what we are talking about is pushing control further toward the periphery of the plant, right down to the equipment asset level. With more connectivity and computing power, these smarter, connected assets, like pumps, for example, will be able to control, monitor, and secure themselves in real time. And if we take the next step, it is easy to imagine extending this level of real-time control upward to the enterprise. It will revolutionize how companies improve the profitability and performance of their operations and assets. But regardless of what it looks like, be it cloud, fog, or edge, a robust cybersecurity strategy will have to be in place, because, as we said, all this new connectivity broadens the attack surface. Every new connection has to be secured. This will be the challenge.
Edwards: The scale we are going to see with these deployments creates a massive asset-management problem. I mean, if today we can't even identify what devices are currently on our ICS networks, what will it be like when we have two orders of magnitude more devices? Seems like an opportunity for a robust "management of change" type system.
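The asset-management problem Edwards raises reduces to continuously reconciling a known inventory against what is actually observed on the network. A minimal sketch of that reconciliation, using hypothetical MAC addresses and asset names; real tools would feed this from passive network monitoring and a configuration management database:

```python
# Known asset inventory (MAC address -> asset name) vs devices
# actually observed on the ICS network segment.
known_assets = {
    "00:1a:2b:3c:4d:5e": "PLC-01",
    "00:1a:2b:3c:4d:5f": "HMI-02",
}
observed = {"00:1a:2b:3c:4d:5e", "00:1a:2b:aa:bb:cc"}

unknown = observed - known_assets.keys()   # devices not in the inventory
missing = known_assets.keys() - observed   # inventoried assets gone quiet

for mac in sorted(unknown):
    print(f"ALERT: unmanaged device on network: {mac}")
for mac in sorted(missing):
    print(f"WARN: known asset not seen: {known_assets[mac]} ({mac})")
```

Automating this diff is a first step toward the "management of change" system Edwards suggests: every alert is either a legitimate change to record or an intruder to investigate.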
In the traditional architecture of digital services, applications are tightly bound to the platform through the operating system (OS). Virtual machines began a revolution to loosen the tight binding between OS and platform. Containers are taking that one step further. Now services are becoming loosely bound to their guest OSs. This unbinding allows for an increase in computing fluidity. It becomes far easier to leverage cloud, fog, and edge computing platforms, as the application can move easily between environments.
Does virtualization pose any unique new cybersecurity challenges, and do you have any advice for users?
Kling: Virtualization means we can now use applications we no longer have to install and customize to fit their platforms. Applications, services, and microservices are preinstalled on a virtual machine. Essentially, they are their own platforms. Isolated from their neighbors, they inherently bring security improvements. Maintaining the VM [virtual machine] repositories securely and using them in a secure fashion by ensuring integrity is a somewhat new challenge. But with these new challenges comes an incredible amount of value.
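Maintaining VM repositories "in a secure fashion by ensuring integrity," as Kling puts it, typically means recording a cryptographic digest of each image at publish time and re-verifying it before deployment. A hedged sketch of that check, using a throwaway temp file as a stand-in image (real repositories would also sign the manifest itself):

```python
import hashlib
import tempfile

def sha256_of(path: str) -> str:
    """Stream the file in chunks so large VM images are not loaded into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with a stand-in "image" file
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"vm-image-bytes")
    image_path = f.name

manifest = {image_path: sha256_of(image_path)}  # recorded at publish time

# Before deploying, recompute and compare against the manifest
assert sha256_of(image_path) == manifest[image_path]  # untampered image verifies
```

Any modification to the image after publication changes the digest and fails the check, which is the property that keeps a tampered VM out of production.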
Once we see network convergence, network virtualization, and application and service virtualization, paired with traditional IoT and sensors, we will witness automation solutions that carry a lighter physical footprint. You can take advantage of these virtual resources on premise or off premise. It is entirely conceivable to imagine a rack of virtualized computers replacing control, I/O processing, and other applications. Think of a rack of compute power tied to an array of edge-based sensors and actuators. But once again, you must execute your applications where it makes the most sense, i.e., where it drives the most value within your risk threshold. A new vision is coming, one that leverages value from virtualization. With it comes the challenges unique to the technology being used. For example, know and understand how network convergence places higher importance on confidentiality and integrity. Be ready for communication prioritization schemes to rise in importance to help ensure critical traffic is treated appropriately to maintain availability.
Edwards: Virtualization presents similar issues as the cloud, fog, and other advancements we have discussed. Something I really like about virtualization is the ability to separate the software from specific hardware dependencies. As an old DCS guy coming from "the software only works on this specific hardware version," that is a huge benefit, and gives end users enormous flexibility and redundancy. But yes: You need to be aware of new vulnerabilities that come with virtualization. My advice here would be don't mix security levels on the same VM hardware. You need a unique set of hardware for each security level.
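Edwards's rule of not mixing security levels on the same VM hardware can be enforced with a simple placement audit. The sketch below uses hypothetical VM names, hosts, and security-level labels (in the spirit of IEC 62443 zones) to flag any host carrying workloads from more than one level:

```python
from collections import defaultdict

# Hypothetical placement plan: VM name -> (physical host, security level)
placements = {
    "historian-vm": ("host-a", "SL2"),
    "scada-vm": ("host-a", "SL2"),
    "dmz-web-vm": ("host-b", "SL1"),
}

def mixed_level_hosts(plan: dict) -> set:
    """Return hosts carrying VMs from more than one security level."""
    levels_by_host = defaultdict(set)
    for host, level in plan.values():
        levels_by_host[host].add(level)
    return {h for h, levels in levels_by_host.items() if len(levels) > 1}

assert mixed_level_hosts(placements) == set()  # plan keeps levels separated
placements["dmz-web-vm"] = ("host-a", "SL1")   # a violating migration
assert mixed_level_hosts(placements) == {"host-a"}
```

Running a check like this against the virtualization manager's inventory turns the per-level hardware rule into something auditable rather than a convention that erodes over time.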
Analytics, machine learning, and artificial intelligence
The application of technology to improve and optimize production operations has been an ongoing industrial automation journey over the years. Cloud, fog, and edge computing and software developed for a wide range of IT, Internet, scientific, and business applications have become easier to use and more cost effective for industrial automation applications. This does, however, connect production processes directly to a broader number of networks and computers.
Does the broad application of analytics, machine learning (ML), and artificial intelligence (AI) pose any unique new cybersecurity challenges, and do you have any advice for users?
Kling: Absolutely. In traditional DCS or discrete applications, the control algorithms are precisely understood, and control engineers have been trained in them. Machine learning and artificial intelligence bring a new level of discerning patterns from data and offer new ways to improve the safety, efficiency, reliability, and even profitability of the operation and the business. But before a single operator decision is made, time must be taken to understand these new algorithms and to ensure the integrity of the data being fed into them, so they can explain their results. Only then can confidence be found.
From a security standpoint, as mentioned in the cloud discussion, the integrity of the data is paramount. The convergence of data availability via an increase in sensor technology, our ability to move that data, and now the compute resource made available to act upon that data have reached a point where ML/AI are viable. Securing the acquisition at the lowest levels and ensuring the integrity of that data is essential to using it securely.
Edwards: Access to powerful computing platforms is a big win for advanced control. The industry will continue to see unique optimization opportunities that we could only dream of before. Having that much data in one place for the algorithms to eat for breakfast, though? That could pose a challenge from an intellectual property perspective, so I think even these applications have to be thoughtful from a security perspective or you might get into trouble.
Cybersecurity is big and getting bigger, and the level of complexity is rising. One way to overcome the complexity of securing disparate systems from multiple vendors is to join together and collaboratively share knowledge. This is one of the primary ways we as an industry can grow to be more effective.
Kling: Taking on new and increasingly dangerous cyberthreats can't be limited to a single company, industry, or region. That's why everyone associated with industrial manufacturing, including suppliers, end users, third-party providers, integrators, standards bodies, and even government agencies, must come together. We need to collaboratively develop new ways of ensuring legacy and emerging technologies alike can withstand sophisticated cyberattacks.
On the whole, our industry is generally pretty conservative, but we have to change that culture when it comes to cybersecurity. The most effective way to do this is to encourage a collaborative, three-pronged approach that focuses on people, processes, and technology.
First, we have to work together to make sure everyone, everywhere, knows they are responsible for cybersecurity. This includes ensuring everyone is trained, with defined, clearly understood roles, responsibilities, and procedures to prevent, mitigate, and, most importantly, respond to cyberattacks.
Second, we have to work together to establish best processes, practices, and policies, especially as it relates to performing regular risk and threat assessments and gap analyses. That approach is proven to identify holes in our systems and our overall security posture. Additionally, there is an opportunity for industry to work together to help end users contain, mitigate, and even prevent the spread of any virus and malware via network segmentation, the application of zones and conduits, and the establishment of other processes. This includes strengthening an industrywide commitment to adhering to best practices, especially a drive to remain compliant with prevailing, most-current industry standards, like IEC 62443.
Third, we need to find ways for suppliers to work together to strengthen their products with today's threats in mind. Keep in mind that end users are frequently using a mix of systems from various vendors and vintages. Can we collaborate and evolve our technology to help them address cybersecurity issues in their frequently complex operating environments? The answer is yes, but it requires a cultural shift and a strong commitment from industry leaders.
It really is time for industry as a whole to step up. By collaborating openly and transparently, we increase our collective ability to protect the world's most critical operations and the people and communities we all jointly serve. Let's get it done.
Edwards: This is an area that can really help advance cybersecurity. We are all in this together, and cybersecurity should not be a competitive differentiator between vendors. If we could truly come together as an industry and share the information about threats and attacks with each other in an open yet safe environment, then I think we could all advance our capabilities to defend against these things. This is sort of a hot button for me, coming from my ICS-CERT background. I am optimistic that we will follow some of the trends of other industries, such as the financial services industry, where this type of information sharing is very well accepted and functions with a very high level of success.
Thank you to Andy Kling and Marty Edwards for sharing their knowledge and thoughts on industrial cybersecurity. The International Society of Automation has a wide range of industrial cybersecurity training resources (see www.isa.org).
We want to hear from you! Please send us your comments and questions about this topic to InTechmagazine@isa.org.