March/April 2013
the final say | Views from Automation Leaders

Back to the future: Trends in industrial control systems

By Jim Rogers

Do you remember the movies in the 1980s where the scientist Emmett "Doc" Brown invents a DeLorean time machine? Doc and his young friend Marty travel back and forth through time, recognizing the same basic people, places, events, and artifacts. In each era those things have a different form, and a few new characters are introduced along the way.

Looking back through the evolution of modern-day industrial control systems (ICSs), one can picture a similar sort of tale. For the sake of simplicity, let us follow the evolution of the ICS by broadly categorizing the dominant architectures of each decade.

Our journey begins with simple, manually operated 3-15 psi pneumatic single control loops up through the 1950s. In the 1960s, the general computing world saw the use of mainframe computers with multiple serial connections to individual user terminals. Computation was done in a batch fashion as users submitted job requests to the queue. The industrial control world was migrating from pneumatic to 4-20 mA electrical controls.

The 1970s brought us the programmable logic controller (PLC) to replace the functionality of hard-wired electrical relays. Smaller versions of mainframe computers, such as the Sperry-Univac 1010, were used for direct digital control (DDC). This was an era of centralized control, similar to the mainframe/terminals architecture. At the same time, we saw the rise of the single-loop electronic digital controller.

Then came the 1980s, which brought the distributed control system (DCS). Who can forget the venerable TDC 2000, introduced by Honeywell and quickly adopted by the refining industry? So began the PLC-versus-DCS debate, which is still with us today. And don't forget the mighty minicomputers, such as the DEC MicroVAX, on which we could run advanced control programs written in C. Control loops were now decentralized.

By this time there was a plethora of proprietary computing architectures and a problem that came to be known as "Islands of Automation." The 1990s ushered in a relentless push for standardization so that we could somehow connect all these disparate systems and achieve the best-in-class, plug-and-play control system we longed for. This was the time of the digital fieldbus and the resulting "Fieldbus Wars," along with a drive to push control from the distributed controllers out to the field instruments and final elements. What? Single-loop control? What happened to centralized? Or distributed?

We were in a quandary. The 2000s desperately needed worldwide standards for interoperability, not just connectivity. The Internet, Ethernet, service-oriented architectures, and wireless communications all found their way into the ICS. Add to that the trend toward embedded systems and ever-smaller form factors, and a move toward less hardware and more software, as with the soft PLC.

Today we have cloud computing, virtualization, BYOD (bring your own device), self-forming and mesh networks, and IoT (the Internet of Things). But wait. Isn't cloud computing with BYOD a little like having a big (virtual) mainframe on the back end with individual user terminals on the front end?

What we need here is Doc Brown to sort all this out and explain it to us. Which architecture is the best? Centralized? Distributed? Single-loop field-based? PLC? DCS? Embedded? Wired? Wireless?

The answer, of course, is "All of the above," depending on the business need. And we don't really need Doc Brown. The point here is that there is not one "best" system or architecture. There is room for all the technologies. Part of the job of automation and control engineers is to help assess the business and technical needs of the users and processes, and to architect solutions that meet those needs.

The key to making the available technologies work together as an integrated system is interoperability of the applications. That is, data must be able to flow seamlessly between systems and be transformed into useful information. This is where standards can be of assistance.
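To make that idea concrete, here is a minimal, illustrative sketch (not from the article; the tag name, range, and payload fields are hypothetical) of the kind of transformation involved: a raw 4-20 mA reading is scaled to engineering units and wrapped with the tag, units, and timestamp that another system would need to treat it as useful information rather than a bare number.

```python
import json
from datetime import datetime, timezone

def scale_4_20ma(current_ma, lo_eng, hi_eng):
    """Convert a 4-20 mA signal into engineering units by linear scaling."""
    fraction = (current_ma - 4.0) / 16.0          # 4 mA -> 0.0, 20 mA -> 1.0
    return lo_eng + fraction * (hi_eng - lo_eng)

def to_payload(tag, current_ma, lo_eng, hi_eng, units):
    """Wrap a scaled reading with the context another system needs to use it."""
    return json.dumps({
        "tag": tag,
        "value": round(scale_4_20ma(current_ma, lo_eng, hi_eng), 2),
        "units": units,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

# Hypothetical example: a 12 mA reading on a 0-300 psig pressure transmitter
# is mid-range, so it reports 150.0 psig.
print(to_payload("PT-101", 12.0, 0.0, 300.0, "psig"))
```

The particulars of the encoding matter less than the principle: agreed-upon structure and context are what let data move between systems without custom glue for every connection, which is exactly where standards help.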

So what does the future hold? To extract the maximum value from the ICS, the objective is to get:

1. The right data
2. to the right people
3. at the right time
4. in the right format.

But wait. Isn't that what CIM (computer-integrated manufacturing) promised back in the 1980s? Maybe, then, if we want clues about what might be in the future of the ICS, we could take a look back to the past!

ABOUT THE AUTHOR

Jim Rogers (Jim.Rogers@apachecorp.com), CAP, is an automation advisor with Apache Corporation in Houston, Tex., where he is currently working in the World Wide Drilling group applying automation technologies to drilling operations.