Empowering the enterprise level with historians
Improving performance with information
By Michael Bowbyes
The benefits of enterprise-level historians in the process industries are well recognized, and their roles within plants continue to expand; whether that potential is being fully realized by many businesses, however, is a question worth exploring.
Over the years, access to timely, accurate, and clear information has consistently led to better, quicker decisions that improve process performance, efficiency, and safety. Consequently, the historian’s role in gathering, archiving, storing, organizing, analyzing, visualizing, and disseminating plant and business data continues to expand.
Long-term trends are only increasing the importance of historians. The availability of data from control systems and elsewhere has exploded in the last two decades, often coinciding with declining head counts and growing skills gaps in the workforce. Fewer people and more information lead to a more pressing need for tools to transform data into actionable intelligence.
That need is also driven by the increasing desire to make business decisions based on real-time data. And without the historian to sort, contextualize, and clarify the data, the task becomes impossible. Likewise, historians are the foundation of most predictive maintenance and condition-based monitoring programs that are seeing increased interest.
Finally, regulatory drivers are obliging plants to invest in solutions to ensure accurate storage of process data history; the ability to simplify and automate reporting for compliance and business purposes is an added incentive.
The historian also plays an important functional role in safeguarding the control system and ensuring smooth operation. By historizing data from the distributed control system (DCS) separately from the control system itself, for example, it allows users to access the data safely. Since users of the historian have no need to access the DCS, the DCS is simultaneously protected from malicious damage and from the threat of overload due to data requests by casual users.
Most important, however, there is now an impressive body of evidence of the benefits to plants, where historians have been used for more than three decades, and, increasingly, at the enterprise level.
Internal figures suggest that, correctly implemented, historians and related applications can reduce process incidents by 40 percent, cut off-spec material in half, improve throughput by up to eight percent, and cut energy use by up to ten percent. Similar improvements are found in plant availability, maintenance costs, and customer service levels. Improvements in compliance—and their role in continuous improvement programs—are harder to quantify but no less real.
It is undeniable that historians, at their best, turn data into intelligence, empower better decisions, promote better collaboration, and have a demonstrable impact on profitability. They are far from fail-safe, however, and plants would be wise to understand their limitations and how to overcome them.
Much depends on the architecture implementing the historian, which can significantly limit the benefits achieved.
Few deny the limitations inherent in the point-to-point architectures that characterized early solutions. The principal role of historians is to enable data to be shared between users and applications, rather than simply stored for posterity. Under a point-to-point system, applications share this data directly—usually by way of proprietary and complex logic embedded in each—to convert the units to a format acceptable to the connecting application.
It is an inflexible and costly approach, and, as the role of the historian has grown, the limitations of point-to-point architectures have become ever more apparent. Plants might end up with thousands of individual connections, for example, with each addition or application requiring new coding. Savings from improvements in efficiency or performance from the historian are therefore eroded by the solution’s high cost of ownership.
In this respect, the move to the data warehousing that characterizes most deployments today is welcome. Consolidating data to a central point to provide access to all users and applications that require it is an obvious answer to the complexity of point-to-point systems. It would also seem to address businesses’ concerns about the propagation of islands of information within the organization.
Again, however, it comes at a price, and data warehousing has proved to have its own limitations. The costs of the move to a monolithic data store can mean a questionable return on investment, all the more so given the inevitable quantities of redundant data stored. Furthermore, the historian in this model potentially represents a central point of failure that could have a profound impact on operations. And, again, as the range and volume of data grows inexorably, these weaknesses are increasingly apparent.
Quality before quantity
Achieving consistent, reliable data requires the right approach in both the historian and connected applications. In some respects the requirements are technical. Data quality, for example, remains key to the success of historian deployments, yet many traditional approaches continue to fall short.
Data confidence should be carried right through from the source to the final report or performance indicator. As well as sifting out errors and bad data points, tying a reliability measure to each stored figure, so that it is reflected in any calculation or analysis, significantly adds to the data's value.
This approach means users can choose to exclude values that are not marked with complete confidence, or have them included but with the variability highlighted. An Excel Companion user seeking a daily average for a flow indicator that was down for six hours over the period, for example, can receive the value, but marked with a confidence of 75 percent. Similarly, with a virtual tag calculating total inventory levels from the volumes of five tanks, where one of the readings shows the tank above its maximum capacity, the historian will clamp its value at the limit and mark it with zero confidence. The total level returned will therefore be given a confidence of 80 percent.
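The two examples above can be sketched in code. This is a hypothetical illustration of confidence propagation, not any particular historian's API: each sample carries a confidence in [0, 1], a daily average inherits the fraction of the period covered by trusted data, and an out-of-range tank reading is clamped to its limit with zero confidence.

```python
# Hypothetical sketch of confidence propagation through aggregates.
# Sample, daily_average, and total_inventory are illustrative names,
# not a real historian interface.

from dataclasses import dataclass

@dataclass
class Sample:
    value: float
    confidence: float  # 1.0 = fully trusted, 0.0 = no confidence

def daily_average(samples):
    """Average the trusted samples; confidence is the fraction of the
    period covered by trusted data, e.g. a tag down for 6 of 24 hours
    yields a confidence of 0.75."""
    good = [s for s in samples if s.confidence > 0]
    if not good:
        return Sample(float("nan"), 0.0)
    avg = sum(s.value for s in good) / len(good)
    return Sample(avg, len(good) / len(samples))

def total_inventory(tank_levels, capacities):
    """Sum tank volumes; clamp any reading above capacity at the limit
    with zero confidence, then average per-tank confidences, so one bad
    tank out of five gives the total a confidence of 0.8."""
    clamped = []
    for level, cap in zip(tank_levels, capacities):
        if level.value > cap:
            clamped.append(Sample(cap, 0.0))  # clamp at limit, distrust
        else:
            clamped.append(level)
    total = sum(s.value for s in clamped)
    conf = sum(s.confidence for s in clamped) / len(clamped)
    return Sample(total, conf)
```

The key design point is that confidence travels with the value, so every downstream calculation can report how much of its input was trustworthy.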
If data quality is not fully integrated—either through a failure to record it or due to being stored as a separate tag that is not carried forward to calculations and aggregates—businesses face working without values where perfect data does not exist or working with questionable values with no indication of their reliability. If the latter option is chosen, the result is not simply that users must doubt the reliability of figures where perfect data does not exist; they must doubt all figures, since they have no way of knowing which are based on perfect or imperfect data.
The focus on data quality applies elsewhere, too. Conversions are another area best tackled at source. Embedding unit conversions within the historian, so that it converts figures to units appropriate for the user or application requesting them, has distinct advantages. For instance, the process is automatic, so users have no need to know conversion factors or formulas, which reduces work and the risk of errors. Additionally, conversion logic will be consistently applied, discouraging different approaches and the development of islands of knowledge.
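Conversion at source might look something like the following sketch, in which the historian holds one canonical unit per tag and a single table of conversion factors applied on every read. The tag name, units, and factor table are illustrative assumptions, not a vendor API.

```python
# Hypothetical sketch of server-side unit conversion: values are stored
# in a canonical unit per tag and converted on read, so clients never
# embed their own conversion factors. All names are illustrative.

CANONICAL = {"FI-101": "m3/h"}  # assumed canonical unit per tag

# factors from the canonical unit to a requested unit,
# maintained in one place so the logic is consistently applied
FACTORS = {
    ("m3/h", "m3/h"): 1.0,
    ("m3/h", "L/min"): 1000.0 / 60.0,
}

def read(tag, stored_value, requested_unit):
    """Return the stored value (in the tag's canonical unit) converted
    to the unit the caller requested."""
    factor = FACTORS[(CANONICAL[tag], requested_unit)]
    return stored_value * factor
```

Because the factors live in one registry inside the historian rather than in each client, two applications asking for the same tag in the same unit can never disagree.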
Opening up data
While data quality is very important, the availability of data and its accessibility to applications are equally necessary.
Ideally, applications and users should be participants in a broad information bus. The whole host of applications, people, and data sources in the organization should be able to easily interact with each other and quickly and efficiently exchange the information needed.
There are several prerequisites. For example, it requires applications that can securely transport information across firewalls. It also requires applications that can service requests for information and data from multiple consumers—preventing bottlenecks or single points of failure in the architecture—without the need for proprietary interfaces between applications.
Fundamentally, this can only happen if there is a commitment to open standards that is reflected in not only the historian but also the visualization, analysis, calculation, and collaboration tools using the data. Moreover, it is only practical if the approach to pricing reflects that commitment.
Adopting this approach inherently draws businesses away from a focus on data warehousing. The historian still has a role to play in consolidating information needed for performance, calculations, security, and compliance, and ensuring the information stored in other databases is consistent. However, if both the historian and associated tools use open standards, it is no longer necessary to bring every source into a central database before it can be used.
For example, information from relational databases, such as Laboratory Information Management Systems data or an alarm and events database, no longer needs to be consolidated into the central database before calculation and visualization tools can be run on it. With open standards, the tools can process the data directly from the source. Visualization, analysis, and collaboration tools are no longer tied to or dependent upon the enterprise historian and can draw information from distributed systems of historians throughout the enterprise. The application can draw measures from any or all of them and write back its results to whichever database is preferred for that particular information. Likewise, where the data is centrally stored, a historian based on OPC or other open standard allows any application, rather than only proprietary tools, to access, use, and feed into it.
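The benefit of a common interface can be sketched abstractly. In this hypothetical example (the class and tag names are invented for illustration), a KPI calculation programs against one reader interface and pulls values from a historian and a LIMS-style database interchangeably, with no central copy of either source.

```python
# Hypothetical sketch of tools reading multiple sources through one
# open interface rather than via a consolidated central database.
# DataSource, HistorianSource, and LimsSource are illustrative names.

from typing import Protocol

class DataSource(Protocol):
    def read(self, tag: str) -> float: ...

class HistorianSource:
    """Stand-in for an enterprise historian exposing tag reads."""
    def __init__(self, data):
        self.data = data
    def read(self, tag):
        return self.data[tag]

class LimsSource:
    """Stand-in for a laboratory results database with the same interface."""
    def __init__(self, results):
        self.results = results
    def read(self, tag):
        return self.results[tag]

def blend_kpi(flow_src: DataSource, lab_src: DataSource) -> float:
    """A KPI drawing on two sources directly, with no central copy needed."""
    return flow_src.read("FI-101") * lab_src.read("PURITY-101")
```

Because the calculation depends only on the shared interface, a new data source can be added without recoding the tool, which is the flexibility the open-standards argument describes.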
The benefits are manifold. First, the full range of data existing in the enterprise can be used, as can the full range of tools it has to make sense of that data and turn it into useful knowledge. The value of legacy investments is therefore retained, while the system is much more flexible and accommodating of additions. Expansions and modifications are simpler and cheaper since new data sources and applications can be easily integrated. That is particularly true in comparison with point-to-point solutions, but also with regard to warehousing, since users are not tied to any particular provider.
Furthermore, moving away from data warehousing potentially cuts the delay between an event and the organization’s response. Information from an event no longer has to be centrally archived before it can be analyzed to produce the reports or KPIs that prompt action. An open information bus reduces the latency in traditional approaches, so that as soon as the business or operational event occurs, people and applications are notified instantly.
Process and chemical engineers, for example, still require extensive historical information for complete offline analysis, but they may also want instant notification of certain conditions that may be identified by just a few variables. In an open information bus, data is captured and analyzed in real time while at the same time being archived for later use. Likewise, the visualization tools not only provide access to historical data but can also subscribe and visualize real-time data.
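The capture-and-notify pattern described above can be sketched as a minimal publish/subscribe bus. This is an illustration of the concept only; the class, tags, and threshold are assumptions, not part of any real product.

```python
# Hypothetical sketch of an information bus: each new sample is archived
# for later offline analysis and simultaneously pushed to live
# subscribers, so notification does not wait for central consolidation.

from collections import defaultdict

class InformationBus:
    def __init__(self):
        self.archive = defaultdict(list)      # tag -> historical samples
        self.subscribers = defaultdict(list)  # tag -> notification callbacks

    def subscribe(self, tag, callback):
        """Register a real-time consumer, e.g. a condition monitor."""
        self.subscribers[tag].append(callback)

    def publish(self, tag, value):
        """Archive the sample and notify subscribers in one step."""
        self.archive[tag].append(value)
        for callback in self.subscribers[tag]:
            callback(tag, value)

bus = InformationBus()
alerts = []
# notify immediately when an (assumed) temperature tag crosses a limit
bus.subscribe("TI-204", lambda tag, v: alerts.append((tag, v)) if v > 80 else None)
bus.publish("TI-204", 75.0)
bus.publish("TI-204", 85.0)
# only the out-of-limit reading triggers an alert; both values are archived
```

The point of the sketch is that the archive and the alert are fed by the same publish call, so the engineer's offline history and instant notification come from one event, not two systems.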
Finally, the approach removes the need to consolidate all data into a central database before it can be processed—meaning less duplication, less redundant data, and, consequently, lower storage costs.
The cost base
Cost, however, does remain central to the equation. The possibilities from open standards can only be fully explored with a charging structure that permits them. There is little point in enabling integration of various third-party data sources and applications if the price of doing so is prohibitive.
That applies in two respects. First, pricing licenses by the number of tags or data points inevitably restricts the amount of data the system can draw on, a problem that is aggravated by the requirement to plug everything into a central database. That limits the potential benefits, since enterprises face trying to calculate the return on investment before incorporating any additional data source.
Data access licensing has the same effect. Even where historians have the functionality to allow third-party applications to connect, charging for this pushes up the costs for expanding the range of applications employed and for leveraging existing data assets. Again, businesses face forecasting the return on investment of integrating an additional application without, necessarily, the benefit of experience using it. To fully grasp the benefits of open standards requires open interface licensing for the historian that allows enterprises to experiment with the available tools.
Crucially, these arguments will only grow stronger as the role of the historian expands. Neither the growing availability of data nor the increasing importance of tools that can make sense of it is abating. The increasing sophistication of historians makes the plant and business ever more reliant on them, and, in some cases, confidence in the quality of the underlying data is unwarranted. Open standards, paired with rigorous data quality, are the best way of ensuring that reliance is well founded.
ABOUT THE AUTHOR
Mike Bowbyes, PMP, is a project management professional with a background in computer engineering and more than 16 years of experience in industrial automation and information management systems. He has extensive experience in the design, management, and support of process historians and their related applications. Bowbyes now serves as the product manager for Honeywell’s portfolio of products related to process and event data capture, archiving, analysis, and visualization. For more information, contact firstname.lastname@example.org.