1 April 2002
Web technologies help hike batch quality
By Chinmoy Roy and Leonard Johnson
Genentech Inc.'s bulk manufacturing facility in Vacaville, Calif., is young and flexible and has a sophisticated distributed control system (DCS) that can trace and track the multitude of process activities required for the production of a single batch.
So Genentech raised the bar.
To meet new and increasing manufacturing capacity requirements, an effort is under way to maximize yield and plant throughput. There is recognition that decreasing the time needed for postproduction analysis of batch production data, and for generating the batch assay history report prior to market release, will increase capacity.
Yields will rise through real-time preemption of deviations in batch quality while the batch is still in production. Such time-demanding requirements are met by using the Web to deliver raw data and processed information that fulfills users' data needs: when, where, and how they need it.
Web technologies for supervisory batch control, batch production data analysis, and batch report generation are the keys.
PRESENT RAW DATA
In mid-1999, Genentech successfully commissioned a new, multiproduct, biotechnology drug manufacturing facility. Design features such as plant layout to match product flow have contributed to maximizing plant output. But the quest for increased productivity is a constantly evolving activity and does not terminate with the commissioning of a plant. There is an increasing realization that productivity gains occur if one leverages the Web to provide enterprisewide production information faster and more efficiently.
Considerable improvements in operational efficiency, yield, and quality control and reduction in batch release cycle times may transpire by capitalizing on evolving Web technologies and the Internet to present large volumes of raw data in a just-in-time manner.
The information technology infrastructure consists of several operating systems, from Unix to Windows 9x, Windows NT, and beyond.
A DCS controls manufacturing. The DCS architecture consists of a central server, which communicates with field controllers via a dedicated Ethernet-based network. The server executes recipes and orchestrates the phases and control modules in the field controllers.
|ANSI/ISA-88.01-1995, Batch Control Part 1 provides models and terminology applicable to batch control. Part 2 addresses data structures and guidelines for languages. Although this standard is primarily for batch processes, there may be considerable value for other types of processes.|
Besides providing an execution platform for recipes, the central server also performs batch management, process control data acquisition from field controllers, process alarming, and trending and gives operators the ability to initiate recipes and interact with the process via the human-machine interfaces.
The corporate communication network spans two major company sites 80 miles apart. It provides ready access to DCS production data and recipes for those directly engaged in production, such as the material management group, manufacturing quality assurance, production management, and others.
A firewall provides the DCS with the required protection from unauthorized access over the network. Also resident on the network is a production operation management system (POMS) server. There are several enterprise resource planning (ERP) servers on the corporate network.
The application programs in the POMS/ERP servers provide for production planning/control, inventory management, and maintenance of lot genealogy.
The DCS server communicates with field controllers to load phase and recipe variables and orchestrate the execution of a batch. It also initiates the real-time acquisition of batch history data and process trend history data. The batch history data is stored in its own server.
The trend history data is initially stored in the DCS and then moves to its own server. This frees the DCS to attend to control-related tasks. The DCS is responsible for automated process control and batch record maintenance.
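The offload of trend data from the DCS to its own server can be pictured as a simple transfer job. The sketch below is illustrative only; the class and field names (`DcsBuffer`, `TrendHistoryServer`, `offload_trends`) are hypothetical and not part of any actual DCS product.

```python
from dataclasses import dataclass, field

@dataclass
class TrendSample:
    tag: str          # process variable name, e.g. "TIC-101.PV"
    timestamp: float  # epoch seconds
    value: float

@dataclass
class DcsBuffer:
    """Short-term trend storage held on the DCS itself."""
    samples: list = field(default_factory=list)

@dataclass
class TrendHistoryServer:
    """Long-term trend archive on a server separate from the DCS."""
    archive: list = field(default_factory=list)

def offload_trends(dcs: DcsBuffer, server: TrendHistoryServer) -> int:
    """Move buffered samples off the DCS, freeing it for control tasks.

    Returns the number of samples transferred."""
    moved = len(dcs.samples)
    server.archive.extend(dcs.samples)
    dcs.samples.clear()
    return moved
```

Running such a transfer periodically keeps the DCS buffer small, so control-related tasks are never starved by history management.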
MANUFACTURING BATCH INFORMATION
Various groups within the enterprise need access to a single batch's manufacturing data after it finishes. They review what transpired during its manufacture to assess the suitability of the product for market release.
A manufacturing information system captures the manufacturing data while the batch is in production. This system also must present the data to the enterprise information system and hence should provide a seamless interface to external entities.
The use of ANSI/ISA-88.01-1995 batch standard (ISA-88 Part 1) in the design and implementation of the manufacturing control system facilitates the design of a manufacturing information system. The ISA-88 Part 1 control activity model (CAM) provides the necessary framework to determine the appropriate control layers in which the data interchange functional modules embed.
The process management layer is a natural home for the information exchange functionality between the manufacturing floor and the enterprise information system. This layer may serve as the repository of several middleware and customized software modules that execute to capture manufacturing floor data and integrate it into the enterprise information systems.
These software modules run on standardized technologies such as OLE for Process Control (OPC) and SQL and may be resident in the component servers. In the early to mid-'90s, vendors introduced a range of software products for the interchange of such data.
AMR Research, a Boston consulting group, termed this suite of software products manufacturing execution systems (MESs).
The software systems in Genentech's information system execute in the ISA-88 Part 1's CAM layers of process management and above. Together they provide the information exchange interface for the recipe, the schedule, and the production history information.
The MES provides the recipe interface functionality. It serves as a link between the ERP system and the DCS-resident control recipes. Shop orders generated by the ERP system drive the MES assignment of unique lot numbers to control recipes; once a control recipe has its lot number, it executes in the DCS. Via bar-code scanners, the ERP system also verifies the IDs and validity of each production item, such as filters or bulk kits, prior to their release to the production floor from the warehouse staging area. The scanned data then enters the control recipes for point-of-use verification of these raw materials.
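The two responsibilities described above, lot-number assignment and point-of-use verification of scanned materials, can be sketched as a toy MES class. All names here (`Mes`, `assign_lot`, `verify_at_point_of_use`) are invented for illustration; a real MES exposes vendor-specific interfaces.

```python
import itertools

class Mes:
    """Toy MES: assigns lot numbers and tracks materials released to the floor."""

    def __init__(self):
        self._lot_counter = itertools.count(1)
        self.released_items = set()  # item IDs released from warehouse staging

    def assign_lot(self, shop_order: str) -> str:
        """Derive a unique lot number for the control recipe of a shop order."""
        return f"{shop_order}-LOT{next(self._lot_counter):04d}"

    def release_item(self, item_id: str) -> None:
        """Record a bar-code-scanned item (filter, bulk kit, ...) as released."""
        self.released_items.add(item_id)

    def verify_at_point_of_use(self, item_id: str) -> bool:
        """Control recipe checks the scanned ID before the material is used."""
        return item_id in self.released_items
```

The point-of-use check is what catches a wrong filter or kit before it ever enters the batch, rather than during a postproduction review.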
The production history information interface collects information such as raw materials consumption and new material produced from the DCS. It transmits this data to the MES for onward transmission to the ERP system.
The ERP system uses the data received to update its material inventory and lot genealogy databases. The interface also provides for a real-time capture and transfer of all batch record history data to a batch history data server. It also captures and transfers process trend data to a process trend data server.
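The material-inventory and lot-genealogy updates described above amount to applying a stream of consumption and production events. A minimal sketch, with hypothetical names (`MaterialEvent`, `Erp.apply`) that stand in for real ERP interfaces:

```python
from dataclasses import dataclass

@dataclass
class MaterialEvent:
    batch_id: str
    material: str
    quantity: float
    kind: str  # "consumed" or "produced"

class Erp:
    """Toy ERP: material inventory plus per-batch lot genealogy."""

    def __init__(self, inventory=None):
        self.inventory = dict(inventory or {})
        self.genealogy = {}  # batch_id -> list of consumed materials

    def apply(self, event: MaterialEvent) -> None:
        # Production adds to inventory; consumption subtracts and is
        # recorded in the batch's genealogy for later traceability.
        delta = event.quantity if event.kind == "produced" else -event.quantity
        self.inventory[event.material] = self.inventory.get(event.material, 0.0) + delta
        if event.kind == "consumed":
            self.genealogy.setdefault(event.batch_id, []).append(event.material)
```

Keeping genealogy current as events arrive, rather than reconstructing it afterward, is what enables fast lot tracing during an investigation.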
COLLABORATIVELY REVIEW BATCH
The key component of the information model that gives users the data where they need it is the client desktop application, on which manufacturing information is displayed. These desktops are scattered across the enterprise.
They run Internet browsers that provide server access via an enterprisewide communication network. Users access the servers by typing in a URL, such as http://dcsrptserver.
All users who gain access to the enterprise network may access the report server. The report server functions as a Web server. It services operator requests by accessing data either from the batch history server or the trend history server.
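The report server's role, routing each request to either the batch history server or the trend history server, reduces to a small dispatcher. The sketch below assumes dictionary-backed stores for illustration; the function name and request shape are invented, not part of the actual system.

```python
def serve_report_request(request: dict, batch_history: dict, trend_history: dict):
    """Route a report request to the batch-history or trend-history store.

    request examples:
        {"type": "batch", "batch_id": "B-001"}
        {"type": "trend", "tag": "TIC-101.PV"}
    """
    if request["type"] == "batch":
        return batch_history.get(request["batch_id"])
    if request["type"] == "trend":
        return trend_history.get(request["tag"])
    raise ValueError(f"unknown request type: {request['type']}")
```

Because the dispatcher owns all data access, access security (who may review which batch records) can be enforced in one place rather than on every desktop.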
A subset of users, through a level of access security, may collaboratively review batch records prior to their market release. These reviews work via frequent electronic exchange of information and answers to queries between quality and manufacturing personnel.
Users may select a specific batch ID from a search results list of completed batches, or of batches completed during a specified date range, and generate a report. The report lists events that may have occurred during the production of a batch. The reports also display the cause and duration of delays caused by pending operator actions or units awaiting the availability of other process units. The system flags such delays, reducing the analysis time.
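Flagging delays of the kind described above can be as simple as scanning the batch event log for gaps that exceed a threshold. A minimal sketch, assuming events are dicts with an epoch timestamp `t` and a description `desc` (both names hypothetical):

```python
def flag_delays(events, threshold_s=600):
    """Return (event description, gap in seconds) pairs for every event
    whose wait before the next event exceeded the threshold -- e.g. a
    pending operator action or a unit waiting on another process unit."""
    flagged = []
    for prev, nxt in zip(events, events[1:]):
        gap = nxt["t"] - prev["t"]
        if gap > threshold_s:
            flagged.append((prev["desc"], gap))
    return flagged
```

Surfacing only the flagged gaps is what lets a reviewer skip straight to the delays instead of reading the full event log.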
Maintenance personnel may view equipment usage history reports to assist them in proactive maintenance of equipment, thereby improving equipment availability.
Process engineers, manufacturing, and process science personnel can review batch records and process performance with the idea of improving the process yield.
Because information is now available in a centralized location on the report server, the review takes place from convenient office desktops and at a pace that dovetails with other work responsibilities.
The real impact is the reduction in batch approval cycle times, optimization of material inventory and usage, maximization of asset utilization through proactive scheduling of equipment maintenance, electronic generation of work orders, and access to thousands of manufacturing-related documents.
The key is to provide the required information to those who need it when, how, and where they need it.
Web-based systems allow users to assess the validity of the data reported from the system. The ease of use not only saves time compared with previous Genentech systems (data collection dropped from 10 hours to 1 hour) but also allows for investigations of process anomalies that previously may have been deemed too resource intensive to perform.
Previous data mining operations that required personnel to scan batch reports to determine batch run times could take 30 days for only a certain category of recipes. Now, run times for all recipes can be calculated and statistically analyzed in minutes.
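The run-time analysis described above is, in essence, grouping completed batch records by recipe and computing summary statistics. A sketch under the assumption that each record carries a recipe name and start/end timestamps (field names invented for illustration):

```python
from collections import defaultdict
from statistics import mean, stdev

def run_time_stats(batch_records):
    """Group batch records by recipe and compute run-time statistics.

    batch_records: iterable of dicts with keys "recipe", "start", "end"
    (start/end in epoch seconds).
    Returns {recipe: (mean_runtime_s, stdev_runtime_s)}.
    """
    by_recipe = defaultdict(list)
    for rec in batch_records:
        by_recipe[rec["recipe"]].append(rec["end"] - rec["start"])
    return {
        recipe: (mean(times), stdev(times) if len(times) > 1 else 0.0)
        for recipe, times in by_recipe.items()
    }
```

With the batch history already centralized on a server, this computation runs in minutes over all recipes, versus the 30 days of manual report scanning the earlier process required for a single recipe category.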
Quality assurance reviews originally took place at batch completion, which required hours of additional review time for typical batch executions. Now batch reviews can take place as the batch runs, minimizing the review time after batch completion and increasing the reliability of information during investigations of atypical executions.
Behind the Byline
Chinmoy Roy and Leonard Johnson work for Genentech, a developer of biotechnological therapies for medical needs. Roy is manager of automation engineering, and Johnson is a DCS administrator. This article is based on their paper presented at the World Batch Forum's North American Conference. Contact Kathy@wbf.org.