As automation in biomanufacturing becomes more important, so does the need to integrate process data.
By Feliza Mirasol
Biomanufacturing processes have been automated for years, and the use of automation is expected to grow significantly in the near future. As a result, data integration is needed to handle the resulting surge in process and product data and to ensure that these raw data become information that can be used to control and improve the process and the product.
For more than 15 years, biopharmaceutical manufacturers have been implementing automation and control systems, both upstream and downstream, says Christoph Lebl, head of Global Automation and Controls, Lonza Pharma & Biotech. Automation has traditionally focused on controlling the manufacturing process throughout production to ensure that product remains identical between batches. Control has been manifested mainly in minimizing manual interactions and preventing operator error, to prevent variability in production and in product quality that would require correction.
Today, he says, automation is poised to play a much more visible role in biopharmaceutical operations, particularly in moving material between sites in the production process. Currently, this is done manually, Lebl says, and this has a direct impact on biopharmaceutical labor costs, which account for 40–50% of production costs, compared with 10–15% in other industries. “A large proportion of this cost goes to having people physically transfer material and products from place to place,” he says. As robotic systems become more accessible and less expensive, Lebl sees them playing a bigger role in pharmaceutical manufacturing, especially for material movement.
Besides data storage, analytics, and generation of process data, sensor incorporation in the biobag design is important, especially in single-use (SU) processing. Built-in sensors allow direct measurements without jeopardizing the sterility of the bag. One supplier has extended and improved its sensor portfolio by, for example, adding a sensor that measures biomass in the United States Pharmacopeia bags used for rocking motion-based bioreactors and stirred-tank single-use bioreactors.
Integrating automated systems
As the need for data integration increases, it is essential to optimize the flow of data among different automated process lines to ensure that it can be collected and used. “It is crucial to apply the same level of automation and operational philosophy across the entire production, from seed lab to final formulation,” says Lebl.
Ensuring traceability is key, he emphasizes. For example, if an equipment component used in production is later suspected of having contributed to quality problems with one specific batch, automation systems such as manufacturing execution systems (MES) provide the traceability needed to locate the impacted production batch swiftly and mitigate the negative impact.
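The traceability lookup described above can be sketched in a few lines. This is an illustrative model only, not an MES API: the record structure, component IDs, and batch IDs are all hypothetical, but they show how a genealogy record linking components to batches makes the "which batches used this part?" question a simple query.

```python
# Hypothetical sketch of an MES-style traceability query: given a suspect
# equipment component, find every batch whose record references it.
from dataclasses import dataclass, field

@dataclass
class BatchRecord:
    batch_id: str
    components: set = field(default_factory=set)  # equipment/component IDs used

def impacted_batches(records, suspect_component):
    """Return IDs of batches whose genealogy includes the suspect component."""
    return [r.batch_id for r in records if suspect_component in r.components]

records = [
    BatchRecord("B-1001", {"filter-07", "bioreactor-2"}),
    BatchRecord("B-1002", {"filter-08", "bioreactor-2"}),
    BatchRecord("B-1003", {"filter-07", "bioreactor-1"}),
]

print(impacted_batches(records, "filter-07"))  # → ['B-1001', 'B-1003']
```

In a real MES, the same genealogy data would live in a validated database rather than in-memory objects, but the query pattern is the same.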
“Experienced manufacturers operate with integrated data systems, so all production data are available in one place. Without an integrated data system in place, it can be nearly impossible to access data quickly to assess whether a batch is on track—and to intervene if it is not,” Lebl says.
Currently, the biopharmaceutical manufacturing and supply chain boasts high levels of integration throughout. According to Lebl, enterprise resource planning (ERP) systems encourage dialogue between warehouse and shop floor to ensure on-time material consumption and production. “Integrated systems help deliver the right materials to the floor in the right quantity to match the needs of the customer in terms of drug product delivery,” he states.
However, Lebl sees a need for better automated integration of operator training, with real-time verification to ensure that training is up to date. Currently, he says, this is typically done manually. Scheduling is another highly manual process today; in many cases it is still paper-based, relying on humans to connect performance data from the equipment with planned adjustments.
Smart modular data-integration packages can be integrated easily and quickly into distributed control system (DCS) and supervisory control and data acquisition (SCADA) systems. This enables faster build-up times and changeovers and allows the modular skids to be used in flexible manufacturing, according to a supplier. Enhanced speed and flexibility, in turn, reduce capital and operational expenditures (CAPEX and OPEX).
Normally, process-related data are gathered and stored at the process-management level in a historian, such as OSIsoft PI (PI Process Information system). From this store, historical data can be used to perform sophisticated data analytics.
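As a minimal illustration of historian-based analytics (this is not the OSIsoft PI API; the tag values, batch IDs, and function names below are made up), time-stamped samples exported from a historian can be grouped by batch to yield per-batch statistics for retrospective comparison:

```python
# Illustrative sketch: derive per-batch summary statistics from
# time-stamped process values exported from a historian.
from collections import defaultdict
from statistics import mean

# (batch_id, timestamp_hours, value) triples, e.g. dissolved-oxygen readings
samples = [
    ("B-1001", 0, 38.2), ("B-1001", 1, 39.0), ("B-1001", 2, 40.1),
    ("B-1002", 0, 37.5), ("B-1002", 1, 41.8), ("B-1002", 2, 43.0),
]

def per_batch_mean(rows):
    """Group historian samples by batch and return each batch's mean value."""
    groups = defaultdict(list)
    for batch, _, value in rows:
        groups[batch].append(value)
    return {batch: round(mean(vals), 2) for batch, vals in groups.items()}

print(per_batch_mean(samples))  # → {'B-1001': 39.1, 'B-1002': 40.77}
```

Real deployments would query the historian directly and compute richer statistics, but the group-then-aggregate pattern is the core of batch-to-batch analytics.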
After control, the next use for automated system data will be predictive analytics, says Lebl. “This approach can fine-tune the production process, and flag and correct issues before they arise,” he says.
He also sees a growing role for real-time release (RTR), which uses analytics to test batch quality automatically during the production process, which could significantly reduce the time needed for final qualification steps and product release. “RTR entails higher up-front cost for the equipment and start-up/validation, but allows batches to be released much faster with less human intervention,” Lebl says.
The need for integration between and among automated systems also has an impact on data collection and information flow. “One of the most prominent threats vis-à-vis automated production is the potential for compromised data integrity,” Lebl cautions.
When data are shared more openly and automatically, they become potentially vulnerable, he says. To ensure security, therefore, companies are putting much effort into information technology (IT)/operational technology (OT) infrastructure and cybersecurity. “Some pharma companies are teaming up to create standardized approaches to IT/OT network infrastructure design and setup to prevent unauthorized intentional and unintentional access,” comments Lebl.
Easing data flow
The potential for higher volumes of data collection and the need for more efficient, seamless data flow are challenges that manufacturers face. Service providers will need to incorporate solutions for easing data collection and flow.
“Our ultimate goal is a ‘plug-and-play’ system of automated bioproduction in which vendors, suppliers, and manufacturers all use equipment that ‘recognize’ each other, allowing for simple integration and data sharing,” says Lebl. “At the moment, for each piece of production equipment that we bring online, we need to create customized software and interfaces—that leads to higher costs and longer production time.”
As part of its solutions troubleshooting, Lonza participates in the BioPhorum Operations Group (BPOG) initiative, a company-to-company consortium that aims to develop best practices and user requirements between manufacturers and supply partners. “As we work to develop consensus around approaches and standards to production, the industry stands to benefit, along with the people we serve globally,” Lebl says.
Further, process historians are needed to store and transfer data safely, allowing fast, easy evaluation of data within past and current batches and comparisons between batches, a supplier adds. For example, a real-time monitoring system can extract data and use a statistical model to create user-friendly process trajectories that show process consistency. Operators can use such a system to spot process variance early and perform a graphical root-cause analysis on the plant floor, detecting the source of the problem and thus preventing process deviations and lost batches. The ability to do this on the fly allows operators to bring a process back under control before variance turns into a critical quality deviation.
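One simple form such a statistical model can take is per-timepoint control limits derived from historical "good" batches. The sketch below is a minimal illustration under that assumption (the trajectories, limit formula, and function names are hypothetical, not a specific vendor's monitoring product): a live trajectory is checked sample by sample against mean ± 3-sigma bands, and the first excursion is flagged.

```python
# Minimal sketch of real-time trajectory monitoring against control limits
# computed from historical batch data (mean ± 3*sigma per timepoint).
from statistics import mean, stdev

def control_limits(good_batches):
    """Per-timepoint (low, high) limits from historical batch trajectories."""
    limits = []
    for point in zip(*good_batches):  # values at the same timepoint
        m, s = mean(point), stdev(point)
        limits.append((m - 3 * s, m + 3 * s))
    return limits

def first_excursion(trajectory, limits):
    """Index of the first sample outside its limits, or None if in control."""
    for i, (value, (lo, hi)) in enumerate(zip(trajectory, limits)):
        if not lo <= value <= hi:
            return i
    return None

# Three historical batches of a process value sampled at three timepoints
historical = [[5.0, 5.2, 5.4], [5.1, 5.3, 5.5], [4.9, 5.1, 5.3]]
limits = control_limits(historical)
print(first_excursion([5.0, 5.2, 9.0], limits))  # → 2
```

Industrial systems typically use multivariate models rather than univariate 3-sigma bands, but the early-detection logic is the same: flag the excursion while there is still time to intervene.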
Integration of a modular package unit into an automation landscape was a big effort in the past. The introduction of classical open platform communications (OPC) has partly simplified integration. OPC unified architecture (UA) is now being implemented as the next generation of integrated automation infrastructure, particularly for modular package units and SCADA and DCS systems, according to a supplier.
Though classical OPC and OPC UA provide a good backbone for data transfer, there remains little standardization of the content provided, the supplier points out.
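The content gap can be made concrete with a small sketch. All node IDs, vendor names, and tag names below are invented for illustration: because each skid vendor may expose the same physical quantity under a different OPC node name, integrators today often insert a mapping layer that renames vendor-specific nodes into one standardized tag vocabulary before the data reaches the SCADA/DCS level.

```python
# Illustrative mapping layer: translate vendor-specific OPC UA node IDs
# into a standardized parameter vocabulary. Node IDs are hypothetical.
VENDOR_A = {"ns=2;s=Skid1.TT01.PV": 36.9, "ns=2;s=Skid1.PH01.PV": 7.1}
VENDOR_B = {"ns=3;s=TempSensor/Value": 37.0, "ns=3;s=pHProbe/Value": 7.0}

# Per-vendor maps from proprietary node IDs to standardized tag names
NODE_MAPS = {
    "vendor_a": {"ns=2;s=Skid1.TT01.PV": "temperature_C",
                 "ns=2;s=Skid1.PH01.PV": "pH"},
    "vendor_b": {"ns=3;s=TempSensor/Value": "temperature_C",
                 "ns=3;s=pHProbe/Value": "pH"},
}

def normalize(vendor, raw_values):
    """Rename vendor-specific OPC node IDs to standardized parameter names."""
    node_map = NODE_MAPS[vendor]
    return {node_map[node]: value for node, value in raw_values.items()}

print(normalize("vendor_a", VENDOR_A))  # → {'temperature_C': 36.9, 'pH': 7.1}
```

Standardized content models would make such hand-maintained maps unnecessary, which is precisely what the working groups described below are pursuing.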
Several working groups are currently developing standards and interface definitions for modular package units and SCADA/DCS systems. These initiatives share the common goal of improving the interoperability of modular package units with SCADA/DCS systems, to reduce integration effort, shorten factory build-up times, and enable fast changeover between processes, the supplier says.
In addition to BPOG’s work, the International Society for Pharmaceutical Engineering (ISPE), the Standardization Association for Measurement and Control in Chemical Industries (NAMUR), and the German Electrical and Electronic Manufacturers’ Association (ZVEI) are collaborating on modular systems and working on various plug-and-manufacture approaches.