Connected, integrated bioprocessing enterprises with greater data analytics capabilities are coming.
By Cynthia A. Challener
As upstream processing drives toward higher efficiencies in meeting demand, it requires optimizing the current state of manufacturing with the goal of achieving real-time release and an adaptive plant. Automation is necessary to achieve these goals, according to Amy Doucette, global head of pharma solutions engineering and business development for the automated products group of Applied Materials. “The confluence of big data with an increase in computing power provides companies the opportunity to conquer high-value problems, which historically were too complex and unscalable to solve,” she says.
Throughout the pharma industry, not just in upstream bioprocessing, there is, adds Dorizon Navarro, life sciences industry consultant at Rockwell Automation, an explosion of the integration of multiple different technologies to provide functional capabilities that not long ago seemed infeasible. “Industry 4.0 and advancements in analytics call on automation systems to act as data sources and smart devices for artificial intelligence (AI) and data analytic systems doing everything from preventive maintenance predictions to real-time product release,” he says.
The COVID-19 pandemic has only accelerated these trends.
Bringing flexibility and increased process intelligence
Upstream bioprocess manufacturing is being discussed from many different angles, including automation. “Connected, continuous, and process analytical technology (PAT) solutions that bring more process intelligence and quality decisions to the factory floor are becoming more common,” asserts Marc Sinclair, principal automation engineer at Cytiva. For instance, he notes that standard solutions that give the flexibility to use the bioprocess hardware in multi-product environments are expected, as are solutions that can be rapidly deployed and easily scaled, thus helping manufacturers get to market faster. A key component of those solutions, adds Chris Sandusky, director of automation solutions and product and lifecycle management for Cytiva, is more effective data management that allows for simpler, data-driven decision making.
Specific examples include the adoption of single-use systems, particularly mobile systems designed for ballroom manufacturing, and solutions that address the unique process control challenges associated with continuous processing, according to Navarro. New therapeutic modalities, most notably personalized medicines, also present unique automation challenges that cannot be solved using more traditional batch monoclonal antibody automation systems. “Logistical integrations and introduction of benchtop equipment require automation to venture into realms previously untrod,” Navarro observes. “The desire to produce ‘batches of one’ presents very special automation challenges when it comes to both automation and batch validation,” he continues. Engineered solutions utilizing closed systems and integrated disposable instruments only further complicate the functionality automation provides.
Automation creates unique sampling needs as well, according to Brian Follstad, director of upstream process development at Catalent. In process development labs, for instance, he notes that the use of high-throughput culture systems such as the ambr15 and 250 from Sartorius has driven the need to automate sample handling and analytics. “A single ambr15 experiment generates 48 samples daily, each requiring full analytics with the potential for the resulting data to impact cell culture feed decisions. Automating these activities would not only help reduce laboratory resource constraints and improve data integrity, but enable the incorporation of more advanced feed strategies involving PAT and product attribute control (PAC),” Follstad asserts.
Increased interest in PAT has, in fact, driven an uptick in the use of inline sensors for real-time monitoring to maintain control of high-quality bioprocessing and improve data analytics, according to Doucette. As these technologies have advanced and automation has increased, the quality of the data has improved and data are now better leveraged into clearer process understanding, enabling advanced process control (APC), adds her colleague Lucas Vann, global CTO for pharma solutions engineering within the automated products group of Applied Materials.
“Automation feedback control loops and advanced machine learning models can be used as part of this approach for model predictive control. Integrated process alerts help to determine prescriptive actions for critical events such as feed times or parameter shifts. The data can also be utilized to ratify which sensors are behaving accurately, including confirmation of chemometric model predictions made via Raman or near infrared (NIR) spectroscopy techniques,” Vann explains.
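Vann’s point about ratifying sensor behavior against chemometric predictions can be sketched in a few lines. The Python sketch below compares an inline reading with a model prediction and flags disagreement beyond a tolerance; the sensor names, values, and the 10% tolerance are all illustrative assumptions, not any vendor’s API.

```python
# Hypothetical sketch: cross-check inline sensors against chemometric
# (e.g., Raman- or NIR-based) model predictions and flag divergence.
# Sensor names, values, and the 10% tolerance are illustrative.

def sensor_agrees(sensor_value: float, chemometric_pred: float,
                  rel_tol: float = 0.10) -> bool:
    """True if the inline reading matches the model prediction
    to within rel_tol (fractional difference)."""
    if chemometric_pred == 0:
        return sensor_value == 0
    return abs(sensor_value - chemometric_pred) / abs(chemometric_pred) <= rel_tol

def check_sensors(readings: dict[str, tuple[float, float]]) -> list[str]:
    """Flag sensors whose readings disagree with model predictions.
    readings maps sensor name -> (inline reading, model prediction)."""
    return [name for name, (reading, pred) in readings.items()
            if not sensor_agrees(reading, pred)]

alerts = check_sensors({
    "glucose_g_per_L": (4.8, 5.0),   # within 10% of prediction -> OK
    "lactate_g_per_L": (2.9, 1.8),   # diverged -> alert
})
print(alerts)  # ['lactate_g_per_L']
```

In practice such a check would feed the integrated process alerting Vann describes, with a flagged sensor triggering a prescriptive action rather than a simple print.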
Blend of advances in hardware, software, equipment, and sensors/analytics
Flexible technology deployments have in general enabled products to be scaled up or scaled out more rapidly and thus launched more quickly with predictable cost, quality, and schedule, according to Sandusky. “Hardware advances give the opportunity for higher cell densities and improved scalability, software enables the flexible use of these systems and speeds their deployment, and sensors and analytics help bring additional intelligence to the process to drive faster decisions regarding product quality,” Sinclair says.
Machine learning and AI capabilities are moving to the production environment, and as mobile equipment gets more advanced and cheaper, facility floor workers are being provided access to augmented reality (AR) technology, according to Navarro. In addition, he notes that computer processing technology has advanced sufficiently to allow some complex analytics to occur at the device level. Cloud services, meanwhile, have been increasingly adopted by life sciences companies, allowing for as-a-service subscription models. “This approach can reduce capital project costs, as well as reduce overhead as systems can be maintained by the vendor,” Navarro says.
In process development labs, Chris Demers, senior data process scientist – automation with Catalent, points to the introduction of automated sampling systems such as Lonza’s MAST, SecureCell’s Numera, or Flownamics’ SegFlow as important for drastically reducing, and in some instances completely eliminating, manual daily bioreactor sampling. “These systems integrate with metabolite analyzers and cell counters and can be used for fraction collection for product quality or direct integration with high-performance liquid chromatography instruments for at-line product concentration measurements, blurring the lines between historical upstream, downstream, and analytics labs,” he states.
The addition of on-line sensors and analytics is also important for incorporating PAT for PAC, according to Vann. “PAT enables upstream process development teams to ‘close the loop’, meaning that a fully automated feedback control loop can be implemented to control nutrient additions to a bioreactor. The key component to all of this is software capable of integrating the variety of lab equipment and resulting data allowing for multivariate data analysis and process control,” he says.
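The closed-loop nutrient addition Vann describes can be illustrated with a minimal proportional feed controller: the further a PAT-derived glucose reading falls below its setpoint, the faster the feed pump runs. The setpoint, gain, and rate limit here are invented placeholders, not validated process values.

```python
# Illustrative sketch of a closed-loop nutrient feed: a proportional
# controller computes a feed-pump rate from a PAT-derived glucose
# reading. Gains, setpoints, and units are hypothetical placeholders.

def feed_rate(glucose_g_per_L: float, setpoint: float = 4.0,
              gain: float = 2.5, max_rate: float = 20.0) -> float:
    """Return feed-pump rate (mL/h): feed harder the further glucose
    falls below the setpoint; never feed when at or above it."""
    error = setpoint - glucose_g_per_L
    return min(max_rate, max(0.0, gain * error))

print(feed_rate(2.0))  # below setpoint -> 5.0 mL/h
print(feed_rate(4.5))  # above setpoint -> 0.0
```

A production system would layer model predictive control and alarm limits on top of this; the sketch only shows the basic sense-compute-actuate loop.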
Notable recent developments identified by Vann involve the incorporation of advanced sensors explicitly for control strategies instead of being used only for monitoring purposes. “Software is now providing the ability to implement hybrid models that combine standard data with PAT data, including mechanistic relationships. This ability has enabled machine learning (ML) to be applied to the small number of large volume batches that often exist in large-scale bioprocessing facilities. Hybrid models have also been used for predicting yields and endpoints with the integration of optimization algorithms for model predictive control in APC strategies,” he comments.
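A hybrid model of the kind Vann describes pairs a mechanistic term with a data-driven correction learned from historical batches. The toy sketch below uses a first-principles titer estimate (specific productivity times integrated viable cell density) plus a constant residual correction standing in for a trained ML model; all constants and batch data are invented.

```python
# Minimal hybrid-model sketch: mechanistic titer estimate plus a
# data-driven correction fitted to residuals of historical batches.
# The productivity constant and batch data are illustrative only.
from statistics import mean

def mechanistic_titer(viable_cell_days: float, qp: float = 0.02) -> float:
    """First-principles estimate: titer ~ specific productivity * IVCD."""
    return qp * viable_cell_days

def fit_residual_correction(history: list[tuple[float, float]]) -> float:
    """Fit a constant offset from (IVCD, measured titer) pairs -- the
    simplest possible correction, standing in for a trained ML model."""
    residuals = [measured - mechanistic_titer(ivcd) for ivcd, measured in history]
    return mean(residuals)

def hybrid_titer(viable_cell_days: float, correction: float) -> float:
    """Mechanistic prediction adjusted by the learned correction."""
    return mechanistic_titer(viable_cell_days) + correction

history = [(100.0, 2.3), (120.0, 2.7), (140.0, 3.1)]
corr = fit_residual_correction(history)
print(round(hybrid_titer(110.0, corr), 2))  # -> 2.5
```

The same structure, with the constant offset replaced by a regression or neural network on PAT data, is what makes ML tractable on the small number of large-volume batches Vann mentions: the mechanistic term carries most of the signal, so the data-driven part has less to learn.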
Such innovations, Vann stresses, are critical to increasing digital maturity to the point where predictions can enable adaptation to inherent process variability so that consistent performance and quality are maintained. Indeed, Doucette notes that by taking advantage of Industry 4.0 technology, including smart sensors and AI/ML, biopharma organizations can reduce mundane tasks with decision automation and focus on capturing process knowledge, reducing performance variability, and facilitating continuous learning for root-cause analysis of failures and process improvements. The result is more streamlined efforts that save time and money while still ensuring high-quality processes and products.
COVID-19 as an accelerator of adoption
The COVID-19 pandemic has impacted the biopharmaceutical industry in multiple ways. It has highlighted the dependence the world has on the industry and how important it is for biopharma to evolve by adopting tools for both integrating and orchestrating complex digital foundations, according to Doucette. It has also, notes Bruce Kane, life sciences industry consultant for Rockwell Automation, forced the industry to find creative solutions to previously unforeseen problems. “While an evolution was already happening, the pandemic has been that serious push that the industry needed to fast-track digitalization projects and adopt next-generation manufacturing technologies,” Doucette agrees.
As with the 2009 flu pandemic, Vann believes the current pandemic has driven interest in improved process understanding, performance, and quality. “Companies are looking to reduce process development times and achieve more rapid product releases. There is also an increased demand for improved scheduling across the entire supply chain where ML can be utilized to better predict demand and subsequently update manufacturing schedules to maintain target stock supplies,” he comments. Consequently, there is clear demand for advanced analytics that are integrated from the process development stage to increase the speed of the implementation of model-based control at the production scale and available as platform approaches that deliver end-to-end solutions, according to Vann.
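The scheduling idea Vann mentions, predicting demand and updating manufacturing schedules to maintain target stock, can be sketched with a simple moving-average forecast standing in for an ML model. Batch size, stock targets, and demand figures below are invented for illustration.

```python
# Toy sketch of demand-driven scheduling: forecast next-period demand
# with a moving average (standing in for an ML model), then compute how
# many batches to schedule to hold a target stock level. All figures
# are invented placeholders.
from math import ceil

def forecast_demand(history: list[float], window: int = 3) -> float:
    """Moving-average forecast of next-period demand."""
    return sum(history[-window:]) / window

def batches_to_schedule(history: list[float], on_hand: float,
                        target_stock: float, batch_size: float) -> int:
    """Batches needed so stock after forecast demand meets the target;
    never negative."""
    needed = forecast_demand(history) + target_stock - on_hand
    return max(0, ceil(needed / batch_size))

demand = [90.0, 110.0, 100.0]  # units consumed per period
print(batches_to_schedule(demand, on_hand=150.0,
                          target_stock=120.0, batch_size=50.0))  # -> 2
```

A real implementation would replace the moving average with a trained forecasting model and feed the batch count into plant scheduling software, but the sense-predict-reschedule pattern is the same.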
Follstad adds that the pandemic has in certain ways also made the adoption of automation more difficult as companies attempt to balance the speed of technology transfer with new technology implementation. Even so, he says the emergence of COVID-19 has made a very clear business case for the need for automation. “In the lab, for instance, automation technologies support remote work, allowing lab technicians to focus on data analysis and interpretation, rather than collecting data within the laboratory,” Follstad remarks.
The need for social distancing has also significantly increased the need for and adoption of remote monitoring of automation equipment within the plant, according to Kane. The pandemic has therefore, notes Demers, created a larger market for companies offering systems that enable remote support, monitoring, and AR, allowing manufacturing operations support teams not only to have more eyes on the process, but also to work in a fully remote manner.
In fact, the technologies that provide these capabilities are experiencing an acceleration of functionality as more visual data, AI, AR, and virtual reality (VR) capability is integrated into the automation process, according to Kane. “The result has been the ability to employ different approaches for technology transfer that keep collected data and provide more accurate and transparent analysis through the use of advanced analytics tools and thus enable much more rapid movement of processes from product development to production and from one site to another,” he observes. Overall, he believes the pandemic helped to increase the acceptance of digitalization and the advances that digital transformation has to offer the entire biopharma industry.
Has remote monitoring been realized?
There has been a lot of buzz around the concept of remote monitoring, particularly in light of the COVID-19 pandemic. “In the COVID-19 world, remote monitoring allows more options for support of the equipment in real-time, and therefore, more manufacturers are evaluating options to remotely monitor equipment that is connected to their process control networks,” Sinclair observes.
One remote activity clearly adopted in 2020, according to Kane, is remote collaboration, the capability to have non-essential personnel support operations completely remotely. In some cases, these activities, which can include providing guidance and training to facility workers, proceed via remote connections available in the automation systems, while in others they take place using AR apps for smart phones and tablets.
Remote monitoring is a broad concept, however, and biopharmaceutical companies sit at very different points along the digital transformation journey, making it difficult to reach a definitive conclusion about how widely it has been realized. The adoption of the Internet of Things (IoT) and smart devices has certainly enabled quicker and more reliable remote monitoring, according to Kane. Digital technologies have also, he notes, allowed for the real-time analysis and performance monitoring of deployed equipment to realize a greater level of support and equipment utilization.
Most small to medium biopharma companies have, at a minimum, remote access (via virtual private networks) to continuous real-time data historian applications that can be used to visualize a wide range of critical process parameters but without any historical context, Demers adds. Companies further along may also have remote monitoring systems that involve predictive analytics that draw on historical trends or pull in PAT data from various software platforms to use for multivariate analysis. “The COVID-19 pandemic has created a large push for the adoption of some of these predictive remote monitoring systems, not only because of the need for remote working, but also because of the speed and agility with which everyone has been working,” says Demers.
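The predictive remote monitoring Demers describes, drawing on historical trends, can be illustrated with a basic statistical process check: flag a critical process parameter when its latest value falls outside control limits derived from its own recent history. The 3-sigma rule and pH values below are illustrative, not a validated control strategy.

```python
# Hedged sketch of a trend-aware remote check: flag a critical process
# parameter whose latest value lies outside control limits computed
# from its own history. Thresholds and values are illustrative.
from statistics import mean, stdev

def out_of_control(history: list[float], latest: float,
                   n_sigma: float = 3.0) -> bool:
    """True if latest lies outside mean +/- n_sigma * stdev of history."""
    mu, sigma = mean(history), stdev(history)
    return abs(latest - mu) > n_sigma * sigma

ph_history = [7.00, 7.02, 6.98, 7.01, 6.99, 7.00]
print(out_of_control(ph_history, 7.01))  # within limits -> False
print(out_of_control(ph_history, 7.30))  # excursion -> True
```

This is the historical context the basic historian view lacks: the same reading of 7.01 is unremarkable against this history, while 7.30 warrants a remote alert.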
Applied Materials has clients leveraging remote monitoring, whether by manufacturing personnel on lunch breaks or managers working from home to ensure their process continues to stay in a state of control. Progress is still needed, though, asserts Vann, particularly with respect to usability features. “Successful implementation requires the ability to meet critical functional demands based on specific user group definitions. The ability to have dashboards that provide the ideal user experience is instrumental so that monitoring is geared towards functional roles. There must be functionality that benefits everyone from high-level management monitoring key performance indicators to operators monitoring critical process parameters and acting on prescriptive actions resulting from predictive model outputs,” he says.
Challenges to adoption remain
Despite the COVID-19 pandemic highlighting the need for further digitalization and automation, many barriers to the adoption of these advanced technologies for upstream bioprocessing—and across the pharma industry—remain, including ensuring the security and integrity of remotely accessed data; the cost and schedule of automated solutions; the potential for increased operational complexity, contamination risks, batch-characterization difficulties, and process-integration issues; infrastructure challenges; regulatory authority resistance; and access to people with the right combination of skills and abilities.
The first step in overcoming these hurdles is to have a solid, high-quality digital foundation to gather and use data, according to Doucette. It is also important to clearly identify goals and define the best fit for the automation and remote monitoring capabilities to be implemented, Navarro adds. He also notes that industry groups such as the National Institute for Innovation in Manufacturing Biopharmaceuticals and the BioPhorum Operations Group are bringing regulatory agencies into the development of new technologies to gain guidance and facilitate adoption.
The widespread use of multivariate data analysis (MVDA) and the holistic evaluation of entire processes, combined with the adoption of spectroscopic PAT, have created the need for new skillsets, according to Demers. “The net result of all these efforts in MVDA and PAT is that we are generating data on an even larger scale, and data handling at this scale is typically the realm of data engineers and data scientists, who may have limited process knowledge. There is a need for chemometricians not only skilled at pre-processing of spectra, but who also have process knowledge. Close collaboration and knowledge exchange between data engineers and process engineers is required to overcome these hurdles,” he asserts.
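The spectral pre-processing Demers alludes to can be illustrated with standard normal variate (SNV) correction, a common chemometric step that removes scatter effects by centering and scaling each spectrum; the numbers below are toy values, not real Raman data.

```python
# Standard normal variate (SNV) correction, a common chemometric
# pre-processing step: center each spectrum to zero mean and scale it
# to unit standard deviation. The input values are toy numbers.
from statistics import mean, stdev

def snv(spectrum: list[float]) -> list[float]:
    """Return the SNV-corrected spectrum (zero mean, unit std dev)."""
    mu, sigma = mean(spectrum), stdev(spectrum)
    return [(x - mu) / sigma for x in spectrum]

raw = [10.0, 12.0, 14.0, 16.0]
corrected = snv(raw)
print([round(x, 3) for x in corrected])  # -> [-1.162, -0.387, 0.387, 1.162]
```

Steps like SNV are exactly where the process knowledge Demers calls for matters: which correction to apply depends on the physical cause of the scatter, not just the statistics.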
To further adopt advanced automation and allow for effective remote monitoring in general, Vann believes a comprehensive platform must be implemented with a holistic approach for overcoming challenges. “While current monitoring capabilities are advancing, there is still a need for solutions that use the data generated in a more effective manner. That will require integration of different monitoring systems and functionality to perform data analytics so that process knowledge and understanding can be generated in real time,” he observes.
Automation continues to progress, and the biggest advances involve the ability to re-use batch data for scale-up and to have equipment with functionality that improves when combined with other equipment, according to Sandusky. He points to AR for the training and operation of equipment, complete knowledge transfer from process development to manufacturing, and the application of robotics in manufacturing as key examples.
Going forward, the increased volume of data collected due to automation must be balanced with the ability to properly analyze and interpret such large data sets, Follstad adds. “The move from spreadsheets and single variable plots to databases and multivariate visualizations will require the integration of computer programming and statistical analysis experts into cell culture groups, which currently comprise only biology and engineering experts,” he says.
Achieving digital plant maturity is a journey, but companies can move toward next-generation automated pharma manufacturing today by taking a few critical steps, stresses Doucette. They include ensuring connectivity by integrating data silos; implementing prediction tools with meaningful alert-to-action, with real-time data analytics and scheduling visualization enabling transparency across the network and yielding insights for more informed decisions; and running an adaptive plant that is agile and equipped to automatically optimize schedules and control manufacturing with minimal intervention.
Technologies that make these steps possible will be increasingly platform-driven, according to Vann, even for personalized medicines such as cell and gene therapies. “Increasing the production volumes of these processes will require scale-out rather than scale-up, which changes the nature of manufacturing bottlenecks. Implementation of full platform solutions will be required to deliver control at the parameter level and allow for integration across unit operations so data can be fed into advanced sense and respond scheduling software,” he explains.
In fact, Kane envisions new technologies and solutions being developed that use modularization and allow for plug-and-play capability even with solutions from different vendors. “Having IoT platforms, smart devices, and collaborative robots (cobots) in the production environment along with the introduction of product lifecycle management (PLM) solutions will increase the industry’s ability to drive change and create more agile manufacturing environments,” he asserts.
In the long term, Kane adds that AI will be more prominent in pharma production facilities and blockchain technology could also impact manufacturing. “Blockchain will be an enabling technology applicable throughout life-science manufacturing. Anywhere data integrity and traceability are critical, blockchain could be employed to provide unalterable records. Manufacturing data, chain of custody data, product lifecycle data, and many others could all benefit from blockchain’s features,” he says.
Demers, meanwhile, sees PAT such as Raman spectroscopy being used for not just process monitoring, but bioprocess control and automation. “Longer term there will be sufficient established PAT solutions, particularly in upstream bioprocessing, to have fully ‘lights-out’ manufacturing, with samples pulled automatically from bioreactors and inline probes monitoring cell health and product quantity,” he predicts.
In general, across all manufacturing operations, Rockwell Automation sees a trend in automation toward more integration and a more connected enterprise. “The convergence of information technology with operational technology, PLM, and the implementation of digital threads is breaking down the traditional silos of past automation architectures,” Navarro observes. IoT-enabled devices, enhanced analytic tools, ML and AI, cobots, and extended-reality environment technologies will be fundamental to the future development of bioprocess automation solutions, he adds.
Indeed, automation and digital innovation will continue to be applied to new and existing bioprocess solutions. “These innovations will help differentiate the equipment by embedding more features, making the equipment smarter, and increasing the understanding that customers have of their process. These innovations will allow process optimization, process scale up, process scale out, and will ultimately reduce the time to bring products to market,” Sinclair concludes.