July 2, 2019

Addressing the Complex Nature of Downstream Processing with QbD


Quality by design brings both challenges and benefits to the development of downstream processes.


By Susan Haigney



Regulators have been encouraging bio/pharmaceutical companies to incorporate the concept of quality by design (QbD) into development and manufacturing processes for more than a decade. The International Council for Harmonisation (ICH) defines QbD as a systematic approach that incorporates prior knowledge, results of studies using design of experiments (DoE), quality risk management, and knowledge management throughout a product’s lifecycle (1). QbD incorporates the identification of critical quality attributes (CQAs) through a quality target product profile (QTPP). Critical material attributes (CMAs) and critical process parameters (CPPs) are identified through product design and understanding. Specifications for the drug substance, excipients, and drug product, and controls for each manufacturing step, are determined through a control strategy. The final elements of QbD are process capability and continual improvement (2). Together, these steps build quality into pharmaceutical processes and products over the product’s lifecycle.
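
To make the relationships concrete, the QbD hierarchy can be pictured as a linked structure in which a QTPP drives CQAs, and each CQA is tied to the process parameters that can move it. The following minimal Python sketch illustrates only that linkage; the product, attribute, and parameter names are invented for illustration and are not drawn from any regulatory template.

```python
from dataclasses import dataclass, field

@dataclass
class CQA:
    """A critical quality attribute with its acceptance criterion."""
    name: str
    acceptance_criterion: str
    linked_cpps: list = field(default_factory=list)  # parameters that can affect it

@dataclass
class QTPP:
    """Quality target product profile: the design goals for the product."""
    product: str
    cqas: list = field(default_factory=list)

# Invented example: one CQA for a hypothetical monoclonal antibody.
qtpp = QTPP(product="mAb-X")
qtpp.cqas.append(CQA(
    name="high-molecular-weight species",
    acceptance_criterion="<= 1% by size-exclusion chromatography",
    linked_cpps=["CEX load ratio", "elution pH", "elution conductivity"],
))

# A control strategy then assigns a control (specification, in-process test,
# or operating range) to each linked parameter for every CQA.
for cqa in qtpp.cqas:
    print(cqa.name, "->", cqa.linked_cpps)
```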

The industry has been slowly adopting the QbD approach, but with the boom in the biopharma industry, how have QbD principles fit into the complex nature of biologics? According to Gunnar Malmquist, principal scientist at Cytiva, QbD has become an integral part of the development process for the biopharma industry. “We notice that the interest in filing according to the QbD framework has cooled off during the past several years, but it is common during biopharmaceutical development to utilize and align with the principles of QbD as part of development activities. Nowadays, it is a structured methodology for how to approach product development that is driven not by regulatory need but by internal needs related to the establishment of process understanding,” says Malmquist.

Challenges of QbD in downstream processing

The complex nature of biologics brings more complex quality concerns. Joey Studts, director of late-stage downstream development in the bioprocessing development biologicals department at Boehringer Ingelheim (BI), notes that large-molecule products have more input parameters that could possibly affect quality. “From my understanding, the number of input parameters that can have a significant impact on product quality is higher in the large-molecule world due to the multiple biologically based upstream unit operations and the complexity of assigned direct correlations,” says Studts.

Therefore, using QbD to design downstream processes has its challenges. One challenge is the unpredictable relationships between molecular properties and downstream processes, but there is also the opportunity to concentrate efforts on the areas that are most important, says Malmquist. “Especially for novel molecular formats, one of the remaining challenges is the complex relationships between molecular properties and downstream processes,” explains Malmquist. “In addition, the inherent high risk for drug failure at early stages of development combined with short development timelines suggests the need for a platform approach to manage the early phases of drug development and conduct more detailed characterization towards late-stage process development,” he says.

To address this challenge, Malmquist recommends relying on process steps that are less exposed to risk to avoid unpredictability. “For instance, protein A or affinity resins in general, as well as flow-through steps, display these properties since these steps in most cases build on a fundamental understanding of the physicochemical phenomena involved in their performance,” he says.

The removal of product-related impurities is another challenge, according to Malmquist. “In these instances, characterization of the interplay and impact of process parameters and raw material attributes is required to fully align with the QbD methodology, which translates into extended process development and characterization efforts. The effort can be reduced by a risk-based approach where the studies are focused on the most important (or less well-known) factors.” To address this, mechanistic modeling and other emerging techniques can reduce the experimental burden and concentrate efforts on the attributes displaying the highest risks, Malmquist says.
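
Mechanistic modeling in this context typically means calibrating a physical adsorption model so that some wet-lab runs can be replaced by simulation. The sketch below is not Cytiva’s method; it integrates a generic single-component Langmuir batch-binding model, with all rate and capacity values invented, purely to show the shape of such a calculation.

```python
from scipy.integrate import solve_ivp

# Single-component Langmuir batch-binding kinetics:
#   dq/dt = k_ads * c * (q_max - q) - k_des * q
# with the liquid-phase mass balance c = c0 - phase_ratio * q.
# All parameter values are illustrative placeholders.
k_ads, k_des = 0.5, 0.05   # adsorption (mL/mg/min) and desorption (1/min) rates
q_max = 60.0               # resin capacity, mg per mL resin
c0 = 5.0                   # initial liquid-phase concentration, mg/mL
phase_ratio = 0.1          # mL resin per mL liquid

def rhs(t, q):
    c = c0 - phase_ratio * q[0]  # protein remaining in solution
    return [k_ads * c * (q_max - q[0]) - k_des * q[0]]

sol = solve_ivp(rhs, (0.0, 120.0), [0.0])
print(f"bound protein after 2 h: {sol.y[0, -1]:.1f} mg/mL resin")
```

Once such a model is fitted to a handful of experiments, parameter ranges can be screened in silico and only the highest-risk conditions confirmed at the bench.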

“Typically, process parameters and their variability are well characterized, whereas the interplay between them and, for instance, resin variability may represent a blind spot,” says Malmquist. The impact of raw-material variability is often first felt in commercial manufacturing because of a lack of relevant test material during process characterization. “The only viable approach to become more proactive is to engage in a true partnership with the raw material suppliers to share knowledge and get access to relevant samples to develop a successful control strategy,” he adds.

When it comes to early development, Doug MacDonald, senior scientist at Seattle Genetics, a biotech company that develops and commercializes cancer therapies, states that “speed to clinic” can inhibit the application of QbD. “Our typical IND [investigational new drug] timelines assume that the protein will fit the platform and therefore won’t require a lot of additional development efforts. We have definitely seen therapeutic proteins becoming more complex and have had challenges at times with the downstream process. In one case, we had an engineered antibody, which possessed properties that were not entirely amenable to our platform purification process.”

The company developed a new platform process for these types of proteins by doing a large amount of the work early on. “The knowledge we gained during that process will definitely help in the later potential scale-up and commercialization of the product; however, having more of a characterized operating space is not typical in the [Phase I] development process. It is important for QbD to link early and late-stage processes, and to address this we have changed our platform to be representative of the potential commercial process. We have also developed many high-throughput tools applicable to process and product development within upstream, downstream, formulations, and analytical development,” MacDonald says.

The benefits of QbD in downstream processing

When developing processes for downstream applications, companies are using QbD to set and track development goals and to evaluate the risk of development processes and steps. Seattle Genetics takes a holistic approach to downstream process development, according to MacDonald. “In early development, we leverage platform data to expedite the time to produce tox material test article and clinical material. However, for later-stage and commercial, we employ detailed risk assessments based on FMEA [failure modes and effects analysis] concepts to guide the needed studies and scope of work. These risk assessments are influenced by platform data, previous process characterization knowledge, available GMP data, and subject matter expertise. DoE activities are applied where appropriate, and models are generated to characterize the study space and the possible impacts the process has on the product,” says MacDonald.
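
FMEA-style assessments are commonly operationalized as a risk priority number (RPN): the product of severity, occurrence, and detectability scores, with parameters above a threshold carried into characterization studies. A minimal sketch follows; the parameter names, scores, and threshold are all invented for illustration.

```python
# FMEA-style ranking: RPN = severity x occurrence x detectability (scores 1-10).
# Names, scores, and the threshold below are illustrative only.
parameters = {
    "CEX elution pH":        (8, 4, 3),
    "Protein A load ratio":  (5, 3, 2),
    "VF operating pressure": (9, 2, 7),
}

RPN_THRESHOLD = 80  # at or above: include in process characterization studies

ranked = sorted(parameters.items(),
                key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2], reverse=True)
for name, (sev, occ, det) in ranked:
    rpn = sev * occ * det
    action = "study" if rpn >= RPN_THRESHOLD else "monitor"
    print(f"{name:24s} RPN = {rpn:3d} -> {action}")
```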

QbD should also be used to document and track the progress of process development goals, according to Studts. “We use the therapeutic and clinical goals of the program as defined in the QTPP to execute a risk assessment of the quality attributes to clearly define the CQAs for the process and use these as a basis throughout development.” Studts says CQAs, process performance, and other goals should be written in a development protocol document. Experiments should then be designed and executed so that relationships can be defined between input and output parameters.
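
Defining relationships between input and output parameters usually takes the form of a factorial DoE fitted with a linear model. The sketch below builds a two-level full factorial in three coded factors and recovers the main effects by least squares; the factor names, effect sizes, and noise level are invented for illustration.

```python
import itertools
import numpy as np

# Two-level full factorial (coded -1/+1) in three illustrative factors.
factors = ["pH", "conductivity", "load ratio"]
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Simulated response (e.g., % aggregates) with made-up main effects plus noise.
rng = np.random.default_rng(0)
y = (1.0 + 0.40 * design[:, 0] + 0.15 * design[:, 1] - 0.05 * design[:, 2]
     + rng.normal(0.0, 0.02, len(design)))

# Fit intercept + main effects by ordinary least squares.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["intercept"] + factors, coef):
    print(f"{name:12s} effect = {b:+.3f}")
```

In a real study, the fitted effects (with confidence intervals) indicate which inputs actually move the output and therefore which operating ranges need tightening.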

QbD can be applied down to specific process levels as well, including the development of process materials. Cytiva uses QbD in the company’s development of resins. “Cytiva has for a long time used Design for Six Sigma, which is built on the same principles as QbD. For resin development at Cytiva, this means that external user needs are converted to functional properties that can be measured internally during resin development. These are matched to structural properties of the resin that play the same role as quality attributes in QbD. These potentially critical resin characteristics are always studied together with chromatographic process parameters at relevant process conditions during our development to ensure productivity and robustness,” says Malmquist.

Some downstream processes require more rigorous study than others, according to MacDonald. “Polishing steps such as ion exchange, hydroxyapatite, and hydrophobic interaction chromatography can be more heavily influenced by pH, conductivity, loading capacity, residence time, and temperature, and therefore would benefit more from a QbD approach. These steps are typically designed with more specific product attributes in mind and need more fine-tuning to achieve a goal of product- or process-related impurity or virus reduction,” says MacDonald.

Other steps, such as affinity chromatography, low-pH viral inactivation, and nanofiltration, may require less study when platform data already exist, says MacDonald. “There can always be a need to study these steps further, and the expectation is to do so at later-stage process characterization; however, the number of parameters requiring defined operating spaces may be reduced because of the nature/modality of the step. Nanofiltration is difficult and expensive to study since the designated CPP impact on a product CQA is the viral content, which can only be tested at approved CROs [contract research organizations].”

According to Malmquist, operations that have the greatest impact on the quality target product profile get the most benefit out of QbD. Understanding how the interplay between process parameters, raw material attributes, and the control strategy may affect CQAs could potentially reduce risk and improve development speed, says Malmquist.

“An example is the topic of product aggregates that can trigger immunogenic responses. It is therefore common to reduce aggregate content to below a threshold value using cation exchange, multimodal chromatography, or hydrophobic interaction chromatography. For this kind of step, it is important to understand the process parameters such as load ratio, buffer ranges, as well as resin ligand density when developing the control strategy,” says Malmquist.

While Studts believes all process steps benefit from QbD, platform-based unit operations with previously established input and output parameters are not “actively developed with QbD principles.”

“Process steps or unit operations where the quality attributes are impacted by input parameters within the step require product-specific data and therefore benefit from an active QbD approach. Regardless of whether platform or product-specific parameters are used, each unit operation benefits from having a clearly defined target or target range. With output target ranges clearly defined, the variability of the input parameters is tested to define a proven acceptable range (PAR). This PAR is then compared to the uncontrollable variability of the input parameter, or normal operating range (NOR). With these two input ranges set, and considering the equipment capability around the input parameter, the risk of the parameter is assessed and criticality assigned. The risk assessment and criticality assignment are then the basis for the control strategy,” says Studts.
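
The PAR-versus-NOR comparison Studts outlines lends itself to a simple decision rule: if the normal operating range sits well inside the proven acceptable range, the parameter carries low risk; if it approaches or exceeds the PAR, the parameter is escalated for criticality assessment. The sketch below encodes one such heuristic; the 20% margin rule and the example ranges are invented, not a regulatory definition.

```python
def classify(nor, par, min_margin=0.2):
    """Compare a normal operating range (NOR) to a proven acceptable range (PAR).

    nor, par: (low, high) tuples. The margin rule is an illustrative
    heuristic, not a regulatory definition of criticality.
    """
    nor_lo, nor_hi = nor
    par_lo, par_hi = par
    if nor_lo < par_lo or nor_hi > par_hi:
        return "NOR exceeds PAR -> tighten control or widen PAR"
    width = par_hi - par_lo
    margin = min(nor_lo - par_lo, par_hi - nor_hi) / width  # slack on tighter side
    return "low risk" if margin >= min_margin else "potential CPP -> assess criticality"

# Invented elution-pH example: NOR from equipment capability, PAR from studies.
print(classify(nor=(4.9, 5.1), par=(4.5, 5.5)))  # comfortably inside -> low risk
print(classify(nor=(4.6, 5.4), par=(4.5, 5.5)))  # little slack -> potential CPP
```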

QbD in viral clearance

Viral clearance is connected to patient safety, according to Malmquist. During downstream processing, virus inactivation, virus filtration, and chromatography steps are performed. Malmquist believes designing viral safety into processes from the beginning is of “high value … delaying the testing to comprise validation and at the same time reducing the risk for surprises at late-stage process development. By linking understanding of how viruses can be cleared at different process conditions, this information can be utilized across development projects to reduce team effort and increase speed,” he says.
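
The quantitative backbone of this exercise is the log reduction value (LRV): for each step, LRV = log10(virus load in / virus load out), and the claimed clearance for the process is the sum of LRVs over mechanistically distinct (orthogonal) steps. The sketch below shows the arithmetic with invented loads.

```python
import math

# LRV per step = log10(virus in / virus out); total clearance is the sum
# across orthogonal steps. All loads below are invented for illustration.
steps = {
    "low-pH inactivation": (1.0e9, 1.0e4),  # (virus in, virus out)
    "anion exchange":      (1.0e8, 1.0e3),
    "nanofiltration":      (1.0e7, 1.0e1),
}

total = 0.0
for name, (v_in, v_out) in steps.items():
    lrv = math.log10(v_in / v_out)
    total += lrv
    print(f"{name:22s} LRV = {lrv:.1f}")
print(f"{'total':22s} LRV = {total:.1f}")
```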

Platform knowledge may be used to design most viral clearance steps, and if the molecule performs within acceptable ranges, a control strategy can be set from historical data, according to Studts. “BI sees the implementation of platform knowledge to define NORs and PARs, as well as a control strategy, as QbD. In such cases, a few optimally designed experiments are executed to understand the sensitivity of the specific molecule to the unit operation and the platform parameters being implemented, and a full DoE-based series of experiments is not necessary,” says Studts.

When it comes to monoclonal antibodies, MacDonald says that viral clearance processes are “widely known and understood,” with most platforms being developed for effective and orthogonal approaches to inactivating and removing viruses. “The A-Mab case study is a good example of supporting the platform approach to viral clearance (3). We are always mindful that as new data emerge, process development (PD) scientists may need to evolve their strategies. For a long time, high-pressure operation of nanofilters was considered ‘worst case.’ In recent years, data have come out justifying the opposite. In response, we have started testing our lowest operating pressure as part of the viral clearance validation package. Pressure excursions, including what may happen during product recovery buffer flushes, have sometimes been shown to lead to viral breakthrough during validation studies. We have adopted a risk-mitigation strategy in response: we no longer buffer flush our nanofilters to recover the remaining product, essentially taking a yield loss on the step to ensure a quality product. We feel that this approach specifically addresses QbD,” MacDonald states.

Conclusion

Despite the complex nature of biologics, Studts believes that QbD can be applied effectively in the development of downstream processes. “With an effective data and knowledge management system and the appropriate processes and experience, the exercise can be straightforward due to the large amounts of data that already exist on the structure-function relationships for classical large-molecule drugs (e.g., monoclonal antibodies). However, more novel structures and platforms will require that the industry expand its knowledge for these types of molecules,” says Studts.

References

1. ICH, Q8(R2) Pharmaceutical Development (ICH, August 2009).
2. L.X. Yu et al., “Understanding Pharmaceutical Quality by Design,” AAPS Journal 16 (4) (2014).
3. CMC Biotech Working Group, A-Mab: A Case Study in Bioprocess Development (2009).
