Ensuring that analytical instrumentation readily meets the continually evolving requirements of biopharmaceutical research, development and manufacturing is an ongoing challenge.
The rapidly evolving biopharmaceutical industry is already delivering radical new products to treat and/or prevent a range of diseases. There is a healthy pipeline of innovator biologics in development and, as early patents expire, an emerging biosimilars market. The pace of development is relentless, and continues to uncover new and unique challenges in taking a complex biological molecule — usually a protein or protein-based construct — from promising drug candidate to commercially producible therapeutic product.
Collaboration between analytical instrument manufacturers and those at the front line of biopharmaceutical development is vital to optimise and commercialise essential measurement technologies. Analytical techniques that can be used throughout the drug development process, and follow the molecule from preformulation right through to manufacturing support, are especially prized. Their value lies not simply in the insight they provide into the molecule and its behaviour, but also in delivering consistent and directly comparable data that enable the rapid detection of any changes during development and manufacture.
A critical factor that can determine the successful development of a biomolecule as a therapeutic is its stability. Differential scanning calorimetry (DSC) is both the gold-standard technique for stability characterisation and a good exemplar of an analytical technology that is being shaped to more closely meet industry requirements, some aspects of which are examined in greater detail here.
The process of developing a biotherapeutic (Figure 1) demands intensive research, characterisation and optimisation of the candidate molecule and its formulation. Final drug product attributes, such as purity, potency and dosage, are defined during early discovery and development, and are optimised throughout product development.
Selecting an inherently stable protein drug candidate reduces the likelihood of problems during manufacturing. Moreover, the molecule is more likely to resist aggregation and retain functionality during formulation and storage. A protein’s stability relies on its structure, the most basic element of which is its primary (1°) structure, the linear sequence of amino acids forming the polypeptide chain. This chain then builds into a three-dimensional or higher order structure (HOS) that encompasses secondary, tertiary and quaternary forms.
HOS guides protein function, so is studied and monitored throughout the drug development process to understand and manage a drug’s characteristics and deliver a safe, effective product. Proteins are susceptible to both chemical and physical degradation, and the biotherapeutic development process itself imposes multiple stresses. These range from the changes in a molecule’s environment during purification through to production and formulation when heat, chemicals, pH changes, pressure, mixing and high concentrations may all trigger denaturation and/or degradation.
A complete understanding of a protein’s stability throughout the development pathway is essential for successful product commercialisation. Advances in analytical instrumentation, such as those described below for DSC, support the need for end-to-end consistency in biophysical characterisation methodologies.
The advent of biopharmaceuticals has brought major changes in the development, manufacture, storage, testing and delivery of medicines. In small-molecule drug development, the key attributes of purity and potency are measured and controlled using proven and well-established analytical techniques. Defining purity and potency for proteins harvested from cells is much less straightforward. A lack of knowledge about, and suitable characterisation solutions for, novel biological molecules and conjugates remains an issue. The analytical technology needed to address quality assurance and quality control requirements, as well as to deliver the biophysical and biochemical data essential in preformulation and formulation development, continues to evolve as the industry becomes established and matures, and intelligence builds.
Moves to implement Quality by Design (QbD) protocols at early stages of biopharmaceutical development are also demanding of analytical technologies. Central to the implementation of QbD is a structured and rigorous approach to product and process development that provides a detailed understanding of the factors that influence clinical efficacy, and a manufacturing control strategy based on the mitigation of risk. When dealing with biological molecules, the challenges associated with product characterisation, process variability and stability necessitate a great deal of information gathering, so the requirement for tailored analytical instrumentation remains high.
DSC provides a good example of how analytical technology is being advanced to better support the biopharmaceutical development model. Well established in traditional pharmaceutical R&D, DSC has long been used in preformulation for biologicals as a highly discriminating method for assessing stability, informing decisions from candidate selection through to liquid formulation. DSC results have been shown to correlate with long-term stability studies, so predictions about stability and shelf-life are possible even in early-stage development. As a sensitive and reproducible detector of even very small differences in HOS, DSC is also considered to be the best technique for monitoring changes in a biotherapeutic protein’s structure, and its applications now extend into batch-to-batch comparability, biosimilarity and quality control testing. The latest developments in instrumentation and data analysis make it possible to transfer these analytical methods directly into the manufacturing space, offering the potential to check that HOS, a critical quality attribute (CQA) of the biotherapeutic, is maintained during production.
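In practice, DSC tracks the excess heat capacity (Cp) of a protein solution as it is heated, and the apparent melting temperature (Tm) — a widely used stability indicator — is read from the peak of the resulting thermogram. A minimal sketch of that idea, assuming a simple two-state unfolding model to simulate the trace (all function names, parameter values and the model itself are illustrative, not any vendor's implementation):

```python
import math

def simulated_thermogram(tm=72.0, dh=120.0, t_start=40.0, t_end=95.0, step=0.1):
    """Generate (temperature in deg C, excess Cp) pairs from a simple
    two-state unfolding model: dh is the van't Hoff enthalpy in kcal/mol,
    R the gas constant in kcal/(mol*K). Illustrative only."""
    r = 1.987e-3
    points = []
    t = t_start
    while t <= t_end:
        tk = t + 273.15
        # equilibrium constant for folded <-> unfolded at this temperature
        k = math.exp((dh / r) * (1.0 / (tm + 273.15) - 1.0 / tk))
        f = k / (1.0 + k)                       # fraction unfolded
        cp = dh ** 2 / (r * tk ** 2) * f * (1 - f)  # excess heat capacity
        points.append((t, cp))
        t += step
    return points

def find_tm(points):
    """Apparent Tm: the temperature at the Cp maximum of the thermogram."""
    return max(points, key=lambda p: p[1])[0]

data = simulated_thermogram()
print(round(find_tm(data), 1))
```

Running this recovers a Tm very close to the 72 °C used to simulate the data; with real instrument output, the same peak-finding step would be applied to the baseline-subtracted experimental trace.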
Critically, the newest DSC systems are designed specifically for use in regulated environments. They are equipped with all the building blocks for compliance with 21 CFR Part 11 and Annex 11 requirements, including controlled access, electronic signatures, auditable reports and data centralisation capacity. This ensures the high data integrity required by GxP laboratories for regulatory submissions, and paves the way for the use of DSC right through to GMP manufacturing.
With the need to analyse numerous samples, often using limited amounts of material, efficiency and cost-effectiveness are watchwords in biopharmaceutical development. Already less labour-intensive than many alternative spectroscopy-based technologies, DSC is now increasingly automated, both in terms of measurement capacity and data analysis. Sample volume and working concentration are challenges that have also been addressed, with modern systems offering minimal sample consumption and operating at therapeutic concentrations of the biologic, bringing closer the goal of at-line QC testing of CQAs.
Automation has multiple benefits, not least in increasing throughput and reducing the demands on operator time and expertise, but also in allowing improved and non-subjective analysis. Automated and quantitative analysis tools allow the extraction of maximum information from the data produced. Automation improves the sensitivity of DSC when detecting HOS changes in a protein, supports the analysis of biosimilars and, for routine analyses, offers the possibility of setting pass/fail criteria. It helps to generate results that are easily understood and readily reconciled with data from complementary orthogonal techniques. Coupled with all of this is the facility for robust method development and transfer, contributing to increased productivity and greater consistency and applicability throughout the biopharmaceutical development workflow, with method transfer now possible across and between labs and sites.
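The pass/fail idea mentioned above can be illustrated with a toy comparability check between a sample thermogram and a reference, of the general kind automated analysis software might apply. The acceptance criteria here (Tm tolerance, minimum curve-shape similarity) and all names are hypothetical placeholders, not regulatory or vendor values:

```python
import math

def tm_of(curve):
    """Apparent Tm: temperature at the maximum excess heat capacity."""
    return max(curve, key=lambda p: p[1])[0]

def similarity(ref, sample):
    """Normalised overlap (cosine similarity) of two Cp traces sampled at
    the same temperatures; 1.0 means identical shape."""
    num = sum(r[1] * s[1] for r, s in zip(ref, sample))
    den = (sum(r[1] ** 2 for r in ref) ** 0.5) * (sum(s[1] ** 2 for s in sample) ** 0.5)
    return num / den if den else 0.0

def passes(ref, sample, tm_tol=0.5, min_sim=0.98):
    """Pass if the sample Tm is within tolerance of the reference and the
    overall curve shape is sufficiently similar. Criteria are illustrative."""
    return (abs(tm_of(ref) - tm_of(sample)) <= tm_tol
            and similarity(ref, sample) >= min_sim)

# Toy traces: a reference peak at 70 degC and a sample shifted by 3 degC,
# which a batch-comparability check should flag as a failure.
ref = [(t, math.exp(-((t - 70.0) ** 2) / 8.0)) for t in range(40, 96)]
shifted = [(t, math.exp(-((t - 73.0) ** 2) / 8.0)) for t in range(40, 96)]
print(passes(ref, ref), passes(ref, shifted))  # True False
```

In a real workflow the thresholds would be derived from method validation data rather than chosen arbitrarily, and the comparison would run on baseline-corrected instrument output.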
As analytical instruments become more closely aligned with the needs of the biopharmaceutical industry, these same requirements will be met by data analysis and management, with new functionalities enabling the rationalisation of large datasets that have been generated using orthogonal and complementary techniques. This will be a critical step towards the full implementation of the QbD approach. It will enable developers to tap into increasing amounts of directly relevant data, forming the “big picture” of the whole drug development and manufacturing process, and leveraging past failures and successes to sharpen current processes, improve end-to-end quality and increase development efficiency.