Putting biotech on the map: The tools advancing peptide mapping

Large molecule drugs are valuable both in the clinic and to biotech companies; however, they present a number of additional manufacturing challenges compared with small molecule therapeutics. Here, Amy Farrell and Jonathan Bones of The National Institute for Bioprocessing Research and Training, and Ken Cook, Suraj Patel, Alexander Schwahn and Jon Bardsley from Thermo Fisher Scientific, reveal how the latest tools are taking the complexity out of peptide mapping workflows.

The biopharmaceutical industry continues to develop protein-based therapeutics on an ever greater scale. From cytokines to growth factors, hormones to monoclonal antibodies, the growth in protein biotherapeutics has been driven by their value in the clinic and to biotech companies.

However, these large molecule drugs present a number of additional manufacturing challenges over conventional small molecule therapeutics. Their greater structural complexity and the requirement for more extensive manufacturing processes mean that robust quality control and assurance are essential to ensure products are safe and effective. As a result, regulatory bodies such as the US Food and Drug Administration (FDA) and European Medicines Agency (EMA) have established rigorous guidelines around the control of biotherapeutic production protocols.

Comprehensive product characterisation is essential to ensure these complex products function as intended and are safe for patients to use. Peptide mapping is commonly used to confirm the molecular structure of peptide therapeutics and determine post-translational modifications and sequence variants. This technique is also used to understand how these products interact within biological systems and identify signature peptides for quantitation. Liquid chromatography coupled with mass spectrometry (LC–MS) has established itself as a powerful tool for these purposes, and ongoing advances in technology mean that the technique is capable of delivering increasingly detailed insights into protein structure.
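
To illustrate the principle behind LC–MS-based peptide mapping, the short Python sketch below matches a set of observed peptide masses against theoretical masses within a mass tolerance expressed in parts per million. The peptide names, masses and tolerance are hypothetical placeholders for illustration only, not values from any study described here.

```python
# Minimal sketch of the matching step in peptide mapping:
# observed peptide masses are compared with theoretical masses
# within a ppm tolerance. All values below are illustrative
# placeholders, not data from this article.

def ppm_error(observed: float, theoretical: float) -> float:
    """Mass error in parts per million."""
    return (observed - theoretical) / theoretical * 1e6

def match_peptides(observed_masses, theoretical, tolerance_ppm=10.0):
    """Return (observed mass, peptide, error) tuples within the ppm tolerance."""
    matches = []
    for obs in observed_masses:
        for peptide, theo in theoretical.items():
            err = ppm_error(obs, theo)
            if abs(err) <= tolerance_ppm:
                matches.append((obs, peptide, err))
    return matches

if __name__ == "__main__":
    # Hypothetical theoretical monoisotopic masses for a few tryptic peptides
    theoretical = {
        "PEPTIDE-01": 1305.6521,
        "PEPTIDE-02": 1872.9014,
        "PEPTIDE-03": 2225.0873,
    }
    observed = [1305.6533, 1872.8970, 990.4512]

    for obs, peptide, err in match_peptides(observed, theoretical):
        print(f"{peptide}: observed {obs:.4f} Da, error {err:+.2f} ppm")
```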

Despite this widespread uptake of peptide mapping, the numerous manual steps required as part of these workflows have proven time-consuming and vulnerable to human error. In response, the latest advances in sample preparation, automation, separation and detection are simplifying peptide mapping workflows and accelerating the collection of reliable and robust characterisation data. Here, we look at how the latest approaches are cutting complexity and redefining what’s possible from peptide mapping workflows.

Enhanced manual proteolysis protocols using digestion kits

Proteolytic digestion plays a key role in peptide mapping. It breaks up the complex architecture of protein therapeutics into bite-sized fragments that can be pieced together to reveal the overall structure. Trypsin is most commonly used for proteolytic digestion due to its high enzymatic specificity. However, despite being well-established in peptide mapping workflows, the in-solution trypsin digestion protocols commonly used for sample preparation are often labour intensive and prone to errors that can compromise reliability. With accurate characterisation fundamental to patient safety, especially in workflows that only employ ultraviolet (UV) detection without confirmation by MS, robust sample preparation and separation methods are essential. Digestion protocols must therefore be reproducible and separation steps must be stable to allow unambiguous peptide identification based on chromatographic retention time.
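
As a simple illustration of trypsin's specificity, the Python sketch below performs an in-silico tryptic digest using the commonly cited cleavage rule: trypsin cuts on the C-terminal side of lysine (K) and arginine (R), except when the next residue is proline (P). The example sequence is an arbitrary placeholder rather than a therapeutic protein, and missed cleavages are ignored for simplicity.

```python
# Simple in-silico tryptic digest: cleave after K or R,
# but not when the following residue is proline.
# The sequence below is an arbitrary placeholder.

def tryptic_digest(sequence: str):
    """Split a protein sequence into fully cleaved tryptic peptides."""
    peptides = []
    start = 0
    for i, residue in enumerate(sequence):
        is_cleavage_site = (
            residue in "KR"
            and (i + 1 >= len(sequence) or sequence[i + 1] != "P")
        )
        if is_cleavage_site:
            peptides.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):  # remaining C-terminal peptide, if any
        peptides.append(sequence[start:])
    return peptides

if __name__ == "__main__":
    demo_sequence = "MKWVTFISLLLLFSSAYSRGVFRRDTHK"  # placeholder sequence
    for peptide in tryptic_digest(demo_sequence):
        print(peptide)
```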

Given the importance of peptide digestion for many biotech workflows, equipment and reagent suppliers have responded by developing kits containing all the relevant solutions required for the robust and reliable digestion of biotherapeutics. These proteolysis kits are capable of providing rapid protein digestion with exceptional reproducibility and sensitivity to deliver high-quality characterisation data. While traditional in-solution digests often take days to complete, some of the latest protein digestion reagents, such as Thermo Fisher Scientific’s SMART Digest kit, typically achieve complete proteolysis in around 60 minutes. When used in combination with the high measurement consistency offered by ultra-high performance liquid chromatography instruments, these workflows provide outstanding levels of measurement reproducibility.

In fact, the resilience of the latest systems even allows individuals with no prior experience of protein digestion techniques to achieve accurate and reliable results. Figure 1 highlights the peptide map obtained through the manual digestion of rituximab, a large monoclonal antibody. The digestions were performed by five individuals, several of whom had not conducted a protein digestion previously. In each case, 5 µL of digest solution was injected for LC–MS without further purification, and the peptides were separated using the gradient method shown in Table 1. The relative standard deviation (RSD) for the 20 peaks in the spectrum is shown in Table 2, and the average RSD across all 20 peaks was 2.74. These impressive findings highlight the measurement robustness that can be achieved using the latest workflow solutions.
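
As a brief illustration of how this kind of reproducibility metric can be derived, the Python sketch below computes the percentage relative standard deviation (RSD) of peak measurements across replicate digests and then averages it over all peaks; the same calculation applies whether the measurements are peak areas or retention times. The values used are hypothetical placeholders, not the data behind Table 2.

```python
# Sketch of a reproducibility calculation: %RSD for each peak across
# replicate digests, then the average across peaks.
# The values below are hypothetical placeholders, not the data in Table 2.

from statistics import mean, stdev

def percent_rsd(values):
    """Relative standard deviation, expressed as a percentage of the mean."""
    return stdev(values) / mean(values) * 100

# Peak areas (arbitrary units) for three peaks, each measured in five digests
peak_areas = {
    "peak_01": [100.2, 97.8, 103.1, 99.5, 101.9],
    "peak_02": [250.6, 247.1, 252.3, 249.0, 251.4],
    "peak_03": [78.4, 80.1, 77.9, 79.2, 78.8],
}

per_peak_rsd = {peak: percent_rsd(areas) for peak, areas in peak_areas.items()}

for peak, rsd in per_peak_rsd.items():
    print(f"{peak}: {rsd:.2f}% RSD")
print(f"average RSD across peaks: {mean(per_peak_rsd.values()):.2f}%")
```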

Automated digestion protocols using magnetic beads

Novel digestion technologies are helping to further reduce the complexity of peptide mapping workflows by minimising the level of human involvement required. Magnetic beads are a proven support medium for many sample preparation and purification processes in life science research, and many of the latest automated systems use this technology to boost the productivity of high-throughput workflows.

Automated systems based on magnetic bead technology minimise the manual handling required for protein digestion and ensure that the reactions involved in these workflows are precisely timed, reducing the possibility of modifications being introduced during sample preparation. As a result, these automated systems can reach levels of reproducibility that extend even beyond the high levels achieved using manual digestion kit protocols. Automated systems such as the Thermo Scientific KingFisher Duo Prime platform, for example, can achieve 1.5 times lower variance in results than manual digestion. This high level of performance can make a considerable difference when more complex proteins are involved.
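
For context, the short sketch below shows one way such a variance comparison between manual and automated digests might be computed from replicate measurements; the figures are hypothetical placeholders rather than results from any evaluation of the KingFisher Duo Prime platform.

```python
# Sketch of comparing result variability between manual and automated
# digestion, using the sample variance of replicate peak areas.
# All numbers are hypothetical placeholders.

from statistics import variance

manual_peak_areas = [100.2, 97.8, 103.1, 99.5, 101.9]      # replicate manual digests
automated_peak_areas = [100.6, 98.2, 102.4, 99.4, 101.4]   # replicate automated digests

manual_var = variance(manual_peak_areas)
automated_var = variance(automated_peak_areas)

print(f"manual variance:    {manual_var:.2f}")
print(f"automated variance: {automated_var:.2f}")
print(f"variance ratio (manual / automated): {manual_var / automated_var:.1f}x")
```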

Towards reliable peptide mapping, every time

To ensure innovative protein therapeutics are effective and safe to use, and reach patients in the shortest possible timeframe, the quality assurance protocols involved in their manufacture must be robust, reliable and efficient. The latest tools for peptide mapping are helping biotech companies optimise their manufacturing workflows by taking the complexity out of protein characterisation. Thanks to powerful proteolytic digestion kits and automated LC–MS systems, these modern techniques are accelerating output and delivering more accurate results. And because these resilient protocols can be operated by individuals of any experience level, human error is minimised and more consistent results can be achieved.
