White Paper

2016 White Paper on Recent Issues in Bioanalysis: Focus on Biomarker Assay Validation (BAV) (Part 1 – Small Molecules, Peptides and Small Molecule Biomarkers by LCMS)

Pages 2363-2378 | Received 30 Sep 2016, Accepted 30 Sep 2016, Published online: 07 Oct 2016

Abstract

The 2016 10th Workshop on Recent Issues in Bioanalysis (10th WRIB) took place in Orlando, Florida with participation of close to 700 professionals from pharmaceutical/biopharmaceutical companies, biotechnology companies, contract research organizations, and regulatory agencies worldwide. WRIB was once again a 5-day, weeklong event – A Full Immersion Week of Bioanalysis including Biomarkers and Immunogenicity. As usual, it was specifically designed to facilitate sharing, reviewing, discussing and agreeing on approaches to address the most current issues of interest including both small and large molecule analysis involving LCMS, hybrid LBA/LCMS, and LBA approaches, with the focus on biomarkers and immunogenicity. This 2016 White Paper encompasses recommendations emerging from the extensive discussions held during the workshop, and aims to provide the bioanalytical community with key information and practical solutions on the topics and issues addressed, in an effort to enable advances in scientific excellence, improved quality and better regulatory compliance. Due to its length, this white paper is published in three parts. This part (Part 1) discusses the recommendations for small molecules, peptides and small molecule biomarkers by LCMS. Part 2 (hybrid LBA/LCMS and regulatory inputs from major global health authorities) and Part 3 (large molecule bioanalysis using LBA, biomarkers and immunogenicity) will be published in the Bioanalysis journal, issue 23.

Introduction

This year marked the 10th anniversary edition of the Workshop on Recent Issues in Bioanalysis (10th WRIB), which was held in Orlando, Florida from April 18–22, 2016. Nearly 700 professionals from pharmaceutical/biopharmaceutical companies, biotechnology companies, contract research organizations (CROs), and regulatory agencies worldwide were in attendance. In addition to the 3 focused and sequential workshop days, there were 6 advanced training sessions throughout the week providing attendees multiple choices to combine core workshop days and training, thereby maximizing their learning process in many fields of drug development expertise. As with prior WRIBs, this 10th edition was specifically designed to facilitate sharing, reviewing, discussing and agreeing upon scientific and regulatory approaches to address the most recent issues in both small and large molecule bioanalysis, biomarkers, and immunogenicity. The areas of interest included LCMS, hybrid LBA/LCMS as well as LBA/cell-based approaches.

The chairs of the 2016 edition of the WRIB included Dr. Eric Yang (GlaxoSmithKline), Dr. Jan Welink (EMA), Dr. An Song (Genentech), Dr. Fabio Garofolo (Angelini Pharma), Dr. Susan Richards (Sanofi), Dr. Lakshmi Amaravadi (Sanofi) and Dr. Renuka Pillutla (Bristol-Myers Squibb).

As usual, a number of regulatory agency representatives contributed actively to the 10th WRIB, including Dr. Sam Haidar (US FDA), Dr. Nilufer Tampal (US FDA), Dr. John Kadavil (US FDA), Dr. Kara Scheibner (US FDA), Dr. João Pedras-Vasconcelos (US FDA), Dr. Jan Welink (EU EMA), Dr. Ronald Bauer (Austria AGES), Mr. Jason Wakelin-Smith (UK MHRA), Mr. Stephen Vinter (UK MHRA), Dr. Fabrizio Galliccia (Italy AIFA), Mr. Gustavo Mendes Lima Santos (Brazil ANVISA), Dr. Mark Bustard (Health Canada), Dr. Laurent Cocea (Health Canada), Dr. Akiko Ishii-Watabe (Japan MHLW-NIHS), Dr. Yoshiro Saito (Japan MHLW-NIHS) and Ms. Stephanie Croft (WHO).

The entire workshop was designed to complete the drafting of this white paper based on the daily working dinners and open panel discussions among the lecturers, regulators and attendees. Each core workshop day covered a wide range of bioanalytical, biomarker and immunogenicity topics requested by members of the community, and included lectures from industry opinion leaders and regulatory representatives.

As with prior WRIB editions [Citation1–12], a significant number of topics were addressed during the workshop and condensed into a series of relevant recommendations. In this current White Paper, the exchanges, consensus and resulting recommendations on 32 recent issues (‘hot’ topics) in bioanalysis, biomarkers and immunogenicity are presented. These 32 topics are distributed across the following areas:

  • Small molecules, peptides and small molecule biomarkers by LCMS:

    • Method development challenges in bioanalysis (six topics);

    • Bioanalytical regulatory challenges (five topics);

  • Hybrid LBA/LCMS for biotherapeutics, biomarkers and immunogenicity:

    • Biomarker and immunogenicity assays (four topics);

    • PK assays (four topics);

  • Large molecules by LBA and cell-based assays:

    • Immunogenicity (five topics);

    • Biomarkers (four topics);

    • PK assays (four topics).

In addition to the recommendations on the aforementioned topics, an additional section has been provided in the current White Paper that specifically focuses on key inputs from regulatory agencies.

Due to its length, the 2016 edition of this comprehensive White Paper has been divided into three parts for editorial reasons. This publication (Part 1) covers the recommendations for Small Molecules, Peptides and Small Molecule Biomarkers by LCMS. Part 2 (Hybrid LBA/LCMS and Regulatory Agencies’ Inputs) and Part 3 (Large Molecule Bioanalysis using LBA/Cell-based Assays, Biomarkers and Immunogenicity) will be published in the Bioanalysis journal, issue 23.

Discussion topics

Method development challenges in bioanalysis

Main challenges & approaches for successful development & validation of small molecule endogenous biomarker bioanalytical assays using LCMS

What concerns exist when using a SIL analyte as a surrogate analyte for calibration curves (including stability determinations)? When using a SIL analyte, what are the options for MS operations and ratio usage (e.g., must you tune the MS to get equivalent response ratios)? Can endogenous QC samples pooled from the same patient population as the study samples be used in place of ISR? Does HRMS detection offer any advantages for small molecule biomarker assays in compensating for surrogate matrices or analytes? Is the assumption that a parallelism experiment will be performed for small molecule biomarkers correct?

Focus on cutting-edge microsampling techniques & their impact on method development strategies

Should VAMS be included in the ICH S3A for comparison of microsampling methods across species and test materials? How can you overcome the issues of performing sample repeats and ISR/ISS with the use of VAMS in a regulated environment? For small rodents, multiple sample collection could be an issue due to over-bleeding. What are the expectations for sample analysis repeats? ICH S3A mentions a sample volume of 25–30 μL; should it be decreased since VAMS allows for collection of 10 μL samples? Actual samples are directly absorbed on VAMS, so “fresh blood” (i.e., used just after collection) should be used for the validation. Should blood aging and frozen blood effects be studied during method development and validation to overcome the availability issues of “fresh blood”? If aging of fresh blood (or frozen blood) doesn’t impact the results, can its use be accepted for validation and QC preparation?

Overcoming complex/cyclic small peptide method development issues: low CID fragmentation efficiency & metabolism/ADME

What are some challenges for the LCMS user when adopting HRMS for peptide analysis? What are the major method development bottlenecks for complex cyclic peptides? Do survivor scans work well for other molecules besides cyclic peptides, and can poor fragmentation be predicted?

Bioanalysis of samples dosed with complex formulations & impact on method development strategy

Is it acceptable to deliver total drug concentrations only? Is there an interest in knowing what is “free” and “encapsulated”? What level of validation is required for each end point (total versus free versus encapsulated)? What is considered the primary endpoint? What is considered informative? Can we apply scientific validation? Is an indirect (e.g., metabolite) measurement acting as a surrogate for drug released from the formulation acceptable? Is stability of the compound in the presence of the formulation needed? For technologies where the active drug substance is bound to a carrier, how much understanding do we need to have of the loading of the carrier in vivo over time? What interest do regulators have in tracing the fate of the carrier itself?

Investigations on method development problems & solutions when dealing with special/patient population study samples

In studies in patients where the number and type of concomitant medications may be highly variable, how does one ensure assay specificity? Do stable isotope labeled internal standards always compensate for special population issues that impact LCMS based bioanalytical methods? If not, what are the exceptions?

Effect of sample collection on method development & validation

The US FDA’s message is: “…stability evaluations should cover the expected sample handling… including conditions at the clinical site.” The EMA’s message is: “…attention should be paid to the stability of the analyte in the sampled matrix directly after blood sampling…” How realistic is it to maintain the integrity of samples for potential future use for biomarker analysis, especially for early development programs since many factors, such as tissue type, time of collection, containers used, preservatives and other additives, transport means and length of transit time, affect the integrity of the samples and the stability of biomarkers and must be considered at the initial collection stage? Regarding risks and benefits of microsampling collection procedures, is the potential addition of internal standard at the time of collection an option? What about the dilution of samples at the collection site for potential high concentration samples when there is a stability concern at high concentration? Are there regulatory insights in handling high concentration samples with respect to long term stability, freeze-thaw and other relevant validation parameters? Is there any impact of sample collection on bioanalytical data due to blood cell association, clotting, effect of differences in refrigerated centrifuges in different clinical sites, or delays in blood sample processing? If yes, how can they be overcome? Should the procedure for sample processing be taken into consideration during method development and validation? If yes, with what kind of experiments?

Bioanalytical regulatory challenges

In-depth evaluation of small molecule biomarker validation & acceptance criteria: biomarker assays vs. endogenous analyte PK assays: are they the same?

What is the appropriateness of ISR for biomarker studies or would ISS be a more valuable experiment to perform? Comparing LCMS with other biomarker assay technologies can be problematic as the methods often use different matrices, dynamic ranges and reference standards. Is there a need to perform comparison of biomarker data between LCMS and other “gold standard” methods? What correlation experiments should be performed? What if the results are different? What is the best practice when establishing a biomarker assay “endogenous QC” acceptance range? How and when should the range be readjusted? Is it possible to use the BMV PK assay criteria for small molecule biomarkers if the reference standard and isotopically labelled material are available? Should the use of matrix pools from both patient and healthy subjects always be included in small molecule biomarker validation? Why?

Industry best practice on processed batch acceptance; electronic data management & integrity; extract stability; & GCP clinical sample bioanalysis

What should be considered for the evaluation of processed batch acceptance criteria in addition to whole run acceptance criteria? Is it possible to define the criteria for processed batch acceptance (QCs, calibration standards, and matrix blanks)? What is the definition of a batch? Is there any feedback on the application of the industry/regulator recommendations from the 2014 and 2015 White Papers in Bioanalysis on Electronic Data Management? Are these recommendations still valid? What is the current regulatory/industry experience with ELN from inspections (data governance system, data integrity throughout the entire data lifecycle, audit trail, data archiving)? What has been the ELN’s impact on management review and compliance improvement? What is the best practice for experimental design to evaluate extract stability? If extract stability fails to meet acceptance criteria, but reinjection reproducibility is acceptable (in cases where samples are not extracted separately from the standards), are the data impacted? Is the extract stability assessment intended to demonstrate that this phase of handling/storage does not impact the measured results or to assess true stability of the analyte in the extracted matrix? How much responsibility for the overall GCP study conduct should be placed on the bioanalytical laboratory? For non-EU submissions, are bioanalytical laboratories expected to adhere to the principles detailed in the EMA reflection paper? Maintaining compliance with human biological sample laws that differ from country to country is very complex. Are there efforts to standardize human biological sample management rules globally? MHRA is about to release a new guideline on GLP/GCP data integrity – why the heightened focus?

Regulatory impact of uncovering previously undetected metabolites using modern bioanalytical tools

What is the current industry consensus when a bioanalytical method developed on an HRMS instrument is applied in later phase clinical trials? Are methods expected to be redeveloped and validated using “standard” targeted MRM methodology? What are the real concerns with HRMS technology being used retrospectively in the generation of post-acquisition qualitative information? What can the industry do to alleviate these concerns? What is the impact of incomplete metabolism studies of older commercial drugs on PK profile and ADME (i.e., identification of new active/isobaric metabolites, accurate quantification of extremely unstable metabolites, and detection of new toxic metabolites)?

HRMS for small molecule regulatory submissions: a step by step robust BMV protocol for HRMS

Are similar assay validation approaches/procedures adequate between conventional LCMS and HRMS instruments? What are the advantages of using HRMS to verify biomarker selectivity? Are there other approaches used? Can HRMS be part of the standard bioanalytical workflow? Are there any new updates on HRMS BMV recommendations from the 2014 and 2015 White Papers in Bioanalysis on Extraction Windows or Instrument Calibration? Are these recommendations still valid? Has there been any new industry/regulator experience with the submission of studies using HRMS?

Inappropriate use of LCMS integration parameters & influence on data reliability

What parameters should be fixed in validation? What changes are permissible? Is there room for manual integration? What other controls are needed (e.g., documented processes, data reviews)? How do we reconcile the scientific need for adjustment of integration parameters with regulatory concerns?

Discussions, consensus & conclusions

Method development challenges in bioanalysis

Main challenges & approaches for successful development & validation of small molecule endogenous biomarker bioanalytical assays using LCMS

Biomarkers are considered essential tools for drug discovery and development as they can provide an indication of the activity of a drug on the target or disease. Additionally, biomarkers are used to assess study inclusion or exclusion criteria to ensure benefits to patients entering into treatment. Historically, immunoassays of proteins and small chemical moieties have been the main approaches for measuring biomarkers. Initially, LCMS was not prevalent in biomarker determination because many clinical chemistry laboratories did not have the technology. However, with an increased emphasis on biomarkers in early drug development and the improvement of sensitivity and specificity of LCMS technologies, the use of mass spectrometry in biomarker quantitation has increased. HRMS may especially offer an advantage for small molecule biomarker analysis when assay selectivity is of particular concern.

One common challenge in the development of a biomarker assay relates to the presence of endogenous concentrations of the analyte in the matrix of interest. If substantial analyte concentrations are present in that matrix, alternative approaches to the preparation of calibration standards and QC samples must be undertaken. Because LCMS has the ability to differentiate analytes by molecular weight, one solution to this problem is the use of a SIL analyte as a surrogate analyte for calibration curve preparation (and QC sample preparation as applicable). This approach can be preferred over the surrogate matrix approach, since it eliminates concerns about the effect of a surrogate matrix on assay performance when the surrogate matrix does not adequately mimic the matrix of interest. The utility of this approach must be properly evaluated based on the biology of the biomarker and the availability of the reference materials [Citation13], and has previously been demonstrated to be comparable to the use of a surrogate matrix [Citation14]. The SIL analyte approach involves analysis of three forms of the analyte: 1) the native, unlabeled biomarker; 2) a stable isotope labeled form of the biomarker for the standards; and 3) an additional, differently labeled form to use as an internal standard. When selecting deuterium labeled forms, the lack of isotopic exchange must be demonstrated. It was agreed that the use of deuterium labelling should be limited since it can cause chromatographic separation effects. It is recommended to use 15N or 13C labeled analytes if the introduction of these isotopes is synthetically possible. The calibration slope of the SIL analyte and that of the unlabeled analyte might differ, as might the response ratio in neat solution compared with the response ratio in matrix. To mitigate the impact of these differences, the mass spectrometer can be tuned until equivalent response ratios are obtained. However, equivalency may not be required if method performance is adequate, provided that the ratios are assessed for consistency within the assay across different days and instruments.
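
As an illustration, the sketch below reduces the SIL surrogate-analyte approach to its calculation: a calibration curve is built from SIL-analyte/IS response ratios in authentic matrix, and the native biomarker is back-calculated from its own response ratio. All peak areas, concentrations and the 1/x² weighting are hypothetical, and the optional correction factor stands in for whatever native-vs-SIL response difference is characterized during validation.

```python
# Minimal sketch of the SIL surrogate-analyte calibration approach.
# Assumptions (illustrative only): a 13C-labeled analog spiked into
# authentic matrix builds the curve, and a second, differently labeled
# form serves as the internal standard. All peak areas are hypothetical.
import numpy as np

# Calibration: known SIL-analyte concentrations (ng/mL) in pooled matrix
sil_conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 500.0])
# Measured response ratios: SIL-analyte peak area / IS peak area
sil_ratio = np.array([0.021, 0.102, 0.199, 1.01, 2.05, 9.95])

# 1/x^2-weighted linear fit, a common choice for LCMS calibration curves
slope, intercept = np.polyfit(sil_conc, sil_ratio, 1, w=1.0 / sil_conc)

def quantify(native_area, is_area, ratio_correction=1.0):
    """Back-calculate the native (unlabeled) biomarker concentration.

    ratio_correction captures any native-vs-SIL response difference
    characterized during validation (1.0 if the MS was tuned until the
    response ratios were equivalent)."""
    ratio = (native_area / is_area) * ratio_correction
    return (ratio - intercept) / slope

# Study sample with hypothetical peak areas
print(f"native biomarker: {quantify(53000, 26000):.1f} ng/mL")
```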

As with any surrogate assay, it is recommended to prepare and test QC samples containing the native, unlabeled biomarker in natural, unaltered matrix. This can be challenging depending on the availability of quality reference material and of sources of matrix containing various concentrations of the analyte. Whichever approach is chosen to prepare and measure the QCs (use of endogenous analyte, dilution with surrogate matrix, or spiking of synthetic reference standards), the implications of each must be taken into account when designing the validation experiments. Regardless of which approach is used, surrogate matrix or SIL analyte, a parallelism experiment for small molecule biomarkers is recommended [Citation15].
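
A parallelism experiment of the kind recommended above is often designed as a serial dilution of a high-concentration incurred pool with the surrogate matrix, checking that dilution-corrected results agree. The sketch below illustrates that design; the concentrations and the 20% CV criterion are assumptions for illustration, not a regulatory requirement.

```python
# Parallelism sketch: serially dilute a high-concentration incurred pool
# with surrogate matrix and check that dilution-corrected concentrations
# agree. The values and the 20% CV criterion are illustrative only.
import numpy as np

dilution_factors = np.array([1, 2, 4, 8, 16])
measured = np.array([412.0, 205.0, 101.0, 52.5, 27.1])  # back-calculated, ng/mL

corrected = measured * dilution_factors
cv = 100 * corrected.std(ddof=1) / corrected.mean()
print(f"dilution-corrected mean = {corrected.mean():.0f} ng/mL, CV = {cv:.1f}%")
print("parallelism acceptable" if cv <= 20 else "investigate non-parallelism")
```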

It was discussed whether data from these endogenous QC samples could be used in place of additional ISR experiments. The consensus was that analyzing the pooled incurred samples as endogenous QCs should provide a good understanding of data reproducibility between days. This may be sufficient for the ISR needs during the early stages of drug development. However, formal ISR evaluation as used in PK analysis may need to be performed at a later stage of the drug’s development. Health authorities have not stated a position on endogenous QCs replacing or supplementing ISR information.

Current standards for determining small molecule biomarkers by LCMS vastly differ between companies, covering a wide range of method development and validation practices, from discovery-level approaches to full CLIA clinical chemistry laboratory standards. Central to any biomarker assay is the understanding of its biology in order to establish appropriate performance characteristics and acceptance criteria for the assay. For example, if a 200% change in the level of the biomarker is expected, a less accurate and precise assay may suffice than when only a 20% change is expected. Adequate understanding is important for method development and experimental design to demonstrate that the assay is able to answer the questions posed by the study.
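
The 200% versus 20% example can be made concrete with a crude power-style calculation: the expected change must exceed the noise band of the measurement. The sketch below is a simplified illustration under assumed numbers (replicate count, z-based criterion), not a formal statistical design.

```python
# Crude illustration of fit-for-purpose precision: the expected biomarker
# change must exceed the noise band of the measurement. Numbers, n and the
# z-based criterion are assumptions, not a formal statistical design.
import math

def change_detectable(baseline, pct_change, assay_cv, n=3, z=1.96):
    """True if the expected change exceeds the ~95% noise band of the
    difference between two means of n replicate measurements each."""
    delta = baseline * pct_change / 100.0
    noise = z * math.sqrt(2) * (assay_cv / 100.0) * baseline / math.sqrt(n)
    return delta > noise

print(change_detectable(100, pct_change=200, assay_cv=30))  # True: 3x change tolerates a loose assay
print(change_detectable(100, pct_change=20, assay_cv=30))   # False: 20% change needs a tighter assay
```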

Focus on cutting-edge microsampling techniques & their impact on method development strategies

Microsampling is an important technique for preclinical and pediatric studies. It aligns with the 3Rs strategy (Replacement, Reduction, Refinement) in non-clinical studies and produces better quality data, as it allows sampling at multiple time points from the same animal. Additionally, it is more ethical for clinical trials involving children. A novel approach, known as VAMS, has been developed to deliver a dry matrix sample for bioanalysis. When using VAMS, the device absorbs an accurate, fixed volume of blood (10 μL or 20 μL); this has been demonstrated to essentially eliminate volumetric hematocrit effects, one of the major drawbacks of DBS. Blood is directly absorbed on the VAMS device. Standards and quality control samples are prepared by spiking whole blood with the analyte, which is then absorbed on the VAMS and dried at room temperature. VAMS devices are inserted into a 96-well plate for extraction with an appropriate solvent.

Consensus was reached on the suitability of VAMS in a discovery setting according to the 3Rs strategy. The procedure can be easily automated with good recovery. No significant matrix effect or aging effect of samples collected via VAMS was observed. It was agreed that, in order to mimic study samples, validation of these methods should use fresh blood, although it would be useful if frozen blood or preserved blood could be shown to be valid for use as a control matrix.

Recommendations in the literature and guidance documents on microsampling are available [Citation7,Citation10,Citation16], although VAMS is not mentioned specifically. It was proposed that future revisions of the ICH S3A Q&A describe various microsampling methods as examples so that VAMS and other emerging useful microsampling devices will be widely recognized. Consideration should also focus on recommended sample volumes. The recent draft Q&A reached a consensus on volumes equal to or less than 50 μL [Citation16]. However, VAMS can use volumes as low as 10 μL in mouse or pediatric studies (10 μL covers only a single extraction and analysis; if repeat analysis in duplicate and ISR are included, at least 4 tips are needed per time point/subject). With advances in technology, sample volume should be driven by the needs of the study and the quality of the data that can be obtained with the microsampling device and assay.

One additional limitation of using VAMS in a regulated environment is the difficulty of performing sample repeats and ISR. This is indeed a well-known limitation of DBS and other microsampling techniques since, if multiple samples are collected at the same time point, minor differences in collection time can lead to differences in the determined analyte concentration. It was suggested to mix the blood before absorption on VAMS, as in the procedure described by P. Denniff et al. [Citation17], or to perform ISR only once per species to prove data reproducibility.

Overcoming complex/cyclic small peptide method development issues: low CID fragmentation efficiency & metabolism/ADME

Due to observed CID fragmentation inefficiency, developing sensitive LCMS assays for CID-resistant compounds is especially challenging. As an alternative to traditional LCMS, a methodology was presented that preserves the intact analyte ion for quantification by selectively filtering ions while reducing chemical noise. Utilizing a quadrupole-Orbitrap MS, the target ion is selectively isolated while interfering matrix components undergo fragmentation by CID, allowing detection of the analyte’s surviving molecular ion with a much improved signal-to-noise ratio. In this manner, CID affords additional selectivity during high resolution accurate mass analysis by elimination of isobaric interferences, a fundamentally different concept than the traditional approach of monitoring a target analyte’s unique fragment following CID. It is suggested that survivor scans may also be useful for molecules besides cyclic peptides, such as oligonucleotides, steroids, opiates or similar compounds that don’t fragment well. However, it was agreed that there are some challenges for the triple quadrupole MS user when adopting HRMS for peptide analysis. The software may not be optimized for this type of application, and the technical skills needed to optimize the instrument may be lacking. In addition to these detection issues, sample preparation and non-specific binding can also be major method development challenges for complex cyclic peptides. This technology is largely used in discovery [Citation18]; no examples of this approach for regulated bioanalysis were described.

Bioanalysis of samples dosed with complex formulations & impact on method development strategy

Strategies for targeted drug delivery are now an integral part of drug development. Novel drug delivery systems are being investigated as possible approaches to improve the pharmaceutical properties of drug candidates. Using innovative vectors, such as nanoparticles, may improve a drug candidate’s therapeutic index by tailoring the DMPK properties and/or targeting organ localization. Each delivery vehicle presents the bioanalytical scientist with a unique set of challenges, and often there is a requirement not only to understand the drug substance exposure, but also to understand the relationship between the drug and its carrier. Consequently, there has been a greater emphasis on early bioanalytical studies for a better understanding of the PK/PD of how these formulations perform in vivo. It was agreed that there are analytical challenges for the bioanalyst in measuring drug exposure following nanoparticle delivery in the non-clinical setting. These include a lack of understanding of how to apply regulatory guidance acceptance criteria to each analytical endpoint, nanoparticle instability (nanoparticles are designed to leak, and sample handling must be optimized to minimize diffusion), the impact of each component on assay dynamic ranges, and potential interferences caused by drug stability and the impact of metabolites on assay performance. The following strategies may be considered to differentiate the encapsulated from the released drug plasma concentration following nanoparticle drug dosing:

  • The preferred option is solid phase extraction (SPE), which is easy to perform and automate, and has a good sample throughput. However, when using SPE, carefully consider at which stage to add the IS depending on whether encapsulated or free drug is being extracted.

  • Size exclusion techniques may also be considered, but can be time consuming and tend to have reduced recoveries due to drug interactions with the stationary phase.

  • Other approaches, such as ultracentrifugation, can also be time consuming.

  • An alternative bioanalytical strategy is to use a surrogate measurement for the released drug, such as a metabolite.

The qualification of the methods typically uses a tiered approach for early and exploratory studies. Assays for total, encapsulated, and released drug minimally require selectivity, matrix effects, one precision and accuracy batch and one freeze-thaw stability cycle. Released drug assays also require a precision and accuracy experiment using QC samples that contain the nanoparticle along with the compound of interest. Later stage development studies should be supported by fully validated methods.
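
One way to make such a tiered scheme operational is to encode it as a simple configuration, as in the hypothetical sketch below; the stage and endpoint names are illustrative, and actual requirements would be defined per program and stage of development.

```python
# Hypothetical encoding of the tiered qualification described above.
# Names and structure are illustrative; actual requirements are defined
# per program and per stage of development.
TIERED_QUALIFICATION = {
    "early_exploratory": {
        "total_drug":        ["selectivity", "matrix effects",
                              "1 precision & accuracy batch", "1 freeze-thaw cycle"],
        "encapsulated_drug": ["selectivity", "matrix effects",
                              "1 precision & accuracy batch", "1 freeze-thaw cycle"],
        "released_drug":     ["selectivity", "matrix effects",
                              "1 precision & accuracy batch", "1 freeze-thaw cycle",
                              "P&A batch with nanoparticle-containing QCs"],
    },
    # Later stage development studies require fully validated methods
    "late_stage": "full validation (all endpoints)",
}

def required_experiments(stage, endpoint):
    tier = TIERED_QUALIFICATION[stage]
    return tier if isinstance(tier, str) else tier[endpoint]

print(required_experiments("early_exploratory", "released_drug"))
```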

It was agreed that the necessity to develop assays for the measurement of free and encapsulated drug depends on the specific project questions that need to be answered. It was suggested that, when a robust method to measure free drug is not possible, an indirect measurement (e.g., of a metabolite) may act as a surrogate for drug released from the formulation, depending on the metabolic pathway. However, a good understanding of the relevant drug concentration for PK/PD modeling is seen as critical to underpin this approach, and it should be clearly justified. It was advised that a discussion of the study objectives with the regulators during planning should be considered to ensure the correct approach is chosen. The requirement for understanding the stability of the drug in the presence of the nanoparticle formulation was also discussed; it was recommended that the in vitro and in vivo formulation stability assessments be expanded to include these experiments. Regulators indicated that it may be necessary to trace the fate of the carrier itself, since it could be a safety concern. The need to potentially quantify the carrier would depend upon factors including the type or novelty of the carrier.

Investigations on method development problems & solutions when dealing with special/patient population study samples

Method development activities and method validation are typically conducted with control matrices from normal, healthy volunteers receiving no other medications. Although matrix variability is addressed to some degree by utilizing multiple lots of matrix during these activities, these multiple lots are also generally from healthy subjects. However, the bulk of clinical studies take place in patients dissimilar to those from whom control matrix is obtained. Samples obtained from patients may have properties different from those from healthy volunteers. As a result, the performance of bioanalytical methods occasionally varies from that observed during validation once the assay is applied to samples from patients. Bioanalysts need to ensure consistent assay performance regardless of sample origin. It was agreed that there are several factors that affect assay performance in such subjects. With renally impaired patients, higher levels of circulating endogenous matrix components are generally present, contributing to selectivity issues or matrix effects [Citation19]. It is recommended that specific sample preparation approaches such as LLE or SPE be used instead of protein precipitation when a SIL IS is not available. It is also recommended that matrix from a representative renally impaired subject be included during method development and validation if the assay is intended for this purpose.

Disease state can also affect assay performance by suppressing the signal of either the drug, the IS, or both [Citation20]. Long term use of biologic therapeutics may result in the formation of ADA in patients. These ADAs may impact both ligand binding and LCMS assays that utilize immunoaffinity extraction [Citation21]. Circulating targets of a biologic drug may be present at markedly increased concentrations in patients, which may interfere with assays that utilize target-capture as the basis of analyte isolation or analyte quantitation. Matrix from patients should be assessed during method development or validation, and if appropriate antibody and target standards are available, the effect of their addition should also be assessed.

In non-clinical studies, the protein and fat content of maternal matrix from reproducing animals may lead to ionization effects or premature chromatography column failure, which impacts assay ruggedness. It may be necessary to adjust sample preparation steps to remove extra matrix components, and their impact should be determined pre-study. In clinical studies, the impact of different diets may also require consideration, as proteins and fats in matrix sampled at different times in relation to food or drink consumption could result in variability in the assay’s ruggedness.

Finally, depending on the disease state, patients may be taking other medications in addition to the study drug. Despite their selectivity advantages, LCMS assays are not impervious to interference from concomitant medications. Screening of common medications during assay development and validation, either experimentally or by a theoretical assessment of molecular weights and HPLC retention characteristics, is recommended. Furthermore, pre-dose samples from patients in studies should always be requested. In cases where the number and type of concomitant medications may vary widely, it is important to pay special attention to anomalous IS responses or positive matrix signals and to investigate whether IS variability is present. Typically, a SIL IS compensates for matrix effect issues; however, in special populations, a SIL IS may not fully correct for the matrix components that might be present in samples from these patients. It is recommended to avoid heavily deuterated internal standards, as such molecules may be chromatographically resolved from the analyte and may not sufficiently compensate for matrix effects.
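
A theoretical screen of the kind mentioned above can be as simple as comparing co-medication masses against the analyte's precursor window. The sketch below illustrates the idea for a unit-resolution quadrupole; the analyte m/z and the flagged co-medication are hypothetical, while the listed [M+H]+ values are real monoisotopic masses.

```python
# Hypothetical theoretical screen of concomitant medications for potential
# isobaric interference: flag co-meds whose [M+H]+ falls within the
# unit-resolution precursor window of the analyte. The analyte m/z and
# "co-med X" are invented for illustration.
analyte_mz = 310.17  # hypothetical analyte [M+H]+

comeds_mh = {
    "acetaminophen": 152.07,
    "ibuprofen": 207.14,
    "atorvastatin": 559.26,
    "co-med X (hypothetical)": 310.60,
}

for name, mz in comeds_mh.items():
    if abs(mz - analyte_mz) < 0.5:  # typical unit-resolution quadrupole window
        print(f"potential isobaric interference: {name} at m/z {mz}")
```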

Effect of sample collection on method development & validation

In bioanalysis, the measured analyte concentration should reflect the concentration at the time of sample collection. When instability or nonspecific binding has been demonstrated, or particular steps of the sample processing procedure are shown to have specific limits, caution needs to be exercised during collection, processing, shipment and storage of biological samples to prevent over- or under-estimation of the reported result [Citation22–24]. It is important to evaluate stability and storage conditions during method development to demonstrate optimization of the collection procedures. Processing conditions also need to be considered during method development, but may not need to be included in formal validation experiments, provided adequate documentation is available from assay development. Both the FDA and EMA BMV guidance documents require a demonstration of proof that sample concentrations obtained after processing and storage reflect the sample concentrations directly after collection [Citation25,Citation26]. Simple and practical approaches should be considered for sample stabilization at the clinical site, if possible. There are a number of commonly occurring stability issues, all typically irreversible, such as hydrolysis, deamination, or metabolic conversions. The addition of stabilizers can be an effective way to mitigate these issues, provided that bioanalysts properly understand the physicochemical properties of the analyte and the inherent conditions that affect its stability. It is important that volume differences are taken into account if stabilizers are added to the collection tube, and hazardous stabilizers such as concentrated acids and bases should be avoided.
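
The volume correction mentioned above amounts to a simple dilution factor, illustrated in the sketch below with hypothetical volumes and concentrations.

```python
# Illustrative volume correction when a stabilizer is pre-added to the
# collection tube. If calibration standards are not prepared with the same
# stabilizer volume, study-sample results must be scaled back to the neat
# concentration. All volumes and concentrations are hypothetical.
sample_vol_ul = 500.0      # plasma collected into the tube
stabilizer_vol_ul = 25.0   # stabilizer solution pre-loaded in the tube

dilution_factor = (sample_vol_ul + stabilizer_vol_ul) / sample_vol_ul
measured_conc = 96.2       # ng/mL, measured in the stabilized sample

corrected_conc = measured_conc * dilution_factor
print(f"dilution factor {dilution_factor:.3f} -> corrected {corrected_conc:.1f} ng/mL")
```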

The collection technique used in the clinic also plays a very important role in protecting the integrity of the sample from post-collection hemolysis. Any potentially impactful condition for sample collection, such as blood cell association/equilibration, clotting, the effect of differences in refrigerated centrifuges at different clinical sites, and delays in blood sample processing, should be evaluated during development and mitigated by sample processing instructions. Such early due diligence typically prompts the suitable selection of the matrix and sample collection conditions. It is essential that clear communication between the bioanalytical laboratory and the clinical site is established, with unambiguous instructions in the sample collection manual, especially in instances where the collection procedure differs from standard routine procedure. Microsampling collection procedures are particularly useful for non-clinical, pediatric, elderly and critically ill patient studies; however, detailed procedures and training must be in place. It may be necessary to pre-dilute microsamples. Instability of the sample in the capillary or the collection device may also be an issue. The use of these procedures should be appropriately evaluated during method development and subsequently validated. If additional steps are required (e.g., addition of IS, washing of the capillary), attention should be paid to ensuring that the staff executing these steps have the proper skills and training. It may be prudent to execute these steps in the bioanalysis laboratory or in a controlled environment, or to provide clinical sites with prepared tubes containing accurate volumes of additives. Regulators recommended that, in cases where dilutions of high concentration samples may be needed, long-term stability, freeze-thaw stability and the evaluation of other validation parameters at these high concentrations should be determined on a case-by-case basis, based on appropriateness and good science, considering the purpose of the study and the impact. It was agreed that investigations of stability, non-specific binding, or storage become exceptionally important when measuring concentration levels of biomarkers, endogenous components, and biologics. Many factors, such as matrix type, time of collection, containers used, preservatives and other additives, transport means and length of transit time, may affect the quality of the samples and the stability of biomarkers, and must be considered at the initial collection stage. It is recommended to keep a detailed history of how the sample was handled to ensure that the quality of samples is maintained for potential future use for biomarker analysis, especially for early development programs.

Bioanalytical regulatory challenges

In-depth evaluation of small molecule biomarker validation & acceptance criteria: biomarker assays vs. endogenous analyte PK assays: are they the same?

There are a number of publications that provide guidance on how endogenous analyte validations should be approached and the level of validation that is required. The degree of validation is influenced by the purpose of the endogenous analyte data, whether assessing exploratory and safety biomarkers or determining the pharmacokinetics of an endogenous therapy. Endogenous PK sample validation is largely guided by the FDA, EMA and MHLW BMV guidelines [Citation25–27]. Biomarker assays, however, are considered out of scope of these guidance documents. The 2013 draft FDA BMV guidance [Citation28] does address both endogenous compounds and biomarkers. Additionally, since the release of the draft FDA guidance, discussions between industry and the FDA have been ongoing [Citation29]. Guidance is also available in the literature with recommendations pertaining specifically to the validation of biomarker methods [Citation30–32]. It is recommended that, if the small molecule biomarker reference standard material and isotopically labelled material are available, standard BMV PK assay criteria may be applied. Historically, the “gold standard” biomarker assays in biological fluids were considered to be LBAs. However, as technologies evolve and LCMS is being used more often for small molecule biomarkers, a gold standard is becoming more difficult to define. It is now possible to base assay selection decisions on good science rather than simply on the availability of the technology. What is important is that the decision be defensible and that, if multiple platforms are used over drug development, bridging studies be performed as required. Regardless of the technology used, an endogenous QC should be included during method validation in order to demonstrate the suitability of the method. The acceptance criteria for this endogenous QC should be fit-for-purpose and based on the biology and variability of the biomarker. The approach will be data driven, and determined from acquired validation data. Whenever possible, matrix pools from both healthy and patient populations should be included in the validation; exceptions may be acceptable if appropriate supporting data are available. The appropriateness of performing ISR and/or ISS for biomarker studies was discussed. It was agreed that the need for such assessment depends upon the anticipated use of the results. For example, if the results are a pivotal end-point, ISR may be necessary.

Industry best practice on processed batch acceptance; electronic data management and integrity; extract stability; & GCP clinical sample bioanalysis

Processed batch acceptance is evaluated with respect to the draft FDA and EMA BMV guidance documents [Citation26,Citation28] and the opinions of industry, to include whole run (analytical run) acceptance and processed batch acceptance [Citation33]. The topic has been raised in several meetings due to concerns that issues within one batch impacting accuracy and precision may not be caught when relying on total run acceptance. The premise is that multiple batches, processed separately (e.g., by different chemists) but analyzed together, may be evaluated together as one analytical run for the purposes of accepting study samples. The sample preparation process should therefore be closely examined when defining a processed batch. This may be straightforward when, for example, several multiple-well plates are used, but defining a processed batch may be challenging for manual assays that use other pieces of equipment that result in sequential processing of samples (e.g., Turbovaps, centrifuges, manifolds). Although all equipment should be calibrated and maintained to ensure that there are negligible differences between each processing action, preparations that involve several handoffs with samples being batched into smaller groups should be taken into account. The terms ‘batch’ and ‘run’ appeared to be used interchangeably, which stressed the importance of defining a batch or run in an SOP (or, where applicable, in a study plan). For example, the number of blanks, calibration standards, QC samples and subject samples processed (extracted and analyzed) concurrently after the addition of internal standard should be considered when defining a batch. The maximum number of samples to be processed in a batch or run during a study should also be evaluated during validation, thereby establishing the batch size. There should also be a sufficient number of QC samples, interspersed throughout the study samples, to ensure adequate coverage to support the data through critical processing steps. Any criteria set for the run should consider the location of the QCs and their ability to detect a failure during processing steps (e.g., if a run consists of 2 plates processed concurrently and all QCs on one plate fail, then the run fails).
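
As an illustration of run- versus batch-level acceptance, the sketch below applies a simplified version of the common 4-6-15 QC rule and fails the run when any one processed batch has no passing QCs, mirroring the two-plate example above. The helper names, thresholds and QC values are assumptions for illustration, not a prescribed scheme.

```python
# Minimal sketch of run- and batch-level QC acceptance, assuming a
# simplified 4-6-15 rule (at least two-thirds of all QCs within 15% of
# nominal; the per-level 50% requirement is omitted for brevity). A run
# additionally fails if any single processed batch has no passing QCs.
def qc_pass(measured, nominal, tol_pct=15.0):
    return abs(measured - nominal) / nominal * 100 <= tol_pct

def run_acceptance(batches):
    """batches: list of processed batches analyzed as one run; each batch
    is a list of (measured, nominal) QC tuples."""
    all_qcs = [qc for batch in batches for qc in batch]
    overall_ok = sum(qc_pass(m, n) for m, n in all_qcs) >= 2 / 3 * len(all_qcs)
    batch_ok = all(any(qc_pass(m, n) for m, n in batch) for batch in batches)
    return overall_ok and batch_ok

plate1 = [(4.1, 4.0), (39.5, 40.0), (162.0, 160.0)]  # low/mid/high QCs pass
plate2 = [(5.3, 4.0), (49.9, 40.0), (210.0, 160.0)]  # all QCs fail
print(run_acceptance([plate1, plate2]))  # False: batch-level failure fails the run
```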

Over the last decade, the bioanalytical community has been evolving towards the use of ELN because of its potential to improve process efficiency and compliance. However, the introduction of ELN in bioanalytical laboratories has been slow for various reasons, such as the lack of purpose-built bioanalytical ELN and uncertainties regarding how regulators will audit the data [Citation34,Citation35]. The discussions at the 10th WRIB were a continuation of previous discussions in 2014 [Citation8] and 2015 [Citation11], and the recommendations from those previous conferences should still be considered. One key element is the need for data integrity compliance. Recent data integrity guidance documents [Citation36,Citation37] provide examples of current regulatory thinking on the topic, and apply not only to traditional data collection methods but to electronic records as well. It is unclear at this point what impact ELN will have on management review and compliance. Companies and regulatory agencies must still gain experience with ELN in regulated laboratories and during inspections. Regulatory experience with ELN is discussed in Part 2 of this White Paper [Citation38].

Extract stability testing has been widely debated among industry and regulatory experts, including at past WRIB meetings [Citation1,Citation7]. FDA and EMA BMV guidelines currently require the evaluation of extract stability in order to demonstrate that the analyte is stable in the extracted matrix environment through storage and analysis conditions [Citation25,Citation26]. Based on regulator feedback, however, the design of the required experiment varies considerably. It was agreed during this year’s discussions that best practice dictates that stored low and high QC samples be measured against a fresh calibration curve, and that the analyte/IS ratio be used for the assessment. Several attendees also demonstrate reinjection reproducibility; however, this experiment tests a separate hypothesis from the extract stability evaluation. If extract stability fails to meet acceptance criteria but reinjection reproducibility (which involves re-injection of all samples that were processed together) passes, an investigation may be necessary to understand the cause of the issue (e.g., degradation, solvent evaporation). In such cases, an alternative experimental design to determine extract stability may be considered acceptable. This can be the case for methods with complex processing steps. It was stressed that due diligence regarding all rejected data is expected.

EMA released a reflection paper in 2012 [Citation39] discussing the analysis of bioanalytical samples in clinical studies. Traditionally, industry has attempted to follow the GLP regulations for these types of samples, even though they do not fall within the scope of those regulations. This is largely because laboratory concerns are not directly addressed by the GCP guidelines [Citation40], despite falling within their scope. EU regulations place responsibility for the bioanalytical laboratory on the sponsor, who is required to maintain proper awareness of, and oversight over, the interface with the rest of the study team. For non-EU submissions, it is recommended to adhere to the highest regulatory standard required in the company’s jurisdiction. Although there is no current expectation of adherence to the GCP for clinical laboratories guidance, many in the industry do use this document when building their quality systems.

Regulatory impact of uncovering previously undetected metabolites using modern bioanalytical tools

The study of biotransformation is an essential part of the drug development process, providing valuable information on drug-metabolizing enzyme pathways, the pharmacokinetics of the parent drug and its primary metabolites, the identification of chemically reactive metabolites, the potential for drug-drug interactions, genetic polymorphisms and other unwanted effects such as drug-induced toxicity. Before a drug receives regulatory and market approval, it has to be studied in non-clinical and clinical studies to demonstrate its safety and efficacy. Quantification of metabolites has also gained particular attention since the publication of the MIST guidelines [Citation41], whereby preclinical safety testing is now also required for unique human metabolites and human metabolites formed at disproportionately high levels in humans. It was agreed that safety issues with new drugs and their major metabolites are not always detected during clinical trials and may take many years to emerge. There are examples of drugs that were widely used and then withdrawn from the market due to safety concerns that outweighed their pharmacological benefits. A number of these adverse effects were subsequently found to be associated with toxic/reactive metabolites that were unknown or poorly evaluated at the time of the original market approval.

Traditional approaches to metabolite identification studies have involved the isolation and purification of metabolites from biological matrices in vitro (e.g., recombinant enzymes, subcellular fractions, organ systems) and in vivo (animal species and humans), which are then characterized using conventional spectral analysis (UV-, fluorescence-, electrochemical-detection and NMR) and compared to authentic reference standards. Alternatively, predicted metabolites can be synthesized and their identity confirmed by a variety of analytical techniques including TLC, UV spectrometry, GC-MS, and radio-chromatography using LC with offline liquid scintillation counting. A major advance in metabolite identification studies has been the use of mass spectrometry, with improvements in sensitivity, high resolution and accurate mass capabilities, as evidenced by the sequential application of single stage quadrupole instruments, triple quadrupole instruments, ion traps, time-of-flight (TOF), hybrid ion traps (Q-Traps), hybrid TOFs (Q-TOFs), Fourier transform ion cyclotron resonance (FT-ICR) and Orbitrap instruments. In addition to instrument-related tools, a great deal of emphasis has also been placed on in silico tools for metabolite identification, including: (i) computational software that uses knowledge-based tools to predict sites of metabolism; (ii) bioinformatic/cheminformatic databases that combine detailed literature data on drugs (chemical properties and biotransformation) with comprehensive drug target data (i.e., sequence, structure, and pathway) (e.g., DrugBank); and (iii) integrated LCMS data acquisition and post-acquisition data mining software, where full scan MS and product ion spectral data sets can be examined for profiling of all components, including metabolites, present in biological samples [Citation42,Citation43]. In recent years, the application of HRMS instruments has increased, including the accurate mass determination of “all ions” in a biological matrix.
This has resulted in a paradigm shift in quantitative bioanalysis and metabolite identification, providing both quantitative and qualitative information on a parent drug and potential metabolites in a single run. This has great utility in early phase clinical trial drug metabolism, allowing a more comprehensive insight into the complete biotransformation pathway of a drug and the identification of potential inter-individual differences. It can also be used in later phase clinical trials for investigative studies when there appear to be safety issues with marketed drugs and there is no drug-related explanation. The current industry consensus, when a bioanalytical method developed on an HRMS instrument is applied in later phase clinical trials, is that there is no expectation for the method to be redeveloped and validated using “standard” targeted MRM methodology. However, if there is a need to move to another platform, it will need to be validated. The application of HRMS post-acquisition data mining tools is increasing at a dramatic rate, as they allow a scientist to go back to analytical data months or years after the initial chromatographic data collection and obtain quantitative and qualitative information on a compound that was “unknown” at the time of the original analysis. This application has its highest value in early phase clinical studies where metabolism pathways may not have been fully elucidated. Other potential advantages include the investigation of “unusual” internal standard responses or apparent PK outliers (abnormal results), where post-acquisition data mining may reveal metabolites or a co-eluting interference unique to that subject/sample, which may be an attractive feature from a regulatory compliance perspective. The “capture all” approach does have its disadvantages for absolute quantitation, as the stability of the metabolite should be considered and may result in false negatives or false positives if a metabolite chemically degrades. Suppression could also be an issue because the LCMS conditions are likely to be optimized for the original analytes. The main concern from a regulatory perspective is that post-acquisition data mining is not something that was originally planned or validated. There is agreement that it could be useful as an exploratory tool but, if used for later phase studies, would still be subject to inspection. Another regulatory concern is patient consent: in order to conduct post-acquisition data mining on clinical samples, it is essential that a system is in place with broad enough coverage for informed consent in compliance with ICH guidelines. Should new data be discovered, they should be disclosed and discussed with regulators as relevant. In the next 5–10 years, LC-HRMS will continue to be the first choice in metabolite profiling and provide significant advantages in discovery and early development.
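
To make the post-acquisition data mining workflow concrete, the sketch below extracts an accurate-mass ion chromatogram for a suspected metabolite from archived full-scan data. The scan structure, masses and tolerance are hypothetical; a real workflow would read vendor or mzML files rather than in-memory tuples.

```python
# Minimal sketch of post-acquisition data mining on archived full-scan
# HRMS data: extract an ion chromatogram (XIC) for a suspected metabolite's
# accurate mass within a ppm tolerance. Scan structure and masses are
# hypothetical; real workflows would read vendor or mzML files.
import numpy as np

def extract_xic(scans, target_mz, ppm=5.0):
    """scans: iterable of (rt_seconds, mz_array, intensity_array) tuples.
    Returns retention times and summed intensities of ions in the window."""
    tol = target_mz * ppm / 1e6
    rts, intensities = [], []
    for rt, mz, inten in scans:
        mask = np.abs(mz - target_mz) <= tol
        rts.append(rt)
        intensities.append(inten[mask].sum())
    return np.array(rts), np.array(intensities)

# Hypothetical parent [M+H]+ and its suspected hydroxylated metabolite (+O)
parent_mh = 310.1700
metabolite_mh = parent_mh + 15.9949

# Toy data: two scans, the second containing an ion at the metabolite mass
scans = [
    (30.0, np.array([310.1701, 450.2]), np.array([1.2e6, 3.0e4])),
    (42.5, np.array([326.1648, 450.2]), np.array([8.5e5, 2.8e4])),
]
rt, signal = extract_xic(scans, metabolite_mh)
print(list(zip(rt, signal)))
```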

HRMS for small molecule regulatory submissions: a step by step robust BMV protocol for HRMS

Triple quadrupole mass spectrometers continue to be the gold standard instruments for small molecule quantitation, offering excellent selectivity and sensitivity with robust, reproducible data delivery. Historically, HRMS has been applied to biopharmaceutical analysis; however, the increased selectivity and linearity of these instruments are allowing the platform to be more routinely applied to small molecule quantitation [Citation44]. Building on previously published recommendations [Citation10], further examples were presented where HRMS approaches have been utilized to overcome selectivity challenges where triple quadrupole platforms fail to provide the required mass resolution [Citation45]. Extending these examples, a case study was discussed that applied the increased selectivity of HRMS to biomarker support: the technology was successfully used to interrogate an interference and then develop a suitable method to avoid it. An additional application is using HRMS to verify and confirm biomarker selectivity for a given method and then moving to the triple quadrupole once there is confidence that the correct analyte is being measured.

HRMS workflows and protocols have been implemented to develop and validate bioanalytical methodologies successfully. These are based on industry and regulatory accepted approaches from triple quadrupole instrumentation, since similar validation approaches can be used for both, with some additional validation experiments being required for HRMS (e.g., extraction windows). Recommendations regarding method validation using HRMS provided in previous White Papers are confirmed [Citation7,Citation10] and are still valid. Further experience with the technology is required in order to obtain regulatory feedback.

Inappropriate use of LCMS integration parameters & influence on data reliability

Integration of LCMS chromatographic peaks is a critical step in a bioanalytical method, transforming raw ion-count data into peak area or height values that are then used to generate calibration curves and concentration values for study samples. Computerized systems apply a number of parameters to detect and define the peak and baseline, and to provide an area or height measurement for an individual peak. In order to deal with a wide range of analytes, methods, sample types and chromatographic conditions, LCMS data systems typically offer multiple integration options. These allow for setting and adjustment of various integration parameters such as smoothing, noise threshold and bunching factor, plus the option to manually define the baseline for an individual peak. Appropriate selection of these parameters is not only important during method development; some adjustments may also be needed during routine use of assays to allow for minor day-to-day fluctuations in instrument performance. For most peaks, changing integration parameters should not result in major changes in calculated concentrations. However, the potential for abuse exists when results are close to pre-set specifications and minor changes to a few integration parameters could alter concentration values sufficiently to change a batch from failure to acceptance. Manipulation of integration parameters is, therefore, of particular concern to regulatory authorities. The potential for abuse or unintended bias falls primarily into two areas: 1) manipulating calibration or QC sample results to meet batch acceptance or validation criteria when the assay has actually failed, and 2) adjusting study sample results to meet a desired pharmacokinetic outcome, such as bioequivalence. Some current regulatory guidance documents refer to reintegration [Citation25,Citation26], but the comments are limited and there is no firm definition of the term. An additional problem is that changes to integration parameters may not be initially obvious, requiring careful review of software audit trails to detect them. While concerns about the potential for abuse are real, they must be balanced with a clear understanding that appropriate integration and correct baseline settings are critical for accurate measurements. Even the most sophisticated automated integration algorithms may sometimes be challenged by bioanalytical methods measuring very low concentrations of analytes in complex matrices, particularly as methods approach the limits of instrument performance to fully define PK profiles. Some recommendations provided by industry exist in the literature [Citation46] and were discussed. It was agreed that parameters which impact the way data are acquired should be fixed in validation. Permissible changes are to those parameters that can be applied post-acquisition, based on the general principle that integration must be appropriate and consistent. There is a strong preference for identical automated parameters applied to all samples within each run, but it should be recognized that minor adjustments may be required due to variation in system performance (e.g., background noise). In rare instances, manual or automated reintegration of individual samples may be required; when it must be done, both the automated and the manually integrated data should be provided, and the expectation is that these changes will undergo increased scrutiny by the regulators.
Of note, some agencies do not permit manual integration when a fully validated method is used to analyze samples from a pivotal comparative bioavailability study supporting market authorization. The scientific need for adjustment of integration parameters can be reconciled with regulatory concerns by ensuring clarity, transparency (i.e., a priori criteria established in SOPs) and documented explanations; these will be considered by regulators on a fit-for-purpose basis. Chromatographic integration does not occur in isolation: method quality, data review, audit trails, training of bioanalysts and available SOPs with clear guidelines all contribute to adequately integrated samples.
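To make the preceding discussion concrete, the minimal sketch below illustrates how post-acquisition parameters of the kind described above (a smoothing width and a noise threshold; the function, names and values here are hypothetical, and vendor data systems implement considerably more sophisticated algorithms) determine where a peak starts and ends, and therefore the area that is reported:

```python
import numpy as np

def integrate_peak(time, signal, smooth_width=5, noise_threshold=50.0):
    """Integrate the largest peak in a chromatographic trace.

    Illustrative only: real LCMS data systems use vendor-specific
    algorithms, but the same classes of post-acquisition parameters
    (smoothing, noise threshold) drive the reported area in the same way.
    """
    # Smooth the raw ion counts with a simple moving average
    kernel = np.ones(smooth_width) / smooth_width
    smoothed = np.convolve(signal, kernel, mode="same")

    # Locate the apex, then walk outward until the trace drops below
    # the noise threshold to define the peak start and end
    apex = int(np.argmax(smoothed))
    start, end = apex, apex
    while start > 0 and smoothed[start] > noise_threshold:
        start -= 1
    while end < len(smoothed) - 1 and smoothed[end] > noise_threshold:
        end += 1

    # Draw a linear baseline between the peak bounds and integrate the
    # baseline-subtracted signal by the trapezoidal rule
    baseline = np.linspace(smoothed[start], smoothed[end], end - start + 1)
    area = np.trapz(smoothed[start:end + 1] - baseline, time[start:end + 1])
    return area, (time[start], time[end])

# Synthetic example: a Gaussian peak riding on a noisy baseline
t = np.linspace(0.0, 2.0, 401)  # retention time, min
y = 1e4 * np.exp(-((t - 1.0) / 0.05) ** 2) + np.random.normal(100.0, 10.0, t.size)
area, bounds = integrate_peak(t, y, smooth_width=5, noise_threshold=150.0)
```

Changing smooth_width or noise_threshold shifts the peak bounds and baseline, and hence the reported area, without touching the acquired data; this is exactly the class of change that the recommendations above require to be controlled, documented and visible in audit trails.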

Recommendations

Below is a summary of the recommendations made during the 10th WRIB.

Method development challenges in bioanalysis

  • When assaying small molecule biomarkers, it is recommended to use a SIL analyte instead of a surrogate matrix when feasible. When selecting the labeled form, 15N- or 13C-labeled analytes are preferred if the introduction of these isotopes is chemically possible. A consistent response ratio between days within the small molecule biomarker assay is recommended as a minimum standard, while some propose an equivalent response ratio as a higher level of performance (see the first sketch after this list). When using surrogate matrices, pooled incurred samples used as endogenous QCs provide a good understanding of between-day data reproducibility, which suffices for ISR needs during the early stages of drug development; a formal ISR in the traditional design may be performed at a later stage. Regardless of which surrogate approach is used, matrix or analyte, a parallelism experiment for small molecule biomarkers is recommended.

  • It was proposed that future revisions of the ICH S3A Q&A describe various microsampling methods as examples so that volumetric absorptive microsampling (VAMS) and other emerging microsampling devices become widely recognized. Consideration should also be given to recommended sample volumes. VAMS is suitable in a discovery or development setting in line with the 3Rs strategy; the procedure can be easily automated, with good recovery and no significant matrix effect or aging of samples collected on VAMS devices. In order to mimic study samples, validation of these methods should use fresh blood only.

  • When CID fragmentation is inefficient, survivor scans can be used successfully for compounds that do not fragment well by MS, such as cyclic peptides, oligonucleotides, steroids and opiates.

  • Qualification of methods to measure total, released and encapsulated drug plasma concentrations following nanoparticle drug dosing typically uses a tiered approach for early and exploratory studies and a fully validated method for later-stage development endpoints. Whether total, free and encapsulated assays are all necessary depends on the questions that need to be answered (see the mass-balance sketch after this list). Where a robust measurement of free drug cannot be achieved, an indirect measurement (e.g., of a metabolite) can act as a surrogate for drug released from the formulation, depending on the metabolic pathway. A good understanding of the drug concentration relevant for PK/PD modeling is needed, and discussions with regulators during planning are recommended.

  • When developing and validating methods for use with samples from special population studies, it is recommended to include appropriate matrix lots in pre-study method development experiments. Obtaining pre-study samples from special population participants is highly recommended.

  • Stability associated with collection, processing and storage conditions should be considered during method development and validated to demonstrate that collection conditions are optimized. Stabilizers should be used when required to maintain the integrity of the analyte or to prevent conversion of labile metabolites (e.g., acyl glucuronides) into the analyte during sample collection and processing. Bioanalysts should properly understand the stability and solubility of the stabilizers and the physicochemical properties of the analyte or species being stabilized. For practical reasons, hazardous stabilizers such as concentrated acids and bases should be avoided. Any aspect of sample collection that can impact bioanalytical data, such as blood cell association/equilibration, clotting, differences between refrigerated centrifuges at different clinical sites, delays in blood sample processing and non-specific binding (especially for urine samples), should be evaluated during development and mitigated through sample processing instructions. It is essential to establish clear communication between the bioanalytical laboratory and the clinical site, with unambiguous instructions in the sample collection manual. In cases where dilution of high-concentration samples may be needed, long-term stability, freeze-thaw stability and other validation parameters should be evaluated case by case, based on good science and considering the purpose of the study and the potential impact. It is recommended to keep as detailed a history as possible of how samples are handled to ensure that their quality is maintained for potential future use in biomarker analysis, especially for early development programs.
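As referenced in the first bullet above, the following minimal sketch illustrates the SIL surrogate-analyte calculation; all concentrations, responses and the response factor are hypothetical. The labeled standard is spiked into authentic matrix to build the calibration curve, and endogenous concentrations are back-calculated with a correction for the measured response ratio between the unlabeled and labeled forms:

```python
import numpy as np

# Hypothetical SIL surrogate-analyte calibration: the 13C/15N-labeled
# standard is spiked into authentic matrix, so the curve is built in the
# real biological background rather than in a surrogate matrix.
sil_conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])        # ng/mL, spiked SIL analyte
sil_response = np.array([0.021, 0.10, 0.21, 1.02, 2.05])  # peak area ratio vs. IS

slope, intercept = np.polyfit(sil_conc, sil_response, 1)

# Response ratio between unlabeled and labeled analyte, established
# experimentally; a value near 1.0 that stays consistent between days is
# the behavior the recommendation above asks for.
response_factor = 0.98

def endogenous_conc(sample_response):
    """Back-calculate the endogenous (unlabeled) concentration from the
    SIL calibration curve, corrected by the measured response factor."""
    return (sample_response - intercept) / slope / response_factor

print(round(endogenous_conc(0.42), 1))  # ~20.9 ng/mL in this made-up example
```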
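For the nanoparticle bullet above, the three measurements are linked by a simple mass balance (total = free + encapsulated), so a robust measurement of any two constrains the third. A trivial sketch with made-up numbers, assuming no other drug-containing species:

```python
def encapsulated_conc(total, free):
    """Infer the encapsulated drug concentration from total and free
    (unencapsulated) measurements, assuming the mass balance
    total = free + encapsulated and no other drug-containing species."""
    if free > total:
        raise ValueError("free drug cannot exceed total drug")
    return total - free

# Made-up example: 1000 ng/mL total drug and 150 ng/mL free drug
# measured after separating released drug from the intact formulation
print(encapsulated_conc(1000.0, 150.0))  # 850.0 ng/mL encapsulated
```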

Bioanalytical regulatory challenges

  • If reference standard material and isotopically labeled material are available, standard BMV PK assay criteria may be applied to small molecule biomarker methods. An endogenous QC sample should be included during method validation in order to demonstrate the suitability of the biomarker measurement method. The acceptance criteria for the endogenous QC sample should be fit-for-purpose and based on the biology and variability of the biomarker; the approach should be data driven and determined from acquired validation data (see the first sketch after this list). Whenever possible, matrix pools from both healthy and patient populations should be included in the biomarker validation.

  • Processed batch acceptance; electronic data management and integrity; extract stability; and GCP clinical sample bioanalysis:

    • Critical steps, such as but not limited to the addition of internal standard, should be considered when defining a batch in SOPs. QC samples should be in place to ensure the data are adequately supported throughout critical processing steps.

    • Recommendations regarding the use of electronic laboratory notebooks (ELNs) from the 2014 and 2015 White Papers on Recent Issues in Bioanalysis remain valid. Recent data integrity guidelines apply not only to traditional data collection methods but also to electronic records.

    • Best practice for extract stability dictates that stored low and high QC samples be measured against a freshly prepared calibration curve, using the analyte/IS response ratio (see the extract-stability sketch after this list).

    • For non-EU submissions, it is recommended to adhere to the highest regulatory standard required in the company's jurisdiction. Although there is currently no expectation of adherence to the GCP guidance for clinical laboratories, many in the industry use this document when building their quality systems.

  • It was agreed that safety issues with new drugs and their major metabolites are not always detected during clinical trials and may take many years to emerge. HRMS has already demonstrated great utility in drug metabolism work in early phase clinical trials, since it can provide more comprehensive insight into the complete biotransformation pathway of a drug and identify potential inter-individual differences. It can also be used in later phase clinical trials for investigative studies when safety issues appear with marketed drugs and there is no apparent drug-related explanation. Should new data be discovered, they should be disclosed and discussed with regulators if relevant for safety or efficacy. In the next 5–10 years, HRMS is expected to be the first choice for metabolite profiling.

  • Similar validation approaches can be used for both triple quadrupole and HRMS platforms, with some additional validation experiments required for the latter (e.g., mass extraction windows; see the extraction-window sketch after this list). Recommendations regarding method validation using HRMS provided in previous White Papers [Citation7,Citation10] remain valid.

  • Chromatographic and mass spectrometric parameters which impact the way data are acquired should be fixed in validation. Permissible changes are to those parameters that can be applied post-acquisition, based on the general principle that integration must be correct and consistent. Such changes should be infrequent and based on documented procedures.
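For the endogenous QC bullet above, one possible data-driven acceptance scheme is sketched below; the results, the mean ± k·SD construction and the choice of k are all hypothetical assumptions, and actual criteria must be justified fit-for-purpose:

```python
import numpy as np

# Hypothetical inter-day endogenous QC results from validation runs (ng/mL)
validation_results = np.array([10.2, 9.8, 11.1, 10.5, 9.6, 10.9, 10.3, 9.9])

mean = validation_results.mean()
sd = validation_results.std(ddof=1)
cv = 100.0 * sd / mean

# One possible data-driven scheme: accept future endogenous QC results
# falling within mean +/- k * SD observed in validation, with k chosen
# fit-for-purpose to reflect the biomarker's biological and analytical
# variability.
k = 3.0
low, high = mean - k * sd, mean + k * sd
print(f"mean {mean:.2f} ng/mL, CV {cv:.1f}%, acceptance range {low:.2f}-{high:.2f} ng/mL")
```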
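For the extract-stability sub-bullet, a minimal sketch of the evaluation follows; the QC levels, results and the ±15% window are illustrative assumptions, not prescribed criteria:

```python
def percent_bias(measured, nominal):
    """Percent deviation of a stored-extract QC result (back-calculated
    from a freshly prepared calibration curve using the analyte/IS
    response ratio) from its nominal concentration."""
    return 100.0 * (measured - nominal) / nominal

# Hypothetical low/high QC results after extract storage (ng/mL),
# given as (measured, nominal) pairs
stored_qcs = {"low": (2.9, 3.0), "high": (82.1, 80.0)}
for level, (meas, nom) in stored_qcs.items():
    bias = percent_bias(meas, nom)
    ok = abs(bias) <= 15.0  # typical BMV-style window, assumed here
    print(f"{level} QC: bias {bias:+.1f}% -> {'pass' if ok else 'fail'}")
```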
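And for the HRMS bullet, the mass extraction window is a simple function of the mass accuracy tolerance; a small sketch in which the m/z value and the 5 ppm tolerance are arbitrary examples:

```python
def extraction_window(mz, ppm=5.0):
    """Compute the m/z extraction window for a given mass accuracy
    tolerance in ppm, a parameter specific to HRMS that warrants its
    own validation experiments as noted above."""
    delta = mz * ppm / 1e6
    return mz - delta, mz + delta

# e.g., a 5 ppm window around m/z 445.1207
low, high = extraction_window(445.1207, ppm=5.0)
print(f"{low:.4f} - {high:.4f}")  # 445.1185 - 445.1229
```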

Key Terms

Endogenous substance: A molecule, protein or substance that originates from the biological matrix.

Microsampling: Sampling of very small volumes (μL) from animals and humans to assess drug and chemical exposure in a biological matrix.

Nanoparticle: A small particle, between 1 and 100 nm in size, that for the purposes of this article acts as a carrier of a pharmaceutical product.

Acknowledgements

The authors would like to acknowledge the US FDA, EU EMA, UK MHRA, Austria AGES, Italy AIFA, Brazil ANVISA, Health Canada, Japan MHLW and WHO for supporting this workshop; S Richards (Sanofi), L Amaravadi (Sanofi/Genzyme), R Pillutla (Bristol-Myers Squibb), H Birnboeck (F. Hoffmann-La Roche Ltd.) and F Garofolo (Angelini Pharma) for chairing the workshop and/or the white paper discussions; all the workshop attendees and members of the bioanalytical community who sent comments and suggestions to complete this White Paper; W Garofolo, L Lu, X Wang, M Losauro, N Savoie, A Hernandez, K Kalaydjian and J. Conception for their assistance in the organization of the event; and Future Science Group as a trusted partner.

Financial & competing interests disclosure

The authors have no relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript. This includes employment, consultancies, honoraria, stock ownership or options, expert testimony, grants or patents received or pending, or royalties.

No writing assistance was utilized in the production of this manuscript.

References

  • Savoie N, Booth BP, Bradley T et al. 2008 White Paper: the 2nd Calibration and Validation Group Workshop on recent issues in good laboratory practice bioanalysis. Bioanalysis 1(1), 19–30 (2009).
  • Savoie N, Garofolo F, van Amsterdam P et al. 2009 White Paper on recent issues in regulated bioanalysis from the 3rd Calibration and Validation Group workshop. Bioanalysis 2(1), 53–68 (2010).
  • Savoie N, Garofolo F, van Amsterdam P et al. 2010 White Paper on recent issues in regulated bioanalysis and global harmonization of bioanalytical guidance. Bioanalysis 2(12), 1945–1960 (2010).
  • Garofolo F, Rocci M, Dumont I et al. 2011 White Paper on recent issues in bioanalysis and regulatory findings from audits and inspections. Bioanalysis 3(18), 2081–2096 (2011).
  • DeSilva B, Garofolo F, Rocci M et al. 2012 White Paper on recent issues in bioanalysis and alignment of multiple guidelines. Bioanalysis 4(18), 2213–2226 (2012).
  • Stevenson L, Rocci M, Garofolo F et al. 2013 White Paper on recent issues in bioanalysis: “hybrid” – the best of LBA & LC/MS. Bioanalysis 5(23), 2903–2918 (2013).
  • Fluhler E, Hayes R, Garofolo F et al. 2014 White Paper on recent issues in bioanalysis: a full immersion in bioanalysis (Part 1 – small molecules by LCMS). Bioanalysis 6(22), 3039–3049 (2014).
  • Dufield D, Neubert H, Garofolo F et al. 2014 White Paper on recent issues in bioanalysis: a full immersion in bioanalysis (Part 2 – hybrid LBA/LCMS, ELN & regulatory agencies’ input). Bioanalysis 6(23), 3237–3249 (2014).
  • Stevenson L, Amaravadi L, Myler H et al. 2014 White Paper on recent issues in bioanalysis: a full immersion in bioanalysis (Part 3 – LBA and immunogenicity). Bioanalysis 6(24), 3355–3368 (2014).
  • Welink J, Fluhler E, Hughes N et al. 2015 White Paper on recent issues in bioanalysis: focus on new technologies and biomarkers (Part 1 – small molecules by LCMS). Bioanalysis 7(22), 2913–2925 (2015).
  • Ackermann B, Neubert H, Hughes N et al. 2015 White Paper on recent issues in bioanalysis: focus on new technologies and biomarkers (Part 2 – hybrid LBA/LCMS and input from regulatory agencies). Bioanalysis 7(23), 3019–3034 (2015).
  • Amaravadi L, Song A, Myler H et al. 2015 White Paper on recent issues in bioanalysis: focus on new technologies and biomarkers (Part 3 – LBA, biomarkers and immunogenicity). Bioanalysis 7(24), 3107–3124 (2015).
  • Li W, Cohen LH. Quantitation of endogenous analytes in biofluid without a true blank matrix. Anal. Chem. 75(21), 5854–5859 (2003).
  • Jones BR, Schultz GA, Eckstein JA, Ackermann BL. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules. Bioanalysis 4(19), 2343–2356 (2012).
  • Zheng JJ, Shields EE, Snow KJ et al. The utility of stable isotope labeled (SIL) analogues in the bioanalysis of endogenous compounds by LC–MS applied to the study of bile acids in a metabolomics assay. Anal. Biochem. 503, 71–78 (2016).
  • ICH S3A Guideline: note for guidance on toxicokinetics: the assessment of systemic exposure in toxicity studies – questions and answers (2016).
  • Denniff P, Parry S, Dopson W, Spooner N. J. Pharm. Biomed. Anal. 108, 61–69 (2015).
  • Ciccimaro E, Ranasinghe A, D’Arienzo C et al. Strategy to improve the quantitative LC–MS analysis of molecular ions resistant to gas-phase collision induced dissociation: application to disulfide-rich cyclic peptides. Anal. Chem. 86(23), 11523–11527 (2014).
  • Boulieu R, Bleyzac N, Ferry S. Modified high-performance liquid chromatographic method for the determination of ganciclovir in plasma from patients with severe renal impairment. J. Chromatogr. 571(1–2), 331–333 (1991).
  • Anderson MDG, Breidinger SA, Woolf EJ. Effect of disease state on ionization during bioanalysis of MK-7009, a selective HCV NS3/NS4 protease inhibitor, in human plasma and human Tween-treated urine by high-performance liquid chromatography with tandem mass spectrometric detection. J. Chromatogr. B Analyt. Technol. Biomed. Life Sci. 877(11–12), 1047–1056 (2009).
  • Xu Y, Prohn M, Cai X et al. Direct comparison of radioimmunoassay and LC–MS/MS for PK assessment of insulin glargine in clinical development. Bioanalysis 6(24), 3311–3323 (2014).
  • Vaught J. Blood collection, shipment, processing, and storage. Cancer Epidemiol. Biomarkers Prev. 15(9), 1582 (2005).
  • Specimen collection and processing: sources of biological variation. In: Textbook of Clinical Chemistry. Tietz NW (Ed.). WB Saunders, Philadelphia, PA, USA, 478–518 (1986).
  • Allison RW. Sample collection and handling: getting accurate results. Vet. Clin. North Am. Small Anim. Pract. 37(2), 203–219 (2007).
  • US Department of Health and Human Services, US FDA, Center for Drug Evaluation and Research, Center for Veterinary Medicine. Guidance for Industry, Bioanalytical Method Validation. Rockville, MD, USA (2001).
  • EMA, Committee for Medicinal Products for Human Use (CHMP). Guideline on Bioanalytical Method Validation. EMEA/CHMP/EWP/192217/2009. London, UK (2011).
  • Japanese Ministry of Health, Labour and Welfare. Guideline on Bioanalytical Method Validation in Pharmaceutical Development. Japan (2013).
  • US Department of Health and Human Services, US FDA, Center for Drug Evaluation and Research, Center for Veterinary Medicine. Draft Guidance for Industry, Bioanalytical Method Validation. Rockville, MD, USA (2013).
  • Arnold ME, Booth B, King L, Ray C. Workshop report: Crystal City VI – bioanalytical method validation for biomarkers. AAPS J. doi:10.1208/s12248-016-9946-6 (2016) (Epub ahead of print).
  • Lee JW, Devanarayan V, Barrett YC et al. Fit-for-purpose method development and validation for successful biomarker measurement. Pharm. Res. 23(2), 312–328 (2006).
  • Houghton R, Gouty D, Allinson J et al. Recommendations on biomarker bioanalytical method validation by GCC. Bioanalysis 4(20), 2439–2446 (2012).
  • Timmerman P, Herling C, Stoellner D et al. European Bioanalysis Forum recommendation on method establishment and bioanalysis of biomarkers in support of drug development. Bioanalysis 4(15), 1883–1894 (2012).
  • Bower J, Fast D, Garofolo F et al. 8th GCC: consolidated feedback to US FDA on the 2013 draft FDA guidance on bioanalytical method validation. Bioanalysis 6(22), 2957–2963 (2014).
  • Rocci M, Lowes S, Shoup R et al. 7th GCC insights: incurred samples use; fit-for-purpose validation, solution stability, electronic laboratory notebook and hyperlipidemic matrix testing. Bioanalysis 6(20), 2713–2720 (2014).
  • Hayes R, LeLacheur R, Dumont I et al. 9th GCC closed forum: CAPA in regulated bioanalysis; method robustness, biosimilars, preclinical method validation, endogenous biomarkers, whole blood stability, regulatory audit experiences and electronic laboratory notebooks. Bioanalysis 8(6), 487–495 (2016).
  • US Department of Health and Human Services, US FDA, Center for Drug Evaluation and Research, Center for Biologics Evaluation and Research, Center for Veterinary Medicine. Draft Guidance for Industry, Data Integrity and Compliance with CGMP. Rockville, MD, USA (2016).
  • MHRA. GMP Data Integrity Definitions and Guidance for Industry. London, UK (2015).
  • Song A et al. 2016 White Paper on recent issues in bioanalysis: focus on biomarker assay validation (BAV) (Part 2 – hybrid LBA/LCMS and input from regulatory agencies). Bioanalysis doi:10.4155/bio-2016-4988 (2016) (In Press).
  • EMA, Committee for Medicinal Products for Human Use (CHMP). Reflection paper on guidance for laboratories that perform the analysis or evaluation of clinical trial samples. EMA/INS/GCP/532137/2010. London, UK (2012).
  • ICH Guideline for Good Clinical Practice E6(R1) (1996).
  • US Department of Health and Human Services, US FDA, Center for Drug Evaluation and Research. Guidance for Industry, Safety Testing of Drug Metabolites. Rockville, MD, USA (2008).
  • Ma S, Chowdhury SK. Data acquisition and data mining techniques for metabolite identification using LC coupled to high-resolution MS. Bioanalysis 5(10), 1285–1297 (2013).
  • Leonart LP, Gasparetto JC, Pontarolo R. Uncovering previously undetected metabolites using modern bioanalytical tools. Bioanalysis 7(12), 1419–1422 (2015).
  • Bowen CL, Kehler J, Boram S et al. Modify on the fly: triple quad to high resolution in support of a dermal clinical study requiring an ultra low LLOQ. Bioanalysis 8(3), 205–214 (2016).
  • Sturm RM, Jones BR, Mulvana DE, Lowes S. HRMS using a Q-Exactive series mass spectrometer for regulated quantitative bioanalysis: how, when, and why to implement. Bioanalysis 8(16), 1709–1721 (2016).
  • Woolf EJ, McDougall S, Fast DM et al. Small molecule specific run acceptance, specific assay operation, and chromatographic run quality assessment: recommendation for best practices and harmonization from the Global Bioanalysis Consortium harmonization teams. AAPS J. 16(5), 885–893 (2014).
