Editorial

Toxicogenomics in drug development: a match made in heaven?

Pages 847-849 | Received 18 Feb 2016, Accepted 04 Apr 2016, Published online: 21 Apr 2016

1. Introduction

Compound toxicity accounts for approximately half of all drug failures during development. Currently accepted preclinical studies for drug safety evaluation are time, resource, and animal intensive with often limited clinical predictivity. It is thus highly desirable to develop more efficient and predictive tools for early detection and assessment of potential compound liabilities. The emergence of genomics technologies over the last two decades promised to provide a solution.

The premise of toxicogenomics (TGx) is straightforward: compounds with similar toxicity mechanisms and outcomes should perturb the transcriptome similarly, and these perturbations could serve as more efficient and/or more predictive biomarkers of downstream toxicity outcomes. This concept was reinforced by a number of pioneering studies demonstrating, for example, strong correlations between histopathology, clinical chemistry, and gene expression when different hepatocellular injuries were induced by chemical agents, as reviewed in.[Citation1,Citation2] With such early advances, TGx was poised for earlier detection of a vast variety of drug-related outcomes, covering histopathologies across various organs, carcinogenicity, reproductive toxicity, etc., while deciphering mechanisms of action to create a more predictive and resource-sparing battery of tests for hazard identification, risk assessment, toxicity monitoring, and problem-solving across the drug development pipeline.
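The core premise can be illustrated with a minimal sketch: if two compounds share a toxicity mechanism, their transcriptome-wide log fold-change profiles should correlate strongly. The compound names and expression values below are hypothetical, chosen only to make the similarity comparison concrete.

```python
import math

def profile_similarity(a, b):
    """Pearson correlation between two log2 fold-change expression profiles."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    da = [x - mean_a for x in a]
    db = [y - mean_b for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da)) * math.sqrt(sum(y * y for y in db))
    return num / den

# Hypothetical log2 fold-change profiles over five genes for three compounds.
hepatotoxicant_A = [2.1, -1.3, 0.4, 3.0, -0.8]
hepatotoxicant_B = [1.8, -1.1, 0.6, 2.7, -0.5]   # shares mechanism with A
unrelated_drug   = [-0.2, 1.9, -2.4, 0.1, 1.1]   # different mechanism

print(profile_similarity(hepatotoxicant_A, hepatotoxicant_B))  # high (~0.99)
print(profile_similarity(hepatotoxicant_A, unrelated_drug))    # negative
```

In practice such comparisons span thousands of genes rather than five, and the similarity metric feeds into clustering or classification against phenotypically anchored reference compounds.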

This paradigm shift was anticipated to liberate the pharmaceutical and chemical industries from the current burden of toxicity liabilities, by enabling faster development of clinically safer compounds while reducing cost, infrastructure, and animal requirements.[Citation1–Citation3] TGx and drug discovery/development was expected to be a match made in heaven.

2. The advance of toxicogenomics

Pharmaceutical companies, academic institutions, and government agencies committed swiftly and heavily to this approach.[Citation1,Citation2] Numerous pharmaceutical companies made initial intensive investments in TGx such as the acquisition of Rosetta Inpharmatics by Merck & Co for $620 million in 2001. In 2006, the Predictive Safety Testing Consortium (PSTC) was established to qualify more efficient and accurate biomarkers for nephrotoxicity, hepatotoxicity, vascular injury, muscle toxicity, and carcinogenicity prediction. Additional examples include both government (Food and Drug Administration (FDA) and Environmental Protection Agency (EPA) groups, NIBIO of Japan) and commercial efforts (ICONIX, GeneLogic, Ingenuity, Thomson Reuters) to generate databases of TGx responses to model toxicants in order to facilitate the discovery and application of TGx signatures.

Progress has also been made worldwide to consolidate or link relevant public databases to create information-rich network and analysis tools (such as the National Institute of Environmental Health Sciences’ Comparative Toxicogenomics Database and the EPA’s Distributed Structure-Searchable Toxicity database). Consortia were formed to promote data standardization and consistent data classification (such as the global Adverse Outcome Pathway (AOP) strategy of the Organisation for Economic Co-operation and Development), efficient storage and sharing of large TGx data sets, and best analysis practices.[Citation4]

3. The current state and impact of toxicogenomics

Two decades later, genomics platforms have advanced from partial-genome microarrays to genome-wide sequencing with improved accuracy, cost, turnaround time, and coverage. Many companies are collecting full-genome sequencing data to inform clinical trials. TGx has contributed to the discovery of diagnostic toxicity biomarkers (such as Kim-1 for renal toxicity) that can outperform conventional biomarkers.[Citation5] Numerous publications have demonstrated the promise of TGx for more accurate and/or earlier prediction of toxicity outcomes such as vascular injury and carcinogenicity.[Citation1,Citation6–Citation8] For example, proposed TGx biomarkers of rodent liver carcinogenicity are reported to approach accuracies of 90%, exceeding the carcinogenicity prediction accuracy of FDA-mandated genotoxicity tests.[Citation1,Citation6,Citation7] However, PSTC efforts to collaborate across industry to cross-validate a TGx signature were disappointing.[Citation9] TGx has also contributed to the mechanistic understanding of some toxicity issues, for example, explaining rodent-specific carcinogenesis associated with peroxisome proliferator-activated receptor α agonists.[Citation10]
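Reported accuracies like the ~90% figure above are derived from confusion-matrix counts on a validation set. As a reminder of what such a claim encodes, here is a minimal sketch; the counts below are hypothetical, not taken from any cited study.

```python
def classification_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and overall accuracy from confusion counts."""
    sensitivity = tp / (tp + fn)          # fraction of true carcinogens detected
    specificity = tn / (tn + fp)          # fraction of non-carcinogens cleared
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical validation set of 100 compounds for a TGx carcinogenicity signature.
sens, spec, acc = classification_metrics(tp=45, fp=6, tn=44, fn=5)
print(sens, spec, acc)  # sensitivity 0.9, specificity 0.88, accuracy 0.89
```

Note that overall accuracy can mask an imbalance between sensitivity and specificity, which is why both must be reported when comparing a TGx signature against mandated tests.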

However, TGx has not been widely incorporated into drug discovery pipelines. The rapid growth in TGx publications has declined over the last decade, as did submissions to the Voluntary eXploratory Data Submission program of the FDA.[Citation1,Citation11] Commercial TGx efforts have also disbanded or waned (e.g. ICONIX, GeneLogic). Most importantly, TGx has not been approved to replace or reduce standard regulatory testing, and thus the anticipated reduction in toxicological risk assessment resources has not been realized.

There are multiple reasons for the limited advancement and incorporation of TGx in drug discovery and development pipelines [Citation1,Citation12]:

  1. Ambiguities in phenotypic anchoring and/or compound classification. Accurate phenotypic anchoring of TGx profiles to confirmed outcomes, conventional toxicity parameters (such as histopathology and clinical chemistry), and mechanistic physiology is of paramount importance for predictive biomarker identification. Because toxic effects and associated biomarkers can vary by species, strain, tissue, dose, time, and exposure, the selection of clinically relevant paradigms and compounds for biomarker development and testing is frequently difficult and often differs between research groups. Consensus on toxicity/compound classification, phenotypic anchors, and clinically relevant paradigms would help in this regard.

  2. Limited sensitivity/specificity. TGx cannot be expected to predict all clinical toxicities. Preclinical animal models can be blind to human-specific mechanisms. Further, focal toxicity might not be detectable in broad tissue screening, or relevant tissues might not be accessible for routine analyses. Conversely, TGx biomarkers are not necessarily specific and are often impacted by orthogonal biological signals. Deconvolution of specific toxicity signals requires large data sets as well as advanced multivariate statistics and pathway mining strategies that are now becoming widely available. It is also critical to account for clinically relevant exposure margins, meaningful biological response windows, and fit-for-purpose accuracy in establishing TGx biomarker thresholds and testing paradigms.[Citation13]

  3. Nonoptimal analytical assessment and/or limited statistical power due to resource constraints. The vast majority of reported TGx signatures are derived by overfitting to a small number of compounds and consequently show limited accuracy in prospective studies. This is because TGx biomarker identification can be confounded both by the multiplicity of testing thousands of endpoints and by the inherent multiplicity of drug pharmacology and biological responses. The latter frequently includes on-/off-target biology, adaptive metabolism/resiliency, and/or multiple concurrent toxicities. While these confounders may be overcome by using a large number of compounds for training and testing and by applying appropriate statistical corrections, doing so may not be possible or practical due to resource limitations. While TGx assays have become significantly cheaper and more high-throughput, resources for sufficiently powered animal studies and the availability of an adequate number of phenotypically anchored compounds/models remain limiting factors. Leveraging the expanding landscape of TGx databases (discussed above) will help in this regard.[Citation1,Citation14]

  4. Sponsors’ fear of alternative regulatory interpretation of TGx data. The FDA has clarified [Citation15] that while TGx data generated from regulated studies would be expected in New Drug Application (NDA) submissions, TGx results need not be submitted to open Investigational New Drug (IND) applications unless the data are used to inform clinical study decisions. Concerns persist, however, about the need to submit ambiguous data to regulatory agencies and the potential for unnecessarily conservative interpretations to slow drug development.
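The multiplicity problem raised in point 3 above — testing thousands of transcript endpoints at once — is typically addressed with false discovery rate control. A minimal sketch of the Benjamini–Hochberg procedure follows; the p-values are hypothetical, standing in for gene-level tests against a toxicity endpoint.

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Return indices of hypotheses rejected at false discovery rate alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # ascending p-value
    k_max = 0
    for rank, i in enumerate(order, start=1):
        # Reject up to the largest rank whose p-value clears the BH threshold.
        if p_values[i] <= alpha * rank / m:
            k_max = rank
    return sorted(order[:k_max])

# Hypothetical p-values from testing 8 genes against a toxicity endpoint.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(benjamini_hochberg(pvals))  # [0, 1]
```

With naive per-test thresholds of 0.05, five of the eight genes would be called significant; the FDR-controlled procedure retains only two, illustrating why small TGx training sets so easily yield overfit signatures.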

4. Expert opinion

The science underlying TGx is more complex and challenging than initially anticipated. Advancing the field requires improved study designs, accurate compound classification, appropriate phenotypic anchoring, integrated predictive modeling over larger datasets, refinement of AOP understanding, and establishing specific regulatory guidance. Nevertheless, there is optimism for practical and lasting impact of TGx on drug development in the future:

  1. Although accurate TGx signatures might not be feasible for as many different tissues and mechanisms as initially assumed, there are numerous published examples where TGx does add value for predicting certain toxicity outcomes and explaining toxicity mechanisms. In addition, TGx can contribute to accessible toxicity biomarker development (e.g. microRNAs) or help identify molecular initiating events in AOP analyses.

  2. TGx also has utility for guiding development of better in vitro/in vivo humanized models. By cataloging baseline expression of mRNA and changes in response to xenobiotics, TGx can serve to assess the global biological similarity of 2D/3D in vitro organ systems to the in vivo environment. Many complex human cell culture models are under development for toxicity testing (such as HepatoPac, 3D printed cultures, organ-on-a-chip, and patient-derived iPSCs) that can leverage genomics for model characterization, biomarker development, and identification of toxic responses.

While TGx is not yet a substitute for routine regulated toxicity testing, it can inform internal decision-making by achieving reasonably predictive metrics during preclinical development. Further, it can be applied to rapidly derisk compound series with undesirable mechanistic signatures using appropriate cellular models. Opportunities exist for advancing future regulatory applications, such as the proposed modifications to ICH S1 guidance for rodent carcinogenicity testing, which consider the value of data from new biomarkers and technologies to distinguish human-relevant from rodent-specific mechanisms and to guide waivers of 2-year rat studies. In addition, proposed modifications to ICH S5 guidance for reproductive toxicity testing point to the value of in vitro TGx test systems to reduce and defer extensive animal testing. TGx in drug discovery and development may not be the match made in heaven originally postulated, but it still holds promise to be a lasting and useful member of the drug-derisking toolbox.

Declaration of interests

Funding for this work was provided by Merck & Co., Inc. Kenilworth, New Jersey, USA. All authors are currently employed by Merck & Co., Inc. Kenilworth, New Jersey, USA and own stock/stock options in the company. The authors have no other relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript apart from those disclosed.

References

  • Chen M, Zhang M, Borlak J, et al. A decade of toxicogenomic research and its contribution to toxicological science. Toxicol Sci. 2012;130(2):217–228.
  • Afshari CA, Hamadeh HK, Bushel PR. The evolution of bioinformatics in toxicology: advancing toxicogenomics. Toxicol Sci. 2011;120 Suppl 1:S225–S237.
  • Boverhof DR, Zacharewski TR. Toxicogenomics in risk assessment: applications and needs. Toxicol Sci. 2006;89(2):352–360.
  • Hendrickx DM, Boyles RR, Kleinjans JC, et al. Workshop report: identifying opportunities for global integration of toxicogenomics databases, 26–27 June 2013, Research Triangle Park, NC, USA. Arch Toxicol. 2014;88(12):2323–2332.
  • Chiusolo A, Defazio R, Zanetti E, et al. Kidney injury molecule-1 expression in rat proximal tubule after treatment with segment-specific nephrotoxicants: a tool for early screening of potential kidney toxicity. Toxicol Pathol. 2010;38(3):338–345.
  • Waters MD, Jackson M, Lea I. Characterizing and predicting carcinogenicity and mode of action using conventional and toxicogenomics methods. Mutat Res. 2010;705(3):184–200.
  • Brambilla G, Martelli A. Genotoxicity and carcinogenicity studies of analgesics, anti-inflammatory drugs and antipyretics. Pharmacol Res. 2009;60(1):1–17.
  • Dalmas DA, Scicchitano MS, Mullins D, et al. Potential candidate genomic biomarkers of drug induced vascular injury in the rat. Toxicol Appl Pharmacol. 2011;257(2):284–300.
  • Fielden MR, Adai A, Dunn RT, et al. Development and evaluation of a genomic signature for the prediction and mechanistic assessment of nongenotoxic hepatocarcinogens in the rat. Toxicol Sci. 2011;124(1):54–74.
  • McMillian M, Nie AY, Parker JB, et al. Inverse gene expression patterns for macrophage activating hepatotoxicants and peroxisome proliferators in rat liver. Biochem Pharmacol. 2004;67(11):2141–2165.
  • Goodsaid FM, Amur S, Aubrecht J, et al. Voluntary exploratory data submissions to the US FDA and the EMA: experience and impact. Nat Rev Drug Discov. 2010;9(6):435–445.
  • Khan SR, Baghdasarian A, Fahlman RP, et al. Current status and future prospects of toxicogenomics in drug discovery. Drug Discov Today. 2014;19(5):562–578.
  • Thomas RS, Philbert MA, Auerbach SS, et al. Incorporating new technologies into toxicity testing and risk assessment: moving from 21st century vision to a data-driven framework. Toxicol Sci. 2013;136(1):4–18.
  • Sandhu KS, Veeramachaneni V, Yao X, et al. Release of (and lessons learned from mining) a pioneering large toxicogenomics database. Pharmacogenomics. 2015;16(8):779–801.
  • Sistare FD, DeGeorge JJ. Applications of toxicogenomics to nonclinical drug development: regulatory science considerations. In: Mendrick DL, Mattes WB, editors. Methods in molecular biology. Vol. 460. Totowa (NJ): Humana Press; 2008. p. 239–261.
