In part two of this blog series about best practices in toxicology, we continue the discussion from the previous post, which described the aspects of drug discovery that can impact safety. Because safety can vary by person and by organ, we also need ways to determine a drug’s potential toxic effects on specific organs and patients via toxicogenomics, pharmacogenomics, and pharmacoepigenomics.
Understanding organ-specific toxicity while reducing the burden on animal testing
Animal testing increases the cost of drug development, and there are persistent ethical concerns about the use of animals in biomedical research. In addition, underlying differences in biological and metabolic pathways between animal models and humans have often resulted in a lack of translatability. These differences manifest as poor drug efficacy in humans and/or adverse events during the clinical stage, both of which significantly contribute to drug attrition.
In vitro and in silico approaches could reduce the need for animal testing and refine experimental design. In a report from a transatlantic think tank on toxicology, the authors suggest an integrated testing strategy, with an emphasis on combining multiple in vitro testing methods to draw conclusions and to model and predict adverse outcomes using a systems biology approach.
Indeed, high-throughput screening has significantly improved our ability to quickly test libraries of compounds at multiple dosing intervals. Moreover, recent advances in stem cell, gene editing, and microfluidics technologies have made it easier to execute 2D, 3D, and cell co-culturing techniques at scale. When combined with the right cell, tissue, or organ assay, toxicity studies can be performed in vitro on a large scale to recreate relevant tissue microenvironments. Collecting data points at this scale would significantly improve the development of more accurate prediction models and provide a mechanistic understanding of the pathways underlying adverse events (adverse outcome pathways). This would help researchers reliably make informed go/no-go decisions when developing new molecular entities (NMEs) or investigating the potential to repurpose drugs.
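To make the kind of per-compound summary a high-throughput screen produces more concrete, the sketch below estimates an IC50 (the concentration at which viability drops to 50% of the untreated control) from dose-response data by log-linear interpolation. The data values, function name, and interpolation choice are all illustrative assumptions, not a description of any specific screening platform.

```python
import math

# Hypothetical dose-response data from an in vitro viability assay:
# concentrations in µM and fractional viability (1.0 = untreated control).
DOSES = [0.1, 1.0, 10.0, 100.0]
VIABILITY = [0.98, 0.90, 0.45, 0.05]

def estimate_ic50(doses, viability):
    """Estimate the IC50 (dose at 50% viability) by log-linear
    interpolation between the two doses that bracket 0.5."""
    for i in range(len(doses) - 1):
        v_lo, v_hi = viability[i], viability[i + 1]
        if v_lo >= 0.5 >= v_hi:
            # Interpolate on a log-dose scale, as is conventional
            # for sigmoidal dose-response curves.
            frac = (v_lo - 0.5) / (v_lo - v_hi)
            log_lo = math.log10(doses[i])
            log_hi = math.log10(doses[i + 1])
            return 10 ** (log_lo + frac * (log_hi - log_lo))
    return None  # 50% viability was never crossed in the tested range

ic50 = estimate_ic50(DOSES, VIABILITY)
```

In practice a screen would fit a full Hill model across replicates rather than interpolate, but the principle is the same: each compound-assay pair is reduced to a small set of potency metrics that downstream prediction models can learn from.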
The personalized medicine approach
Next-generation sequencing technology has advanced at an incredibly fast pace, driving down the cost of whole-genome sequencing and enabling multitudes of comparisons across a broad range of factors. This has ushered in the era of toxicogenomics (and pharmacogenomics), which has provided researchers with large quantities of omics datasets for examining the mechanistic role of a drug or a toxicant in adverse events.
Similarly, there is growing interest in understanding how epigenetics plays a role in determining susceptibility or increased risk for toxicity – a field called pharmacoepigenomics. When the expression of genes involved in the pharmacokinetics and pharmacodynamics of a therapeutic is modulated, drug efficacy and safety can both be affected.
Understanding the disease biology at the patient level could, therefore, have major implications for treatment regimens where a drug and the dose are customized based on known factors contributing to disease. Furthermore, a comprehensive understanding of biomarkers used for monitoring or predicting treatment toxicity could help guide clinicians through this process.
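To give a flavor of how pharmacogenomic information might feed into dose customization, here is a deliberately simplified sketch: it maps a metabolizer phenotype, as might be inferred from genotyping a drug-metabolizing enzyme, to a dose-adjustment factor. The phenotype labels and numeric factors are placeholders for illustration only, not clinical guidance for any real drug.

```python
# Placeholder mapping from metabolizer phenotype to a dose-adjustment factor.
# Real adjustments are drug- and gene-specific (e.g., CYP2D6) and come from
# clinical dosing guidelines, not from illustrative values like these.
DOSE_FACTORS = {
    "poor": 0.5,           # slower clearance -> lower dose to limit toxicity
    "intermediate": 0.75,
    "normal": 1.0,
    "ultrarapid": 1.25,    # faster clearance -> standard dose may underperform
}

def adjusted_dose(standard_dose_mg: float, phenotype: str) -> float:
    """Scale a standard dose by the factor for the given phenotype."""
    try:
        return standard_dose_mg * DOSE_FACTORS[phenotype]
    except KeyError:
        raise ValueError(f"unknown metabolizer phenotype: {phenotype!r}")
```

The design point is simply that a genomic observation becomes one more structured input to the treatment decision; a clinical implementation would layer in the specific drug, comorbidities, and monitored biomarkers discussed above.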
One of the important considerations for understanding both organ-specific toxicity and personalized medicine is the ability to process and manage large amounts of data. This is challenging without the assistance of technology in the form of artificial intelligence and machine learning, the roles of which we’ll explore in part three of this blog series.