Dr Matthew Clark, director of scientific services at Elsevier, discusses what animal testing really tells us about human safety.
Animal testing has been vital to medical research and the understanding of disease for many years. Of the 106 Nobel Prizes awarded in Physiology or Medicine, 94 have involved animal research. However, given continued pressure from governments, societies, and animal welfare groups, life science companies are looking for ways to decrease animal testing. A 2010 directive from the European Union specifically calls on companies to “reduce and refine the use of animals” for scientific purposes.
At the same time, pharmaceutical companies must balance reducing animal testing against ensuring patient safety. Big pharma is currently looking for ways to collect and utilise real-world evidence (RWE) to this end, and a recent study has used big data analysis to find ways of potentially reducing animal testing. This research has shown that, hidden away among millions of published records, there are valuable insights that can allow companies to reduce the amount of animal testing required while maintaining, or even increasing, the safety of medical research.
Big data – more than a one trick pony
The big data study, conducted by Elsevier in conjunction with Bayer, evaluated the ability of animal studies to predict human safety. The aim of the research was to assess the concordance of findings between preclinical animal testing and observations made in human clinical trials, and from this to determine how predictive particular animal tests are of the effects of the same drugs in humans. The study analysed more than 1.6 million adverse events reported for humans, alongside the five animal species most commonly used in FDA and EMA regulatory documents, for over 3,000 approved drugs and formulations.
One of the key findings was that some drug effects are specific to animals, in some cases because of particular physiological differences, as well as broader differences in animal biology. For example, several gallbladder-related issues have been recorded in animals but not in humans. Yet when it comes to cardiac events, such as arrhythmia, there is a high level of consistency between animal and human responses. The study also found that some events reported in animals have never been reported in a human, and some events observed in humans have never been reported in an animal study. For example, cholestatic hepatitis is seen in animal models but has not been reported in humans in any regulatory filing for drug approval, and a cluster of other liver issues is poorly represented in animal models and absent from the animal studies in drug approval documents. To some extent this is because human patients are carefully monitored, so that doses can be adjusted before these events occur.
In order to reduce unnecessary animal testing, it is vital for researchers to know which species are most predictive for each adverse event. Although it has generally been accepted that animals predict human responses, this concordance has never before been investigated at such scale, or in such detail.
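To make the idea of per-species concordance concrete, the sketch below shows one simple way such a comparison could be framed: for each species and adverse event, count how often a drug that produced the event in that species also produced it in humans. The data and the `concordance_by_species` function are illustrative assumptions for this article, not the actual method or data of the Elsevier/Bayer study.

```python
from collections import defaultdict

# Hypothetical observations: (drug, species, event) from animal studies
# and (drug, event) from human clinical reports. Illustrative only.
animal_findings = {
    ("drug_a", "rat", "arrhythmia"),
    ("drug_a", "dog", "arrhythmia"),
    ("drug_b", "dog", "arrhythmia"),
    ("drug_c", "dog", "arrhythmia"),
    ("drug_b", "rat", "gallbladder lesion"),
}
human_findings = {
    ("drug_a", "arrhythmia"),
    ("drug_c", "arrhythmia"),
}

def concordance_by_species(animal, human, event):
    """For one adverse event, estimate per-species predictivity:
    of the drugs where the species showed the event, what fraction
    also showed it in humans."""
    stats = defaultdict(lambda: [0, 0])  # species -> [human hits, animal total]
    for drug, species, ev in animal:
        if ev != event:
            continue
        stats[species][1] += 1
        if (drug, event) in human:
            stats[species][0] += 1
    return {sp: hits / total for sp, (hits, total) in stats.items()}

result = concordance_by_species(animal_findings, human_findings, "arrhythmia")
print(result)  # rat: 1 of 1 drugs concordant; dog: 2 of 3
```

On this toy data, arrhythmia seen in the rat is always mirrored in humans, while the dog signal is concordant for two of three drugs; scaled to 1.6 million adverse event records, the same kind of tabulation is what lets researchers rank species by predictive value for each event.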
The golden goose?
The findings from this study will enable pharmaceutical firms to continue innovating safely and humanely while searching for life-changing therapies. Significantly, safety issues are the second biggest reason, after lack of efficacy, for drugs failing clinical trials and not making it to market. The study also raises the question of which other areas of drug discovery and development have simply been accepted as-is and could benefit from big data analysis. Examining these areas would deepen researchers' knowledge and challenge existing assumptions, especially as data becomes ever more readily available.
Today, thanks to the ease of genomic sequencing, the digitalisation of existing research, and a range of new devices providing useful insights for research, the pharmaceutical industry has access to more data than ever before. Additionally, the industry now has the tools necessary to filter this deluge of information in order to derive meaningful, actionable outcomes. As technology continues to advance, these tools, such as AI and machine learning, are in turn becoming better at filtering and understanding the vast quantities of data available.
This changing environment means the potential advances for the industry are expanding rapidly, allowing for safer and more humane breakthroughs in the future. However, the volume of incorrect, invalid or unsuitable data that could be fed into these systems is expanding too. This is where big pharma needs to ensure it is providing researchers with accurate, high-quality, up-to-date data so they can make valuable contributions to drug discovery and development. Big data can only provide value in drug discovery and development if it meets these standards; otherwise, the pharma industry will find data a barrier rather than a driver of innovation.