We used natural language processing to analyse clinical trial protocols in the pharmaceutical industry. Pharmaceutical companies write a protocol of up to 200 pages at the planning stage of a clinical trial, and our model 'reads' the document and outputs a number of complexity metrics.
When a pharmaceutical company develops a drug, it needs to pass through several phases of clinical trials before it can be approved by regulators.
Before the trial is run, the drug developer writes a document called a protocol. This contains key information about the trial: how long it will run for, what the risks to participants are, what kind of treatment is being investigated, and so on.
The problem is that each protocol is up to 200 pages long and the structure can vary.
For the German pharma company Boehringer Ingelheim, we developed and trained a deep learning tool using natural language processing (NLP) to predict more than 50 output variables from a clinical trial protocol. This allows pharma companies and regulators to analyse and quantify large numbers of clinical trial protocols, enabling more accurate cost estimation.
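To illustrate the basic idea of 'reading' a protocol and emitting complexity metrics, here is a minimal rule-based sketch. The production system is a deep learning NLP model predicting 50+ variables; the function name and the particular metrics below are hypothetical and only illustrate the input/output shape.

```python
# Hypothetical sketch: take raw protocol text, return complexity metrics.
# The real model is a trained deep learning system; these rules are
# purely illustrative.
import re

def complexity_metrics(text: str) -> dict:
    """Return a few illustrative complexity metrics for a protocol."""
    words = re.findall(r"[A-Za-z']+", text)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        # A crude proxy for schedule complexity: how often visits are mentioned
        "mentions_of_visits": len(re.findall(r"\bvisits?\b", text, re.I)),
    }

snippet = ("Subjects will attend a screening visit. Each subject then "
           "completes six treatment visits over 12 weeks.")
print(complexity_metrics(snippet))
```

In practice the raw text would come from a converted PDF and the metrics would be learned rather than hand-written, but the overall shape (document in, structured metrics out) is the same.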
The technique can be extended to other industries where large unstructured or semi-structured documents are the norm.
If you have a problem of this nature, please get in touch and we will be glad to discuss it.
AI has great potential to revolutionise many aspects of the pharmaceutical industry, from pre-clinical stages such as in silico drug discovery through to clinical trials and aftermarket monitoring of key opinion leaders (KOLs). At Fast Data Science we are at the forefront of AI in pharma and have worked on projects in the pre-clinical, clinical and KOL stages of the drug development lifecycle. Read more here about how researchers are using AI in the pharmaceutical industry. We have primarily focused on NLP projects in the pharmaceutical industry but have also worked on more general data science projects such as complexity and risk estimation.
At Fast Data Science we worked on a natural language processing project for a pharmaceutical company which needed to predict the risk of clinical trials ending uninformatively. We developed a web-based tool which allowed a non-technical user to drag and drop a PDF file of a clinical trial protocol. The tool converted the PDF to raw text and extracted a number of key properties of the trial, such as the number of subjects, location, pathology, presence of a statistical analysis plan (SAP), effect estimate, and simulation for sample size determination. These properties were fed into a risk model which rated the trial as low, medium or high risk, and produced an easily exportable PDF report for users to share with colleagues.
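The pipeline above can be sketched in a few lines. The real tool used a trained risk model over many features; the function names, regexes and thresholds below are hypothetical and stand in for the extraction and scoring stages only.

```python
# Illustrative sketch of the pipeline: extract key properties from
# protocol text, then map them to a coarse low/medium/high risk label.
# All rules and thresholds here are hypothetical, not the production model.
import re

def extract_properties(text: str) -> dict:
    """Pull a few of the properties mentioned above out of raw protocol text."""
    m = re.search(r"(\d[\d,]*)\s+(?:subjects|participants|patients)", text, re.I)
    return {
        "num_subjects": int(m.group(1).replace(",", "")) if m else None,
        "has_sap": bool(re.search(r"statistical analysis plan|\bSAP\b", text)),
        "has_effect_estimate": bool(re.search(r"effect (?:size|estimate)", text, re.I)),
        "has_simulation": bool(re.search(r"simulat\w+", text, re.I)),
    }

def risk_rating(props: dict) -> str:
    """Toy scoring rule: each missing protective feature adds one risk point."""
    score = 0
    if props["num_subjects"] is None or props["num_subjects"] < 100:
        score += 1
    for key in ("has_sap", "has_effect_estimate", "has_simulation"):
        if not props[key]:
            score += 1
    return ["low", "low", "medium", "medium", "high"][score]

text = ("We will enrol 2,500 subjects. A statistical analysis plan (SAP) "
        "is included; the effect size was estimated and the sample size "
        "was determined by simulation.")
props = extract_properties(text)
print(props, risk_rating(props))
```

In the deployed tool the text came from a dragged-and-dropped PDF, the extraction was done with NLP rather than regular expressions, and the rating fed a trained risk model rather than a point score, but the stages are the same.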
Novartis is using machine learning to predict which untested compounds are likely to be biologically active and worth investigating in vitro.
The team used a supervised learning approach, teaching their system to recognise the effects of treatment, such as changes in a cell's shape. They trained their network by showing it images of cells treated with known compounds.

"Machine learning is poised to accelerate a number of critical steps, and we think it could speed up discovery and development for many of our projects." – Jeremy Jenkins, Head of Informatics for Chemical Biology and Therapeutics at the Novartis Institutes for BioMedical Research (NIBR)

Verge Genomics is using AI to predict the effect of new treatments for Alzheimer's patients. The company has built one of the largest databases of brain tissue sequences in the world, containing tissue from more than 1,000 human brains.
Verge is seen as a pioneer in AI for drug discovery, after decoding the DNA of patients who have died of neurodegenerative diseases and developing a machine learning model to find genes that could serve as targets for new drugs.

Linguamatics is using natural language processing to enable pharma companies to conduct patent landscape analysis.
Patent landscape reports show a snapshot of the patent situation in a given market. Patent analysis can be a labour-intensive task, but a number of companies now offer AI tools to assist in patent landscaping. In 2017, a report by the Department of Engineering at the University of Cambridge identified artificial intelligence, neural networks, and NLP as priority technologies that should be adopted in patent landscaping to overcome the challenges faced in the field.