16.04.2024
9 min read

AI in Clinical Trials: Increase Speed and Efficiency while Cutting Cost and Error Rates

Clinical trials have traditionally been constrained by manual, error-prone, labor-intensive, and costly procedures, consuming a substantial portion of the 10-12 years it takes a drug to move from a lab to the pharmacy shelf. This comprehensive analysis by Daniel Piekarz and Andrey Sorokin highlights how AI applications are helping to speed up and improve the efficiency of each phase of clinical trials, from trial protocol design to patient recruitment and regulatory compliance.


Article by

Andrey Sorokin

The journey of a drug from a lab to the pharmacy shelf typically spans 10-12 years, with clinical trials consuming a substantial portion of this period. Clinical trials have traditionally been constrained by manual, error-prone, labor-intensive, and costly procedures. Research indicates that manual data handling alone can lead to error rates as high as 27.3%. Patient recruitment and retention also present significant hurdles, with 80% of trials failing to meet enrollment deadlines, contributing to delays and increased costs.

The impact of AI on clinical trials is acknowledged by leading industry analysts. We believe DataArt's role is demonstrated by its recognition as a Sample Vendor for the AI in Clinical Development use case in the Gartner® Hype Cycle™ for Life Science Clinical Development, from 2021 to 2023.

The adoption of AI is altering the clinical trials landscape, offering companies new capabilities to efficiently manage trial protocol design, patient recruitment, informed consent, data collection and analysis, safety monitoring, and regulatory compliance. Our analysis will cover each of these stages, showcasing how AI can improve efficiency and optimize processes.

Trial Protocol Design

Trial protocols are essential for guiding trial conduct, addressing key elements such as patient safety, informed consent, and confidentiality agreements, as well as for managing adverse events. Poor study design can adversely impact trial cost, efficiency, and the likelihood of successful outcomes. Machine learning models such as Bidirectional Encoder Representations from Transformers (BERT) and Generative Pre-trained Transformer (GPT), along with Natural Language Processing (NLP) techniques, offer significant improvements to protocol development by automating the extraction and analysis of data from extensive healthcare datasets.


Conventionally, protocol design entails a laborious manual review of copious data, spanning prior research findings, clinical trial results, patient health records, and demographic information. This method is susceptible to errors and inefficiency, often leading to delays and oversights.

Infographic on clinical trial protocol complexities, noting 10-20% failure due to design flaws and 45% amendment rate post-regulatory feedback, highlighting the need for efficiency.

AI models such as BERT can process this data with greater speed and precision, grasping the context and detecting patterns that may elude manual analysis, for example spotting a correlation between certain pre-existing conditions and a drug's efficacy or side effects. Such correlations are vital for establishing accurate inclusion and exclusion criteria for the trial.
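As a toy illustration of the kind of signal such models surface, the link between a pre-existing condition and drug response can be sketched as a simple correlation check. All patient data below is hypothetical, and a production system would work over far richer features extracted by the models named above:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Hypothetical records: 1 = patient has the pre-existing condition,
# paired with an observed drug-response score.
has_condition = [1, 1, 1, 0, 0, 0, 1, 0]
response      = [0.2, 0.3, 0.1, 0.8, 0.9, 0.7, 0.4, 0.6]

r = pearson(has_condition, response)
# A strongly negative r flags the condition as a candidate exclusion criterion.
print(f"correlation: {r:.2f}")
```

A real pipeline would correct for confounders and multiple comparisons before a correlation like this ever informed eligibility criteria; the sketch only shows where such a signal comes from.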

These advancements lead to trial protocols that are more precisely tailored to the particular medication and patient demographics, optimizing the trial workflow and enhancing the probability of success. Additionally, they diminish the risk of costly protocol revisions during the trial, improve the chances for regulatory approval, and accelerate the introduction of new treatments to the market.

The integration of advanced models and frameworks like GPT, T5, XLNet, ELECTRA, DeBERTa, BigBird, and domain-specific models such as BioBERT or ClinicalBERT advances clinical trial protocol design even further. They offer more specialized tools for predictive analyses, structuring research inputs, interpreting complex medical documents, and providing rapid and comprehensive data processing.

While the structure of clinical trial protocols is predominantly standardized, typically up to 68%, they average around 100 pages, demanding significant labor to compile and review. Language Model-based AI technologies can optimize the operational workflow of this phase by tapping into extensive databases of existing protocols from various institutions and labs and producing nuanced drafts that are meticulously adapted to the unique characteristics of each new trial. This innovation represents a paradigm shift in the way clinical trial protocols are developed and refined, introducing a new degree of precision and personalization.

Patient Recruitment

Patient recruitment is traditionally a challenging and time-consuming process that frequently slows clinical trials and inflates their costs; low enrollment rates often lead to trial postponements and cancellations. Furthermore, trials must adhere to rigorous eligibility criteria, while prospective participants frequently have concerns about their safety or the time commitment involved, or may simply be unaware of trials for which they qualify.


Our software firm was approached to develop a machine-learning model aimed at expediting patient recruitment. Employing advanced machine learning techniques, including Named Entity Recognition (NER), dependency parsing, and eligibility criteria extraction via KNN algorithms and ontology mapping, our solution modernized the client’s patient recruitment framework. By efficiently analyzing extensive patient databases, our system identified candidates who precisely matched the trial's complex inclusion and exclusion criteria. This streamlined the recruitment process while significantly reducing associated costs and accelerating the trial timelines.
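The screening step at the heart of such a system can be sketched very simply once criteria are structured. In the real solution, NER and ontology mapping extract these predicates from free-text protocols; here the patients and rules are hypothetical and already structured:

```python
# Hypothetical, simplified eligibility screen. Real systems derive these
# predicates from protocol text via NER and ontology mapping.
PATIENTS = [
    {"id": "P1", "age": 54, "conditions": {"type2_diabetes"}, "egfr": 75},
    {"id": "P2", "age": 71, "conditions": {"type2_diabetes", "ckd"}, "egfr": 40},
    {"id": "P3", "age": 47, "conditions": set(), "egfr": 90},
]

INCLUSION = [
    lambda p: 18 <= p["age"] <= 65,                   # adult, under age cap
    lambda p: "type2_diabetes" in p["conditions"],    # target indication
]
EXCLUSION = [
    lambda p: p["egfr"] < 60,                         # impaired renal function
]

def eligible(patient):
    """A patient matches if every inclusion rule holds and no exclusion rule does."""
    return (all(rule(patient) for rule in INCLUSION)
            and not any(rule(patient) for rule in EXCLUSION))

matches = [p["id"] for p in PATIENTS if eligible(p)]
print(matches)  # only P1 passes every rule
```

The value of the ML layer is in populating structures like `PATIENTS` and `INCLUSION` automatically from messy records and protocol prose; the final match itself is deliberately transparent.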

Such predictive analytics and recommendation systems can comb through vast patient eligibility screening databases, identifying suitable candidates through comprehensive analysis of their medical history, health status, and demographic information.

Evidence of this approach’s efficacy was highlighted in a 2023 Nature Digital Medicine journal study, which found that AI-augmented patient recruitment could slash costs by as much as 70% and accelerate clinical trials by up to 40%.

Another critical AI application worth noting is the creation of synthetic cohorts in clinical trials. These cohorts, generated by AI algorithms, emulate real patient groups by analyzing aggregated data from past clinical studies, real-world evidence, and other health-related datasets. Research suggests that the use of synthetic data in trial simulations can reduce recruitment expenses by over 20%.
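A minimal sketch of synthetic-cohort generation, assuming only published per-attribute means and standard deviations are available (the figures below are invented, and sampling attributes independently is a simplification — production generators also model correlations between attributes):

```python
import random

random.seed(7)  # fixed seed for a reproducible illustration

# Hypothetical aggregate statistics from past trials: (mean, std dev).
AGGREGATES = {"age": (62.0, 8.0), "systolic_bp": (135.0, 12.0), "hba1c": (7.4, 0.9)}

def synthetic_cohort(n, aggregates):
    """Sample n synthetic patients whose attributes follow the
    published per-attribute distributions."""
    return [
        {attr: random.gauss(mu, sigma) for attr, (mu, sigma) in aggregates.items()}
        for _ in range(n)
    ]

cohort = synthetic_cohort(500, AGGREGATES)
mean_age = sum(p["age"] for p in cohort) / len(cohort)
print(f"synthetic mean age: {mean_age:.1f}")  # close to the published 62.0
```

Such cohorts can serve as external comparator arms or for trial simulation, which is where the cited recruitment savings come from.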

Informed Consent

The informed consent process is a critical step in clinical trials, ensuring that participants are fully aware of the trial's scope and risks.

Filling out Informed Consent Forms (ICFs) is a process often marred by the need for frequent revisions due to protocol and regulatory changes, creating additional administrative work and inefficiency. To address this challenge, DataArt’s AI Lab engineered a generative AI solution to automate the ICF workflow. The solution automatically fills in necessary details within a general ICF template by drawing relevant data from study protocols, including the drug name, study title, and protocol number. This automation ensures the precise and efficient creation of ICFs tailored to each clinical trial while reducing the risk of human error.
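The template-filling core of such a workflow can be sketched with a plain string template. In the described solution, a generative model extracts the field values from the study protocol; here the fields, the drug name, and the protocol number are all invented for illustration:

```python
from string import Template

# Hypothetical ICF fragment; real templates run to many pages.
ICF_TEMPLATE = Template(
    "INFORMED CONSENT FORM\n"
    "Study: $study_title\n"
    "Protocol No.: $protocol_number\n"
    "Investigational Drug: $drug_name\n"
)

# In practice these values are extracted from the study protocol
# automatically; they are supplied directly here.
protocol_fields = {
    "study_title": "A Phase II Study of Examplinib in Type 2 Diabetes",
    "protocol_number": "EX-204",
    "drug_name": "Examplinib",
}

icf = ICF_TEMPLATE.substitute(protocol_fields)
print(icf)
```

`Template.substitute` raises `KeyError` on any missing field, which is the desirable failure mode here: an incompletely filled consent form should never be generated silently.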

Additionally, the adoption of even basic NLP techniques can simplify complex consent documentation, making it clear and accessible for participants from various backgrounds. This method is instrumental in maximizing participants' understanding of the trial's parameters, potential risks, and benefits.
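One basic NLP technique in this vein is readability scoring, which flags consent passages that need plain-language rewriting. Below is a crude Flesch reading-ease sketch (syllables are estimated by counting vowel groups, which is approximate but adequate for flagging; the example sentences are invented):

```python
import re

def flesch_reading_ease(text):
    """Crude Flesch reading-ease score: higher means easier to read.
    Syllables are approximated as runs of vowels within each word."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(max(1, len(re.findall(r"[aeiouyAEIOUY]+", w))) for w in words)
    n = len(words)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

legalese = ("Participation necessitates comprehensive acknowledgement of "
            "potential pharmacological contraindications.")
plain = "Taking part means you understand the drug may not suit everyone."

# The plain-language version scores markedly higher (easier to read).
print(flesch_reading_ease(plain) > flesch_reading_ease(legalese))
```

In a consent pipeline, passages scoring below a chosen threshold would be routed to an NLP simplification step or a human editor.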

Our team has taken the interactive component of the consent process a step further by deploying AI-powered chatbots and virtual assistants to provide immediate, context-aware responses to participants' questions. These chatbots, powered by generative AI, facilitate more natural and engaging conversations, adding a layer of personalization to the consent experience. They adjust their communication to suit each participant's level of understanding and specific concerns, ensuring that information is conveyed thoroughly and personally.

Furthermore, AI systems support regulatory compliance and maintain audit trails by automating the documentation, updating, and storage of consent forms.

Data Collection and Analysis

Data collection in clinical trials traditionally involves patient self-reports, clinical evaluations, and laboratory tests, complemented by site visits that ensure adherence to the protocol and data quality.

This phase relies heavily on statistical methods to draw conclusions about the effectiveness and safety of the intervention being tested. AI's role in this phase is pivotal. Machine Learning (ML) algorithms outperform traditional statistical techniques in processing extensive, multifaceted datasets. Their superiority at uncovering complex patterns that might be overlooked or undetectable through traditional analysis techniques offers substantial benefits in clinical trials, which often encompass diverse variables, from patient demographics to intricate biological markers.

Furthermore, AI can be utilized to process handwritten notes and multilingual documents. We anticipate that 2024 will be marked by the integration of multi-modal AI in healthcare, which will comprehensively analyze disparate information types, such as text, DICOM images, and blood analysis, among others.

Safety Monitoring

The advent of wearable devices has introduced a new dimension to patient monitoring by capturing a continuous stream of health-related data such as vital signs, physical activity, and biometric markers, offering a more comprehensive and accurate picture of the patient's health status.

The challenge lies in processing the vast volumes of data generated by these devices. This is where cloud computing, in conjunction with AI, becomes invaluable. Platforms like AWS IoT and Azure IoT Hub offer the infrastructure needed to support AI algorithms that can analyze this data in real-time, providing an enhanced layer of safety monitoring.

For instance, they can promptly detect deviations from the expected clinical outcomes, or early signs of adverse reactions, thus enabling timely interventions. This proactive approach to monitoring is a significant enhancement over traditional methods, as it allows for immediate response to potential issues, improving patient safety and the overall trial quality.
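A minimal stand-in for this kind of real-time deviation detection is a rolling z-score over the telemetry stream; the heart-rate values, window size, and threshold below are all illustrative, and a cloud pipeline would run a far richer version of this logic per patient and per vital sign:

```python
from collections import deque
from statistics import mean, stdev

def anomaly_detector(window=20, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away
    from the rolling baseline of recent normal readings."""
    history = deque(maxlen=window)

    def check(value):
        alert = False
        if len(history) >= 5 and stdev(history) > 0:
            z = (value - mean(history)) / stdev(history)
            alert = abs(z) > threshold
        if not alert:          # only learn from readings deemed normal
            history.append(value)
        return alert

    return check

check = anomaly_detector()
stream = [72, 74, 71, 73, 75, 72, 74, 73, 140, 72]  # hypothetical heart rate, bpm
alerts = [hr for hr in stream if check(hr)]
print(alerts)  # the 140 bpm spike is flagged; normal readings pass through
```

Excluding flagged readings from the baseline (the `if not alert` branch) keeps a single extreme spike from inflating the rolling statistics and masking subsequent anomalies.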

Cloud platforms like AWS and Azure enable cost-effective adjustments to resource allocation to match demand fluctuations, increasing resources during high data collection phases and reducing in quieter periods.

Additionally, AI can extend safety monitoring to incorporate a semantic analysis of real-time social media feeds, like Twitter (X), for public sentiment on drugs or procedures. This use of NLP may uncover safety signals and adverse reactions that may not yet be reported through conventional channels, offering a broader safety net beyond traditional monitoring methods.
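At its simplest, such a scan looks for co-mentions of a drug and adverse-event terms. The lexicon, drug name, and posts below are entirely hypothetical, and real pharmacovigilance pipelines use trained NER models and MedDRA term mapping rather than raw keywords:

```python
import re

# Hypothetical adverse-event lexicon and drug name, for illustration only.
AE_TERMS = {"nausea", "rash", "dizziness", "headache"}
DRUG = "examplinib"

def safety_signals(posts):
    """Return posts mentioning the drug together with an adverse-event term."""
    flagged = []
    for post in posts:
        words = set(re.findall(r"[a-z]+", post.lower()))
        if DRUG in words and words & AE_TERMS:
            flagged.append(post)
    return flagged

posts = [
    "Week two on Examplinib and the dizziness is rough",
    "Examplinib works great for me so far",
    "Anyone else get a rash from ibuprofen?",
]
print(len(safety_signals(posts)))  # only the first post co-mentions drug and AE term
```

Flagged posts would feed a semantic-analysis stage to filter sarcasm, negation, and off-topic mentions before anything reaches a safety reviewer.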

Regulatory Submissions

Regulatory submissions are a resource-intensive phase of clinical trials. AI can play a pivotal role here by generating drafts based on predefined templates and regulatory requirements.

The integration of Copilot into the regulatory submission process emerges as a significant trend for 2024, with its potential to streamline operations and significantly improve compliance. For healthcare companies, adopting Copilot is a pivotal moment: it acts as a transformative tool, akin to a dedicated, highly skilled regulatory assistant.

Navigating AI Challenges and Ethics in Clinical Trials

Despite its numerous benefits, the integration of AI into clinical trials is not without its challenges. Ensuring the privacy and security of patient data is paramount: given the sensitive nature of health data, it is essential to establish robust protocols and systems to protect this information from unauthorized access or breaches.

At present, ML models are like a ghost in the machine. There is an element of obscurity in how they process data and reach conclusions, which, although not malevolent, presents a significant challenge. This metaphorical 'ghost' represents the complex, often opaque algorithms that operate within our clinical systems. Our primary task is to demystify these processes and illuminate their inner workings, ensuring that their decision-making is as transparent and understandable as possible. This step is not just a technical necessity; it is an ethical imperative, essential for the advancement and efficacy of AI in clinical trials.

Furthermore, the potential for bias in AI algorithms must be addressed. If the data used to train these algorithms reflects existing biases, it may lead to unfair treatment or exclusion of certain patient populations.

Ethical considerations also arise when making decisions based on AI-generated insights, as it is crucial to strike a balance between the benefits brought by AI and the ethical responsibilities towards patients.

Conclusion

The potential of AI to streamline clinical trials is immense, offering to cut costs, reduce timelines, and improve outcomes. This article has outlined the transformative impact of AI across various stages of the trial process, from protocol design to patient recruitment, data analysis, and regulatory compliance. As AI technology continues to advance, its applications within clinical trials are set to expand, further underlining the need for a strategic and ethical approach to its integration.



Gartner, Hype Cycle for Life Science Clinical Development, 2023, Jeff Smith, 21 July 2023. GARTNER and HYPE CYCLE are registered trademarks of Gartner, Inc. and/or its affiliates and are used herein with permission. All rights reserved. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.