Clinical research is well on its way to transforming from a paper-driven model to an all-electronic format. During the past year, the clinical trial industry has made considerable progress in adopting technology to streamline data collection, transmission, and monitoring. This article focuses on the top eClinical trends of 2015 and beyond.
Among the latest developments: adoption rates are higher for electronic data capture (EDC), electronic source data (eSource), and eClinical integration, as the focus is now on capturing real-time data as a continuous stream. These trends are partly the result of high-tech devices, sensors, and wearables entering the clinical trial industry, as well as the FDA embracing technology and opening a dialogue with experts on how best to channel this revolution to advance clinical research.
The move from paper to electronic data capture (EDC) in clinical trials has accelerated over the past 10 years in an overall effort to increase data quality and regulatory compliance and to reduce cost. The trend has grown with the need to share real-time data and to support strategic decisions made during the study based on its progress.
According to a newly released report, the healthcare cloud computing market is expected to grow from $3.73 billion in 2015 to $9.48 billion in 2020. The eClinical solutions market, including cloud-based solutions, is projected to grow 14% by 2020, reaching an estimated $6.52 billion, up from $3 billion in 2014.
Different sources of data present many data management challenges, which is why cloud solutions are quickly gaining popularity. Cloud-based technology brings efficiency and cost-effectiveness to managing clinical data, and works for both pharma companies and their Clinical Research Organizations (CROs). Cloud infrastructure scales with study needs and streamlines data handling, improving data quality and allowing for a simple, seamless experience.
According to a recent report by Industry Standard Research (ISR), in 2013 two providers accounted for more than 50% of the EDC services market. This year, five EDC providers accounted for over 50% of the market share, which shows that the market for these services is expanding. The same report also shows that EDC has become standard practice, with approximately 88% of Phase 3 clinical trials initiating use of the technology.
However, as recent Clinical Ink research points out, volumes of paper still delay clinical trials because of reliance on 100% source document verification (SDV). Risk-based monitoring (RBM) is also advancing at a fast pace, yet many sponsors and study teams lack the eClinical solutions needed to generate real-time data. This explains the emergence of the next trend on our list: electronic source documentation (eSource).
Two years ago, in an effort to move away from paper inefficiencies, the U.S. Food and Drug Administration (FDA) issued its final guidance on Electronic Source Data in Clinical Investigations. In this guidance, the agency promotes capturing source data in electronic form to assist in ensuring the reliability, quality, integrity and traceability of data from electronic source to electronic regulatory submission.
According to the FDA's eSource Guidance of 2013: “Electronic source data are data initially recorded in electronic format. They can include information in original records and certified copies of original records of clinical findings, observations, or other activities captured prior to or during a clinical investigation used for reconstructing and evaluating the investigation.” In other words, this is data that is entered directly into a digital format without having to first record it on paper and then transfer it to an electronic data capture solution.
Investigators like the flexibility and versatility of pen and paper, and they perceive computerized systems as a drain on their productivity. The Internet is not always easily accessible from the clinical sites, especially overseas. This is why new eSource solutions are built on tablets that can address these two hurdles. Tablet applications are designed to “look and feel” just like paper, but they offer the efficiency of an electronic document. Unlike case report forms (CRFs), which only capture the data necessary for analysis, eSource documents encompass the much broader goal of providing affirmative documentary evidence related to a subject case history and site audit, and allow for random, ad-hoc comments.
Other benefits of eSource documents include an increase in clinical data quality through validation checks and the removal of unnecessary duplication of data, as well as a reduction in monitoring site visits by eliminating source document verification (SDV) and enabling remote document review. However, despite the many benefits, eSource documents can still be challenged from a GCP compliance perspective.
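The validation checks mentioned above typically run at the point of entry. A minimal sketch of what such checks might look like follows; the field names, ranges, and rules are illustrative assumptions, not drawn from any specific eSource product or protocol:

```python
def validate_entry(record):
    """Return a list of validation errors for a single eSource record."""
    errors = []

    # Required-field check: catches omissions at entry time instead of
    # during later source document verification.
    for field in ("subject_id", "visit_date", "systolic_bp"):
        if record.get(field) in (None, ""):
            errors.append(f"{field}: required field is missing")

    # Range check: flags physiologically implausible values immediately.
    sbp = record.get("systolic_bp")
    if isinstance(sbp, (int, float)) and not 60 <= sbp <= 250:
        errors.append(f"systolic_bp: {sbp} outside plausible range 60-250")

    return errors

clean = {"subject_id": "S-001", "visit_date": "2015-11-02", "systolic_bp": 128}
flawed = {"subject_id": "S-002", "visit_date": "", "systolic_bp": 400}
```

Because errors surface while the patient is still in front of the investigator, queries that would otherwise take weeks of back-and-forth on paper can be resolved on the spot.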
One way for eSource solutions to comply with regulations and guidelines is to make the first data recording on paper, or to keep the source data under the clinical investigator's control by entering it in a medical record or a medical record system. The FDA does not regulate electronic health record systems (EHRs), so they are not subject to 21 CFR Part 11 requirements. Collected data can be entered into eCRFs directly on the condition that it meets all regulations. If clinical data is transferred to an eCRF from an EHR, then that EHR is considered the source. The FDA has made it clear that clinical trial monitors and auditors should have access to verify the data in the EHR.
Electronically collected data can be kept on-site or off-site. On-site storage presents many logistical challenges, such as data corruption or loss, and requirements for SOPs, software validation plans, and restricted access, among others. Data not stored locally must remain under the control of the investigator in order to be compliant. Thin-client architecture, which delivers eSource data straight into the CRO's remote server, can therefore sometimes be GCP non-compliant.
The FDA has made substantial efforts in supporting the use of electronic data solutions in the past couple of years. Among the many benefits, eSourcing helps control fraud as it is far more difficult to fabricate electronic records compared to paper ones.
A 2015 report by Cutting Edge Information shows that adoption of electronic trial master file (eTMF) is expected to reach 88% by 2020. Currently, only about 54% of TMF is electronic-based. The report also indicates that updating paper documents to an electronic platform is more time-consuming than building new TMFs in an electronic system. To overcome the challenges associated with eTMF platforms, many surveyed teams reported executing eTMF strategies in waves. For example, teams may start by building new TMFs into an electronic system as part of a paperless pilot program, before updating paper documents from older studies.
Another recent study done by an eTMF provider also claims a drastic spike in eTMF adoption this year. The Veeva 2015 Paperless TMF Study (Veeva is the provider of Vault eTMF) surveyed 50 international CROs and found that 38% use eTMF applications in comparison to 21% just one year ago—a sudden, 17 percentage point increase, striking especially for a market that traditionally moves more gradually. The same report also claims that, as compared to 2014, greater numbers of CROs now exchange TMF documents with sponsors via eTMF applications (36% today, up from 24%), and are much less reliant on paper (46%, down from 65%).
Recent trends in RBM were discussed at the November 2015 CBI Conference on risk-based trial management. Among the topics were the changing roles of study monitors, as well as the ways RBM is changing how clinical trials are conducted. Even though some CROs and sponsors still aim for 100% SDV, the path to RBM has been forged, with many new technology companies addressing RBM in the last couple of years.
Recent trends indicate that sponsors are comfortable outsourcing source data verification and monitoring visits to CROs, but they prefer to insource clinical operations and data management so they can maintain more real-time control over the study and mitigate problems as they happen.
The FDA recently developed a risk-based site selection tool, which collects NDA data by clinical investigator site and allows the agency to use the stratified data to select sites for GCP inspections.
Experts point out that as more companies execute RBM, clinical trial teams should differentiate between the critical endpoints of the study protocol and monitoring setup, and the clinical data that will reflect the critical safety and efficacy endpoints. The value of RBM cannot be realized if all data is treated equally instead of being classified.
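The classification experts describe can be sketched as a simple mapping from data criticality to monitoring intensity. The categories and verification rates below are hypothetical illustrations of the idea, not a regulatory standard:

```python
# Classify data fields by criticality and vary the source data
# verification (SDV) rate accordingly, rather than verifying 100%
# of everything. All categories and rates here are illustrative.

CRITICALITY = {
    "primary_endpoint": "critical",   # safety/efficacy endpoints
    "adverse_event":    "critical",
    "informed_consent": "critical",
    "concomitant_med":  "standard",
    "demographics":     "low",
}

SDV_RATE = {"critical": 1.00, "standard": 0.25, "low": 0.05}

def sdv_rate(field):
    """Return the fraction of records to source-verify for a field."""
    return SDV_RATE[CRITICALITY.get(field, "standard")]
```

Critical fields keep full verification while low-risk fields are sampled, which is where the monitoring-cost savings of RBM come from.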
Another useful piece of advice is to pay attention to data trails and the changes made, with 100% QC of any modifications. The recent trend toward higher RBM adoption has adapted the monitoring role to keep track of any changes the study team and sites are making. Experts say that is how people, processes, and technology can complement each other when using RBM in a clinical trial.
The FDA defines electronic consent as "using electronic systems and processes that may employ multiple electronic media (e.g., text, graphics, audio, video, podcasts and interactive web sites, biological recognition devices, and card readers) to convey information related to the study and to obtain and document informed consent."
There are several eSignature and eConsent systems currently available on the market. In addition, companies such as Apple are also entering the medical research market with apps and wearable technology. For example, Apple's ResearchKit has a module for building electronic consent forms.
A recent survey of the top 50 pharma companies shows that about 66% of them are either using eConsent or planning to in the near future. The percentage is even higher among the top 25 companies on the list: 88% of them have implemented eConsent, and 100% of the top ten companies have eConsent initiatives in place.
Technology is also placing the patient at the epicenter of clinical research. Fast-developing ePRO technologies allow patients to report clinical data themselves. Modern ePRO systems are designed to maximize the ease with which patients report their observations. Additionally, they integrate better with eClinical systems to capture relevant clinical data and distribute it to clinical teams faster. For example, ePRO systems integrate with electronic data capture (EDC) systems to automatically and securely import clinical data from the ePRO directly into the EDC system. This allows a quicker response time in the case of adverse events, for example.
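The ePRO-to-EDC handoff described above can be sketched as a small transformation step: a raw patient diary entry is mapped to an EDC-ready record, with severe reports flagged for immediate review. The field names and the severity threshold are hypothetical, not taken from any particular ePRO or EDC product:

```python
def epro_to_edc(report):
    """Map a raw ePRO diary entry to an EDC record and flag possible AEs."""
    record = {
        "subject_id": report["patient"],
        "form": "PATIENT_DIARY",
        "pain_score": report["pain"],        # 0-10 patient-reported scale
        "reported_at": report["timestamp"],
    }
    # Automatic flagging is what shortens response time to potential
    # adverse events compared with batch-entering paper diaries.
    record["ae_flag"] = report["pain"] >= 8
    return record

entry = {"patient": "S-017", "pain": 9, "timestamp": "2015-11-02T08:30"}
edc_record = epro_to_edc(entry)
```

In a production integration the same mapping would run automatically on each submission, so the clinical team sees the flagged record within minutes rather than at the next site visit.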
In a recent Tufts survey examining 22 sponsors and CROs, 18 reported having adopted ePRO, which resulted in increased data quality, better patient compliance, and more efficient data collection. Of the surveyed companies, 61% indicated they implemented ePRO in the last five years, 28% in the last 10 years, and 11% over 10 years ago. Experts say the increased emphasis placed on patient-reported outcomes and the push for technology adoption in clinical trials have resulted in a significant increase in ePRO use. The survey also points out an overwhelming increase in ePRO usage in the oncology field. The main drawback has been the cost of using ePRO compared to paper.
However, the higher ePRO adoption trend is likely to continue as more companies see its value for post-marketing trials, as well as its benefits and capabilities expanding as more vendors enter the space each year.
With increased regulatory requirements and the trend toward personalized medicine, sponsor companies and CROs need access to more specific solutions to meet their needs, making systems integration an increasing necessity for a successful clinical trial. In addition, risk management across the product's life cycle involves investigators, regulators, and patients. This is where systems integration comes in, ensuring data is more accurate and consistent.
Leveraging technology to optimize the speed, quality, and cost of clinical trials is a big hurdle for pharma companies and their CRO partners. Bringing drugs and medical devices to market faster is critical for business success. CROs are quickly realizing that in order to remain competitive, they need IT infrastructure that can accommodate an influx of clinical data, well organized and easily accessible from a central data warehouse.
This warehouse should handle the integration, reporting, management, visualization and analysis of all clinical data. For example, an integrated system, comprised of custom clinical trial management systems (CTMS), Pharmacovigilance, EDC and a Clinical Data Interchange Standards Consortium (CDISC)-compliant data warehouse enables the timely analysis of clinical data. Traditional integration between EDC, CTMS, clinical data repositories (CDR), clinical data management systems (CDMS) and statistical analysis systems (SAS) may require a lot of manual data sharing. While many sponsors can afford to transcribe data in the right format before sending it to their CROs, smaller companies still struggle to prepare their data for FDA submission.
That is why integration is crucial for both clinical trial sponsors and CROs to exchange data during all trial phases. Big pharma reportedly spends close to $200 million annually on data transfer. But new trends are emerging to replace the old practice of withholding data transfer until all collection is done. More and more trials are now conducted with data moving in the early phases. This approach allows managers to spot and resolve potential problems early, saving money and time. Another time- and money-saving trend is following CDISC's Clinical Data Acquisition Standards Harmonization (CDASH) data submission fields, which saves companies from having to restructure their data during the drug-approval process. CROs must also follow data aggregation formats such as the Study Data Tabulation Model (SDTM).
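The restructuring that SDTM requires can be illustrated with a minimal mapping of raw demographics data into an SDTM-style Demographics (DM) domain record. STUDYID, DOMAIN, USUBJID, AGE, and SEX are standard SDTM DM variables; the raw field names are hypothetical, and a real mapping involves many more variables and controlled terminology:

```python
def to_sdtm_dm(study_id, raw):
    """Map a raw demographics record to SDTM DM domain variables."""
    return {
        "STUDYID": study_id,
        "DOMAIN": "DM",
        # USUBJID must be unique across the entire submission,
        # so it combines the study and subject identifiers.
        "USUBJID": f"{study_id}-{raw['subject']}",
        "AGE": raw["age"],
        "SEX": raw["sex"].upper()[:1],   # controlled terminology: M/F/U
    }

dm = to_sdtm_dm("ABC-301", {"subject": "0042", "age": 57, "sex": "female"})
```

Collecting data in CDASH-aligned fields from the outset makes mappings like this nearly mechanical, which is precisely the restructuring cost the trend avoids.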
Whether you choose EDC and CTMS, eTMF and safety, or EHR integration, there is no one-stop-shop solution. For example, CTMS solutions such as Advanced Clinical Software's Study Manager have been installed at over 2,000 sites, but there are still no defined metadata and communication standards that allow CTMS and EDC solutions to share data. A common issue with EDC-CTMS integration occurs when investigative sites have complex business practices. Most EDC systems only capture clinical trial data through eCRFs, which lack CTMS information. Another issue is that some EDC systems may lack timeline planning features, such as tracking target subject recruitment milestones. As for eTMF and safety integration, a common issue is the lack of real-time inspection and ICH/GCP compliance.
One way to improve this process is to focus on analyzing data, not just warehousing it. Most companies focus only on front-end integration without considering the need to generate reports for regulators later. If data were integrated from the start, it would be easily accessible at any point. However, this is easier said than done, as implementing systems integration is estimated to cost about $500K and take 3-6 months, which can amount to nearly 10% of the research budget.
Collaboration and consolidation among front-end and back-end systems, as well as the emergence of advanced eClinical systems or modules, shows that the value of integrating will only continue to grow as users see the efficiency in storing and viewing their data on a single interface.
Last December the FDA published a landmark package of guidances, specifications, and other documents governing electronic submissions. These have the force of law, which in effect made the use of CDISC standards mandatory in the United States and Japan by December 2016.
The FDA Guidances establish the framework for the requirement of standardized study data in submissions, and cover most aspects of submission data and documentation.
The new mandate adds pressure on CROs and sponsors to conform. In addition, the FDA is also considering updating the CDASH standards, which would have a significant impact on current clinical trial processes.
A paradigm shift is taking place in the oncology clinical trial space, partially as a result of the Obama Administration launching its “Precision Medicine Initiative” earlier this year. Precision medicine is an innovative approach that takes into account individual differences in people’s genes, environments, and lifestyles.
According to a White House release, a $215 million investment in the President’s 2016 Budget will be allocated to the Precision Medicine Initiative to pioneer this patient-powered research and provide clinicians with new tools, knowledge and therapies to select the treatments that work best for their patients.
The funding will be spread out between the National Institutes of Health (NIH), the Office of the National Coordinator for Health Information Technology (ONC), and the FDA.
The objective for the National Cancer Institute (NCI) is to accelerate the design and testing of tailored treatments for cancer by expanding genetically based clinical cancer trials. In June of this year, the NCI announced the launch of a nationwide clinical trial utilizing DNA sequencing: subjects are grouped based on similarities in their genetic mutations, not the location of their cancer. This grouping approach is known as a “basket trial”. In the study, a few thousand patients at 2,400 sites throughout the United States will be sorted into over a dozen treatments based on their tumor’s mutation.
The American Society of Clinical Oncology also recently announced the launch of a project that will provide patients with drugs targeting similar molecular abnormalities, and collect the data from their oncologists in order to monitor the effectiveness of the treatments.
The National Institutes of Health (NIH), in collaboration with other agencies and stakeholders, will launch a national, patient-powered research cohort of over a million Americans who volunteer to participate in research.
The trial subjects will be involved in the design of the Initiative and will have the opportunity to contribute various data—including medical records; profiles of the patient’s genes, metabolites (chemical makeup), and microorganisms in and on the body; environmental and lifestyle data; patient-generated information; and personal device and sensor data.
The Initiative will include reviewing the current regulatory landscape to determine whether changes are needed to support the development of this new research and care model, including its critical privacy and participant protection framework. As part of this effort, the FDA will develop a new approach for evaluating Next Generation Sequencing technologies — tests that rapidly sequence large segments of a person’s DNA, or even their entire genome.