New FDA draft guidance document gets real about RWD use

By Jenni Spinner


(filo/iStock via Getty Images Plus)


A representative from IQVIA discusses the particulars of the US agency’s new draft document and shares what it might mean for clinical trial data evaluation.

The US Food and Drug Administration recently issued a draft guidance entitled "Real-World Data: Assessing Electronic Health Records and Medical Claims Data To Support Regulatory Decision-Making for Drug and Biological Products." The title is a mouthful, and the document itself tops out at 39 pages.

Outsourcing-Pharma spoke with two experts from IQVIA, who offered some insights about the draft guidance and its potential impact on clinical trial data:

  • Matthew W. Reynolds, vice president of real-world evidence, IQVIA
  • Nancy Dreyer, chief scientific officer, IQVIA Real-World Solutions 

OSP: Could you please talk about the evolution of the FDA’s understanding/advice around the use of EHR and other medical data?

IQ: The advent of this guidance indicates that the FDA continues to recognize the value of leveraging real-world data, such as electronic health records and medical claims data, to gain insights on the usage, safety, and effectiveness of drugs and biological products. Opportunities for real-world data usage grew rapidly during the pandemic, when there was an urgent need for public health knowledge from reliable, near-real-time data analytics.

The FDA has historically considered and leveraged real-world data in its decision-making, but always on a case-by-case basis via individual project justification. A prior guidance from 2013 focused on safety, whereas this guidance expands to both safety and effectiveness and dives much deeper into guiding principles for optimal study approaches.

This draft guidance sets the stage for guiding principles on good practices for assessing the acceptability and use of real-world data (RWD) in future regulatory opportunities. The guidance lays out what the FDA considers important in evaluating EHR and health insurance claims data and in assessing whether a specific RWD source is fit for purpose, that is, whether it meets the needs of the specific research or study objectives. It calls not only for describing a dataset but also for understanding why it was created, whom it represents, how long a period it is likely to cover for study subjects, and how complete the information is for elements of study-specific interest.

OSP: Please share a summary of what's new in this latest guidance. Feel free to talk about gaps in previous guidance documents that it seeks to correct, and other ways it represents a way forward.

IQ: This guidance provides insights into how to evaluate data sources to appropriately address regulatory study questions. It provides guidance on the development and validation of definitions for study design elements (e.g., exposure, outcomes, covariates) as well as considerations of data provenance and quality during data accrual, data curation, and into the final study dataset. It is not a document that provides specific recommendations on any specific type of data source, methodology, study design, or statistical analysis, keeping the door open for the evolution of methods and tools.

Among the new ideas presented here are requirements to explain computable phenotypes and any machine learning tools used. The guidance also emphasizes the value of data linkage, recognizing that not all desired data may be available in a single source and that reliable linkage can fill some of those gaps.
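As an illustration of what a computable phenotype can look like in practice, here is a minimal sketch, not drawn from the guidance itself, expressing a simplified phenotype as executable logic over structured EHR fields; the diagnosis codes, drug names, and lab threshold are hypothetical placeholders and would need clinical validation in any real study.

```python
# Illustrative sketch only: a simplified computable phenotype for type 2 diabetes
# built from structured EHR fields. Codes and thresholds are hypothetical.

T2D_DIAGNOSIS_CODES = {"E11.9", "E11.65"}      # example ICD-10-CM codes
T2D_MEDICATIONS = {"metformin", "glipizide"}   # example drug names
HBA1C_THRESHOLD = 6.5                          # percent

def meets_phenotype(patient: dict) -> bool:
    """Return True if the record has a qualifying diagnosis code plus either
    a qualifying medication or an HbA1c result at or above the threshold."""
    has_dx = bool(T2D_DIAGNOSIS_CODES & set(patient.get("diagnosis_codes", [])))
    has_rx = bool(T2D_MEDICATIONS & {m.lower() for m in patient.get("medications", [])})
    has_lab = any(lab >= HBA1C_THRESHOLD for lab in patient.get("hba1c_results", []))
    return has_dx and (has_rx or has_lab)

# Example usage with a mock patient record
patient = {
    "diagnosis_codes": ["E11.9"],
    "medications": ["Metformin"],
    "hba1c_results": [7.1],
}
print(meets_phenotype(patient))  # True
```

The point of writing the definition this way is that it can be documented, shared, and re-run exactly as specified, which is what the guidance asks researchers to explain.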

OSP: Also, what role has the FDA and other regulators played in leading or starting important discussions around RWD?

IQ: The FDA has long relied on RWD to evaluate safety after new product launch. As therapeutic interventions have become more complicated, such as the combinations and sequences of medications used in chemotherapy, the FDA and other regulators, including the European Medicines Agency and China's National Medical Products Administration (NMPA), have come to rely on RWD to evaluate these combinations in terms of both their safety and their effectiveness.

The move to accurate evaluation of clinical effectiveness using RWD requires sophisticated study designs and analysis, a movement stimulated in part by the need of the Centers for Medicare & Medicaid Services (CMS) to evaluate how well treatments work for its covered population, recognizing that the elderly, children, and those with many comorbidities are not generally studied in typical clinical trials. CMS's collaboration with the US Agency for Healthcare Research and Quality led to the creation of two popular how-to guidebooks, one on patient registries and the other on "Developing a protocol for observational comparative effectiveness research," which was recently cited as good practice guidance by both the FDA and China's NMPA.

The FDA was also active in the drive to leverage real-world data during the COVID-19 pandemic, collaborating in Evidence Accelerators that pulled insights, RWD, and analytical results from a broad spectrum of health professionals (clinicians, data providers, pharmaceutical researchers, epidemiologists, statisticians, health plans, government officials, and more) through real-time weekly interactions that supported widespread participation. The goal of these meetings and collaborations was to gain timely insights into how to understand, manage, and treat COVID-19 effectively and safely using the data at hand in medical claims, electronic health records, registries, and other real-world data sources.

OSP: Do you have any advice for researchers in prepping submissions with RWD? How can they evaluate the fitness for purpose?

Matthew Reynolds, VP of real-world evidence, IQVIA

IQ: Once you have a well-thought-out plan, make sure you engage in proactive discussion with the FDA about the approach and data source. The agency invites and recommends this type of discussion and will provide non-binding but informative feedback on whether the plan is likely to yield information reliable enough to be considered seriously, and on whether any gaps should be addressed or the plan strengthened.

Regarding the assessment of fitness for purpose, there are several key factors to consider. First, make sure you understand the data source, where it comes from, and its strengths and limitations. Be as critical as possible in your vetting, since others surely will be! Assess where data may be incomplete or less complete and consider why that might be, as well as any particular areas of weakness that could jeopardize study interpretation.

  • Does it invite bias or concerns around the generalizability and accuracy of the study results?
  • Does the data set have the level of detail that you’ll need to conduct a high-quality study of a particular medical product and outcome?
  • Can you define your cohort, key variables, and outcomes accurately and consistently with your study goals and clinical expectations?
  • Does the data set have enough of the components of data that are essential for a given study purpose (e.g., hospitalizations, mortality, chart review for outcome validation)?

Be transparent about the limitations of the data and make an honest critical assessment of whether it is sufficiently fit for purpose to advance clinical understanding for the study question.
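As one way to operationalize the checklist above, the following minimal sketch shows how a researcher might screen a claims or EHR extract for completeness of study-critical fields; the field names and thresholds are hypothetical, and a real fitness-for-purpose assessment would go well beyond simple missingness counts.

```python
# Illustrative sketch only: a simple completeness check over a claims/EHR extract
# loaded with pandas. Field names and thresholds are hypothetical placeholders
# for the study-specific elements a protocol would actually define.
import pandas as pd

REQUIRED_FIELDS = {
    "exposure_drug_code": 0.95,      # minimum acceptable non-missing share
    "outcome_hospitalization": 0.90,
    "death_date": 0.80,
    "comorbidity_codes": 0.85,
}

def completeness_report(df: pd.DataFrame) -> pd.DataFrame:
    """Report the non-missing share of each required field and flag gaps."""
    rows = []
    for field, threshold in REQUIRED_FIELDS.items():
        present = df[field].notna().mean() if field in df.columns else 0.0
        rows.append({
            "field": field,
            "completeness": round(present, 3),
            "threshold": threshold,
            "meets_threshold": present >= threshold,
        })
    return pd.DataFrame(rows)

# Example usage with a small mock extract
extract = pd.DataFrame({
    "exposure_drug_code": ["A10BA02", None, "A10BA02"],
    "outcome_hospitalization": [1, 0, 1],
    "death_date": [None, None, "2021-06-01"],
    "comorbidity_codes": ["I10", "E78.5", None],
})
print(completeness_report(extract))
```

A report like this only flags candidate gaps; interpreting why a field is incomplete, and whether that threatens the study question, remains a scientific judgment.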

Researchers can use this guidance to help assess fitness for purpose, but other published guides also walk through that process (e.g., Reynolds MW, Bourke A, Dreyer NA. Considerations when evaluating real-world data quality in the context of fitness for purpose. Pharmacoepidemiol Drug Saf. 2020;29(10):1316-1318; and Simon G, Bindman A, Dreyer N, et al. When can we trust real-world data to evaluate new medical treatments? Clin Pharmacol Ther. 2022;111(1):24-29).

Use your trusted experts in real-world data to ensure you have a defensible and high-quality solution.

OSP: What tech solutions can researchers employ to help them along the way?

Nancy Dreyer, chief scientific officer, IQVIA Real-World Solutions

IQ: The guidance provides several suggestions for validating the disease in the cohort of interest, the treatment itself, the key study variables and covariates, and the outcomes of interest. It recommends supplementing with clinical text and/or data linkage. While each of these may prove difficult to implement in many real-world data sources, many sources do allow researchers to dig deeper into the data and/or link to additional datasets.

There are a number of technology approaches currently used by several large research organizations to accurately link separate, complementary datasets (e.g., tokenization to link medical claims to EHR data or to detailed inpatient datasets). If additional textual detail is available in the EHR or in traditional medical charts, natural language processing, machine learning, and artificial intelligence can be used to garner additional details and allow the creation of sophisticated phenotypes for study, a trend that will be increasingly important as we advance the science of precision medicine.
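To illustrate the tokenization idea in the simplest possible terms, here is a hypothetical sketch of salted-hash linkage between two de-identified extracts; commercial tokenization services use far more sophisticated, privacy-preserving matching, and the field names and salt handling here are placeholders.

```python
# Illustrative sketch only: tokenization-style linkage of two datasets by hashing
# normalized identifiers with a salt, so records can be joined without exchanging
# direct identifiers. Field names are hypothetical.
import hashlib
import pandas as pd

SALT = "project-specific-secret"  # in practice, managed by the tokenization provider

def token(first: str, last: str, dob: str) -> str:
    """Derive a one-way token from normalized identifying fields."""
    key = f"{first.strip().lower()}|{last.strip().lower()}|{dob}|{SALT}"
    return hashlib.sha256(key.encode()).hexdigest()

claims = pd.DataFrame({
    "first": ["Ana"], "last": ["Lopez"], "dob": ["1960-02-01"], "rx_code": ["A10BA02"],
})
ehr = pd.DataFrame({
    "first": ["ana"], "last": ["Lopez "], "dob": ["1960-02-01"], "hba1c": [7.1],
})

for df in (claims, ehr):
    df["token"] = [token(f, l, d) for f, l, d in zip(df["first"], df["last"], df["dob"])]
    df.drop(columns=["first", "last", "dob"], inplace=True)  # drop direct identifiers

# Join the de-identified extracts on the shared token
linked = claims.merge(ehr, on="token", how="inner")
print(linked[["rx_code", "hba1c"]])
```

The design choice worth noting is that linkage happens on derived tokens rather than on names or dates of birth, which is what lets complementary datasets be combined while the direct identifiers stay behind.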

Additionally, there are numerous tools being used in decentralized clinical trials that are becoming useful in real-world research, including wearables and other digital devices as well as a variety of ways to collect and manage patient-reported data.

OSP: Anything to add?

IQ: This guidance forces both the researchers proposing the use of RWD and the stakeholders to thoroughly assess the data being proposed in the framework of the research initiative. It makes them dive in deeper than they might otherwise. It isn't just a top-level assessment of "this variable is in this dataset, and we have it in-house, let's use it." It forces more of a blank-slate approach: "what is the best type of data to answer our question, why, and is it accessible to us?" and then "this dataset seems like it would be a good choice; let's pressure test it and go through a full assessment before we commit."
