
Re-analysis of clinical trial data can change conclusions in one-third of studies

Date:
September 9, 2014
Source:
Stanford University Medical Center
Summary:
As many as one-third of previously published randomized clinical trials could be re-analyzed in ways that modify the conclusions of how many or what types of patients need to be treated, according to a new study.

As many as one-third of previously published randomized clinical trials could be re-analyzed in ways that modify the conclusions of how many or what types of patients need to be treated, according to a new study by researchers at the Stanford University School of Medicine.

A culture that fails to encourage data sharing makes such re-analysis of the data extremely rare, the researchers said. They were able to identify only 37 published re-analyses over more than three decades of research. Of these, only five were conducted by researchers who were not associated with the original studies.

The new study will be published Sept. 9 in the Journal of the American Medical Association.

"There is a real need for researchers to provide access to their raw data for others to analyze," said John Ioannidis, MD, DSc, professor of medicine and director of the Stanford Prevention Research Center. "Without this access, and possibly incentives to perform this work, there is increasing lack of trust in whether the results of published, randomized trials are credible and can be taken at face value. The recent hot debates about whether oseltamivir works are only the tip of the iceberg in this crisis of confidence."

Oseltamivir is an antiviral medication marketed under the trade name Tamiflu. Although it is licensed to treat influenza A and influenza B, some analyses and trials conducted after the drug was approved have suggested that its benefits do not outweigh the risks of side effects in otherwise healthy adults.

Ioannidis is the senior author of the study. Postdoctoral scholar Shanil Ebrahim, PhD, is the lead author. Ioannidis is co-director of the recently launched Meta-Research Innovation Center at Stanford, or METRICS, which aims to advance excellence in scientific research by evaluating and optimizing scientific practices. Enhancing reproducibility and data sharing could be instrumental in this regard.

Ebrahim and his colleagues conducted their study using MEDLINE, a bibliographic database maintained by the National Library of Medicine that contains over 25 million citations of biomedical publications from roughly 5,600 journals worldwide. They searched for articles written in English describing the re-analysis of raw data used in previously published studies. Meta-analyses were excluded, as were studies testing a different hypothesis from that of the original trial.

The researchers screened nearly 3,000 articles of potential interest and read the full text of 226. Of these, 38 were deemed eligible for their study. Two were subsequently excluded because the articles describing the original clinical trials on which they were based were unavailable; one of the remaining 36 articles reported two re-analyses, yielding a total of 37 re-analyses evaluated in the study. Of these 37, 32 shared at least one author with the original paper.

Thirteen of the re-analyses (35 percent of the total) came to conclusions that differed from those of the original trial with regard to who could benefit from the tested medication or intervention: Three concluded that the patient population to treat should be different from the one recommended by the original study; one concluded that fewer patients should be treated; and the remaining nine indicated that more patients should be treated.

The differences between the original trial studies and the re-analyses often occurred because the researchers conducting the re-analyses used different statistical or analytical methods, ways of defining outcomes or ways of handling missing data. Some re-analyses also identified errors in the original trial publication, such as the inclusion of patients who should have been excluded from the study.

The aims of the re-analyzed studies varied widely. For example, one study on the treatment of enlarged, bleeding veins in the esophagus concluded that sclerotherapy, in which physicians use an endoscope to inject the veins with chemicals to induce blood clots, reduced mortality even though it didn't prevent rebleeding. The re-analysis, which used a different statistical model of risk, concluded the treatment did prevent rebleeding but didn't reduce mortality. The new conclusion suggested that the intervention would be best given to patients at risk of rebleeding, rather than those at highest risk of death from the condition.

Another study investigated the best way to deliver a medication to stimulate the production of red blood cells in people with anemia by comparing a fixed dose administered once every three weeks with weight-based weekly dosing. In the re-analysis, the conclusion changed when investigators used an updated hemoglobin threshold level to determine when therapy should be initiated.

"The high proportion of re-analyses reaching different conclusions than the original papers may be partly an artifact," said Ioannidis, who is also the C.F. Rehnborg Professor in Disease Prevention. "By that I mean that, in the current environment, re-analyses that reach exactly the same results as the original would have great difficulty getting published. However, making the raw data of trials available for re-analyses is essential not only for re-evaluating whether the original claims were correct, but also for using these data to perform additional analyses of interest and combined analyses." In this way, existing raw data could be used to explore new clinical questions, and may sometimes eliminate the need to conduct new trials.

The fact that researchers conducting re-analyses often came to different conclusions doesn't indicate the original studies were necessarily biased or deliberately falsified, Ioannidis added. Instead, it emphasizes the importance of making the original data freely available to other researchers to encourage dialogue and consensus, and to discourage a culture of scientific research that rewards scientists only for novel or unexpected results.

"I am very much in favor of data sharing, and believe there should be incentives for independent researchers to conduct these kinds of re-analyses," said Ioannidis. "They can be extremely insightful."


Story Source:

Materials provided by Stanford University Medical Center. Original written by Krista Conger. Note: Content may be edited for style and length.


Journal Reference:

  1. Shanil Ebrahim, Zahra N. Sohani, Luis Montoya, Arnav Agarwal, Kristian Thorlund, Edward J. Mills, John P. A. Ioannidis. Reanalyses of Randomized Clinical Trial Data. JAMA, 2014; 312 (10): 1024 DOI: 10.1001/jama.2014.9646

