Flaws in popular research method exposed
- Date: February 10, 2011
- Source: University of Leicester
Influential studies into subjects such as the safety and effectiveness of medicines or class size in schools could be called into question by a new report into ways of identifying research bias.
The report, by a leading statistician, identifies the danger of relying solely on published work in systematic literature reviews, a common research approach worldwide that is often used to inform public policy.
Such literature reviews may be flawed because research with positive findings is more likely to be published than work that is inconclusive or disproves a hypothesis, says Alex Sutton, Professor of Medical Statistics at the University of Leicester.
Using extensive data on trials into the efficacy of anti-depressants, Professor Sutton has evaluated statistical tools he developed to identify and compensate for the missing findings.
Each year, millions of pounds are spent on systematic reviews of evidence on a wide range of issues of interest to policy makers and academics. But according to Professor Sutton, from Leicester's Department of Health Sciences, the selection of material for publication and the way it is presented can lead to bias in such studies.
"Perhaps the greatest threat to the validity of a systematic review is the threat of publication bias," he says.
In his recent Inaugural Lecture entitled "Analysing the data you haven't got," Professor Sutton illustrated how he developed tools for identifying and quantifying bias in systematic reviews through work done on anti-depressants.
When he compared the research findings submitted to the US Food and Drug Administration (FDA) with results of the same trials published in scientific journals, he found many trials with less favourable results that had not been published. In other trials, the reporting of findings was biased towards positive results.
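For readers unfamiliar with how such bias can be detected from published results alone, the snippet below sketches one widely used approach, Egger's regression test for funnel-plot asymmetry. It is a generic illustration with invented numbers, not Professor Sutton's own tools or data.

```python
# A minimal sketch of one standard way to screen a set of published trial
# results for publication bias: Egger's regression test for funnel-plot
# asymmetry. Generic illustration only; the effect sizes and standard errors
# below are invented for demonstration.
import numpy as np
from scipy import stats

# Hypothetical per-trial effect estimates (e.g. standardised mean differences
# for a drug versus placebo) and their standard errors, as one might extract
# from journal publications.
effects = np.array([0.41, 0.35, 0.52, 0.48, 0.30, 0.60, 0.55, 0.44])
std_errs = np.array([0.10, 0.15, 0.22, 0.25, 0.12, 0.30, 0.28, 0.18])

# Egger's test regresses the standardised effect (effect / SE) on precision
# (1 / SE). In an unbiased literature the intercept should be near zero; a
# large, significant intercept suggests that smaller, less precise studies
# report systematically larger effects, a classic signature of publication bias.
precision = 1.0 / std_errs
standardised = effects / std_errs
res = stats.linregress(precision, standardised)

# Two-sided p-value for the intercept, using n - 2 degrees of freedom.
t_stat = res.intercept / res.intercept_stderr
p_value = 2 * stats.t.sf(abs(t_stat), df=len(effects) - 2)

print(f"Egger intercept: {res.intercept:.3f} (p = {p_value:.3f})")
```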
The statistical tools Professor Sutton developed performed very well in identifying and correcting the bias in the journal data.
"This gives confidence that the tools will be beneficial in topics where gold standard data is not available," he said.
"My work has been in the field of pharmaceutical products because I am in the School of Medicine, but the same bias can affect systematic reviews of published material in any sphere, be it the effect of class size in schools or the impact of divorce on children."
Story Source:
Materials provided by University of Leicester. Note: Content may be edited for style and length.