TUESDAY, May 31 (HealthDay News) -- Biomarker effects are often overestimated in highly cited studies compared to the effects reported in subsequent meta-analyses of the same associations, according to a review published in the June 1 issue of the Journal of the American Medical Association.
John P.A. Ioannidis, M.D., D.Sc., of the Stanford University School of Medicine in California, and Orestis A. Panagiotou, M.D., of the University of Ioannina School of Medicine in Greece, investigated the accuracy of biomarker effect sizes in 35 highly cited studies. Biomarker studies were included in the analysis if they had received more than 400 citations in the ISI Web of Science through December 2010 and had been published in one of 24 highly cited biomedical journals. The effect size of each biomarker in the highly cited study (based on the relative risk reported in the abstract) was compared with the relative risk in the largest study of the same biomarker and outcome, and with the summary relative risk from the corresponding meta-analysis.
The investigators found that, for 30 of the associations, the effect estimate was stronger in the highly cited study than in the largest study. In three cases, the highly cited study was itself the largest study, and in two cases the largest study had a stronger effect estimate than the highly cited study. For 29 of the 35 associations, the effect size was smaller in the corresponding meta-analysis than in the highly cited study. Based on the largest studies, only 15 associations were nominally statistically significant, and only seven had a relative risk point estimate greater than 1.37.
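The comparison the authors describe can be sketched in a few lines of code. The biomarker names and relative risks below are hypothetical placeholders, not data from the study; the sketch only illustrates the logic of checking a highly cited study's reported relative risk against the largest study and the meta-analysis summary.

```python
# Hypothetical sketch: for each biomarker association, flag it as
# "overestimated" when the relative risk (RR) reported in the highly
# cited study exceeds both the RR in the largest study and the summary
# RR from the meta-analysis. All values are illustrative only.

associations = [
    # (name, RR in highly cited study, RR in largest study, meta-analysis RR)
    ("marker_A", 2.10, 1.40, 1.30),
    ("marker_B", 1.80, 1.90, 1.50),
    ("marker_C", 1.25, 1.10, 1.05),
]

overestimated = [
    name
    for name, rr_cited, rr_largest, rr_meta in associations
    if rr_cited > rr_largest and rr_cited > rr_meta
]

print(overestimated)  # marker_A and marker_C exceed both comparators
```

In the actual analysis, far more refined statistics were involved (confidence intervals, standardized comparisons), but the core question, whether the first prominent report exaggerates relative to later, larger evidence, follows this shape.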
"Results in highly cited biomarker studies often significantly overestimate the findings seen from meta-analyses," the authors write.