Assessment of Consistency between Peer-Reviewed Publications and Clinical Trial Registries

Lynn W. Sun, Daniel J. Lee, Jamie A. Collins, Timothy C. Carll, Khalid Ramahi, Scott J. Sandy, Jackson G. Unteriner, David V. Weinberg

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

Importance: Clinical trial registries are intended to increase clinical research transparency by nonselectively identifying and documenting clinical trial designs and outcomes. Inconsistencies in reported data undermine the utility of such registries and have previously been noted in the general medical literature.

Objective: To assess whether inconsistencies in reported data exist between the ophthalmic literature and clinical trial registries.

Design, Setting, and Participants: In this retrospective, cross-sectional study, interventional clinical trials published from January 1, 2014, to December 31, 2014, in the American Journal of Ophthalmology, JAMA Ophthalmology, and Ophthalmology were reviewed. Observational, retrospective, uncontrolled, and post hoc reports were excluded, yielding a sample of 106 articles. Data collection was performed from January through September 2016; data review and adjudication continued through January 2017.

Main Outcomes and Measures: Where possible, articles were matched to registry entries listed in the ClinicalTrials.gov database or in 1 of 16 international registries indexed by the World Health Organization International Clinical Trials Registry Platform version 3.2 search engine. Each article-registry pair was assessed for inconsistencies in design, results, and funding (each further divided into subcategories) by 2 reviewers and adjudicated by a third.

Results: Of the 106 trials that met the study criteria, matching registry entries were found for 68 (64.2%), whereas no matching registry entries were found for 38 (35.8%). Among the 68 matched pairs, inconsistencies were identified in study design, study results, and funding sources, including specific interventions in 8 (11.8%), primary outcome measure (POM) designs in 32 (47.1%), and POM results in 48 (70.6%). In addition, numerous data elements were unreported, including analysis methods in 52 (76.5%) and POM results in 38 (55.9%).

Conclusions and Relevance: Clinical trial registries were underused in this sample of ophthalmology clinical trials. For studies with registry data, rates of inconsistency between published and registered data were similar to those previously reported for the general medical literature. Most inconsistencies involved missing data, but explicit discrepancies in methods and/or data were also found. Transparency and credibility of published trials may be improved by closer attention to their registration and reporting.
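Note that the percentages above rest on two different denominators: registry-match rates are fractions of all 106 eligible trials, while inconsistency and non-reporting rates are fractions of the 68 matched article-registry pairs. A minimal Python sketch (not part of the original record), using only the counts reported in the abstract, makes this explicit:

# Sketch: recompute the abstract's percentages to show the denominators used.
matched, unmatched = 68, 38
total = matched + unmatched  # 106 trials met the study criteria

# Match rates are reported against all 106 eligible trials.
print(f"Matched:   {matched}/{total} = {100 * matched / total:.1f}%")      # 64.2%
print(f"Unmatched: {unmatched}/{total} = {100 * unmatched / total:.1f}%")  # 35.8%

# Inconsistency and non-reporting rates are reported against the 68 matched pairs.
counts = {
    "Intervention inconsistencies": 8,   # 11.8%
    "POM design inconsistencies": 32,    # 47.1%
    "POM result inconsistencies": 48,    # 70.6%
    "Analysis methods unreported": 52,   # 76.5%
    "POM results unreported": 38,        # 55.9%
}
for label, n in counts.items():
    print(f"{label}: {n}/{matched} = {100 * n / matched:.1f}%")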

Original language: English (US)
Pages (from-to): 552-556
Number of pages: 5
Journal: JAMA Ophthalmology
Volume: 137
Issue number: 5
DOIs
State: Published - May 2019
Externally published: Yes

ASJC Scopus subject areas

  • Ophthalmology
