Authors
Enrique Vázquez,Joseph S. Ross,Cary P. Gross,Karla Childers,S. P. Bamford,Jessica Ritchie,Joanne Waldstreicher,Harlan M. Krumholz,Joshua D. Wallach
Abstract
Background/Aims: The reuse of clinical trial data available through data-sharing platforms has grown over the past decade. Several prominent clinical data-sharing platforms require researchers to submit formal research proposals before granting data access, providing an opportunity to evaluate how published analyses compare with initially proposed aims. We evaluated the concordance between the included trials, study objectives, endpoints, and statistical methods specified in researchers’ clinical trial data use request proposals to four clinical data-sharing platforms and their corresponding publications.

Methods: We identified all unique data request proposals with at least one corresponding peer-reviewed publication as of 31 March 2023 on four prominent clinical trial data-sharing platforms (Vivli, ClinicalStudyDataRequest.com, the Yale Open Data Access Project, and Supporting Open Access to Researchers–Bristol Myers Squibb). When data requests had multiple publications, we treated each publication–request pair as a unit. For each pair, the trials requested and analyzed were classified as fully concordant, discordant, or unclear, whereas the study objectives, primary and secondary endpoints, and statistical methods were classified as fully concordant, partially concordant, discordant, or unclear. For Vivli, ClinicalStudyDataRequest.com, and Supporting Open Access to Researchers–Bristol Myers Squibb, endpoints of publication–request pairs were not compared because the data request proposals on these platforms do not consistently report this information.

Results: Of 117 Vivli publication–request pairs, 76 (65.0%) were fully concordant for the trials requested and analyzed, 61 (52.1%) for study objectives, and 57 (48.7%) for statistical methods; 35 (29.9%) pairs were fully concordant across the three characteristics reported by all platforms.
Of 106 ClinicalStudyDataRequest.com publication–request pairs, 66 (62.3%) were fully concordant for the trials requested and analyzed, 41 (38.7%) for study objectives, and 35 (33.0%) for statistical methods; 20 (18.9%) pairs were fully concordant across the three characteristics. Of 65 Yale Open Data Access Project publication–request pairs, 35 (53.8%) were fully concordant for the trials requested and analyzed, 44 (67.7%) for primary study objectives, and 25 (38.5%) for statistical methods; 15 (23.1%) pairs were fully concordant across the three characteristics. In addition, 26 (40.0%) and 2 (3.1%) Yale Open Data Access Project publication–request pairs were concordant for primary and secondary endpoints, respectively, so only one (1.5%) Yale Open Data Access Project publication–request pair was fully concordant across all five characteristics reported. Of three Supporting Open Access to Researchers–Bristol Myers Squibb publication–request pairs, one (33.3%) was fully concordant for the trials requested and analyzed, two (66.7%) for primary study objectives, and two (66.7%) for statistical methods; one (33.3%) pair was fully concordant across all three characteristics reported by all platforms.

Conclusion: Across four clinical data-sharing platforms, data request proposals were often discordant with their corresponding publications, with fewer than 25% of pairs fully concordant across all three key proposal characteristics reported by each platform. Investigators could describe any deviations from their data-sharing request proposals in their publications, and platforms could enhance the reporting of key study characteristics in proposals.