Interrater reliability
Checklist
Statistical
Cohen's kappa
Kappa
Confidence interval
Medicine
Statistics
Replication
Psychology
Clinical psychology
Mathematics
Rating scale
Geometry
Cognitive psychology
Authors
Damian Hoy, Peter Brooks, Anthony D. Woolf, Fiona Blyth, Lyn March, Chris Bain, Peter Baker, Emma Smith, Rachelle Buchbinder
Identifiers
DOI:10.1016/j.jclinepi.2011.11.014
Abstract
In the course of performing systematic reviews on the prevalence of low back and neck pain, we required a tool to assess the risk of study bias. Our objectives were to (1) modify an existing checklist and (2) test the final tool for interrater agreement. The final tool consists of 10 items addressing four domains of bias plus a summary risk of bias assessment. Two researchers tested the interrater agreement of the tool by independently assessing 54 randomly selected studies. Interrater agreement overall and for each individual item was assessed using the proportion of agreement and Kappa statistic. Raters found the tool easy to use, and there was high interrater agreement: overall agreement was 91% and the Kappa statistic was 0.82 (95% confidence interval: 0.76, 0.86). Agreement was almost perfect for the individual items on the tool and moderate for the summary assessment. We have addressed a research gap by modifying and testing a tool to assess risk of study bias. Further research may be useful for assessing the applicability of the tool across different conditions.
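The two agreement measures used in the abstract, the proportion of observed agreement and Cohen's kappa, can be sketched as follows. This is an illustrative computation only; the ratings below are hypothetical and are not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Return (proportion of agreement, Cohen's kappa) for two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion
    of agreement and p_e is the agreement expected by chance, computed
    from each rater's marginal category frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the product of marginal frequencies
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return p_o, (p_o - p_e) / (1 - p_e)

# Hypothetical "low"/"high" risk-of-bias ratings for 10 studies
a = ["low", "low", "high", "low", "high", "low", "low", "high", "low", "low"]
b = ["low", "low", "high", "high", "high", "low", "low", "high", "low", "low"]
p_o, kappa = cohens_kappa(a, b)  # p_o = 0.9, kappa ≈ 0.783
```

Note that kappa is lower than the raw agreement because it discounts the agreement expected by chance alone, which is why the abstract reports both figures.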