Agreement between an isolated rater and a group of raters
Authors: S. Vanbelle, A. Albert
Affiliation: Medical Informatics and Biostatistics, University of Liège, CHU Sart Tilman (B23), 4000 Liège, Belgium
Abstract: Agreement between two raters judging items on a categorical scale is traditionally assessed by Cohen's kappa coefficient. We introduce a new coefficient for quantifying the degree of agreement between an isolated rater and a group of raters on a nominal or ordinal scale. The group of raters is regarded as a whole, a reference or gold-standard group with its own heterogeneity. The coefficient, defined within a population-based model, requires a specific definition of the concept of perfect agreement. It has the same properties as Cohen's kappa coefficient and reduces to the latter when the group contains only one rater. The new approach avoids the need for consensus within the group of raters and generalizes Schouten's index. The method is illustrated on published syphilis data and on data from a study assessing the diagnostic reasoning ability of medical students compared with expert knowledge.
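For background, Cohen's kappa for two raters is defined as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e the agreement expected by chance from the two raters' marginal category distributions. The sketch below computes this classical two-rater coefficient only; the function name and example data are illustrative assumptions, and the new group coefficient introduced in the paper is not reproduced here because the abstract does not give its formula.

import numpy as np

def cohens_kappa(ratings_a, ratings_b, categories):
    # Cohen's kappa: chance-corrected agreement (p_o - p_e) / (1 - p_e).
    a = np.asarray(ratings_a)
    b = np.asarray(ratings_b)
    # p_o: observed proportion of items on which the two raters agree.
    p_o = np.mean(a == b)
    # p_e: chance agreement, summing products of marginal proportions per category.
    p_e = sum(np.mean(a == k) * np.mean(b == k) for k in categories)
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical example: two raters classify 10 items into categories 0, 1, 2.
rater1 = [0, 1, 2, 1, 0, 2, 1, 0, 2, 1]
rater2 = [0, 1, 2, 0, 0, 2, 1, 1, 2, 1]
print(cohens_kappa(rater1, rater2, categories=[0, 1, 2]))

On this example the raters agree on 8 of 10 items (p_o = 0.8) while chance alone would yield p_e = 0.34, giving kappa of roughly 0.70.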
Keywords: kappa coefficient; nominal scale; ordinal scale; expert group