Quantifying the development of agreement among experts in Delphi studies
Authors: J.V. Meijering, J.K. Kampen, H. Tobi
Affiliation: Research Methodology Group, Social Sciences, Wageningen University, Hollandseweg 1, 6706 KN Wageningen, The Netherlands
Abstract: Delphi studies are often conducted with the aim of achieving consensus or agreement among experts. However, many Delphi studies fail to offer a concise interpretation of what consensus or agreement means. Although several statistical operationalizations of agreement exist, hardly any of these indices are used in Delphi studies. In this study, computer simulations were used to examine different indices of agreement under different Delphi scenarios. A distinction was made between consensus indices (e.g., the Demoivre index), agreement indices (e.g., Cohen's kappa and generalizations thereof), and association indices (e.g., Cronbach's alpha and the intraclass correlation coefficient). Delphi scenarios were created by varying the number of objects, the number of experts, the distribution of object ratings, and the degree to which agreement increased between subsequent rounds. Each scenario consisted of three rounds and was replicated 1000 times. The simulation study showed that, for the same data, different indices suggest different levels of agreement as well as different levels of change in agreement between rounds. In applied Delphi studies, researchers should be more transparent about their choice of agreement index and should report the value of the chosen index in every round, so as to provide insight into how the suggested level of agreement develops across rounds.
Keywords: Delphi; Consensus; Agreement; Association; Methodology; Simulation
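The following is a minimal sketch (not the authors' simulation code) of the kind of setup the abstract describes: expert ratings are simulated over three rounds with decreasing noise so that agreement rises, and two indices from different families are computed per round, an agreement index (Fleiss' kappa) and an association index (Cronbach's alpha). All parameter values (panel size, number of objects, noise levels) are illustrative assumptions, chosen only to show that the two indices can suggest different levels of agreement on the same data.

```python
import numpy as np

rng = np.random.default_rng(42)

N_OBJECTS = 20      # assumed number of rated objects
N_EXPERTS = 10      # assumed panel size
N_CATEGORIES = 5    # 5-point rating scale, categories 0..4
NOISE_PER_ROUND = [1.5, 1.0, 0.5]  # shrinking noise -> rising agreement across rounds


def simulate_round(true_scores, noise_sd):
    """Each expert's rating = object's 'true' score + noise, rounded and clipped to the scale."""
    noise = rng.normal(0.0, noise_sd, size=(N_OBJECTS, N_EXPERTS))
    ratings = np.clip(np.rint(true_scores[:, None] + noise), 0, N_CATEGORIES - 1)
    return ratings.astype(int)


def fleiss_kappa(ratings):
    """Fleiss' kappa for categorical ratings (objects x experts): chance-corrected agreement."""
    n_objects, n_raters = ratings.shape
    counts = np.zeros((n_objects, N_CATEGORIES))
    for j in range(N_CATEGORIES):
        counts[:, j] = (ratings == j).sum(axis=1)
    p_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()                                   # mean observed agreement per object
    p_j = counts.sum(axis=0) / (n_objects * n_raters)    # marginal category proportions
    p_e = np.sum(p_j ** 2)                               # expected agreement by chance
    return (p_bar - p_e) / (1 - p_e)


def cronbach_alpha(ratings):
    """Cronbach's alpha, treating experts as 'items' and objects as 'cases' (association, not agreement)."""
    n_items = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)
    total_var = ratings.sum(axis=1).var(ddof=1)
    return n_items / (n_items - 1) * (1 - item_vars.sum() / total_var)


true_scores = rng.uniform(0, N_CATEGORIES - 1, size=N_OBJECTS)
for round_no, sd in enumerate(NOISE_PER_ROUND, start=1):
    r = simulate_round(true_scores, sd)
    print(f"round {round_no}: Fleiss' kappa = {fleiss_kappa(r):.2f}, "
          f"Cronbach's alpha = {cronbach_alpha(r):.2f}")
```

Typically the association index (alpha) is already high in early rounds, because experts rank the objects similarly even when their absolute ratings still differ, while the agreement index (kappa) starts lower and rises as ratings converge, which is the kind of divergence between index families the study points to.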