
Cohen's kappa is the most widely used agreement statistic in the literature. However, under certain conditions it is affected by a paradox that returns distorted estimates of the statistic itself [Guggenmoos-Holzmann I. The meaning of kappa: probabilistic concepts of reliability and validity revisited. J Clin Epidemiol 1996; 49(7): 775-782. doi:10.1016/0895-4356(96)00011-X]. According to Gwet [Gwet K. Kappa statistic is not satisfactory for assessing the extent of agreement between raters. Stat Methods Interrater Reliab Assess 2002; 1(6): 1-6.], the reason kappa statistics are exposed to the paradox lies in the inadequacy of formula (3) for the calculation of the expected agreement. Gwet therefore proposed the AC1 statistic [Gwet K. Handbook of inter-rater reliability: How to estimate the level of agreement between two or multiple raters. Gaithersburg; 2001.] as an alternative agreement measure to Cohen's kappa.
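
As a minimal sketch of the paradox (not taken from the article), the snippet below computes Cohen's kappa and Gwet's AC1 for two raters on a hypothetical 2x2 contingency table with highly skewed marginals. The counts, the function names `cohen_kappa` and `gwet_ac1`, and the choice of Python are illustrative assumptions; the chance-agreement term for kappa is the usual product-of-marginals expression (the expected agreement the text refers to as formula (3)), and the one for AC1 uses averaged category prevalences as in Gwet's proposal.

```python
# Sketch (hypothetical data): Cohen's kappa vs. Gwet's AC1 on a skewed 2x2 table.

def cohen_kappa(table):
    """Cohen's kappa for a square contingency table given as a list of lists of counts."""
    n = sum(sum(row) for row in table)
    q = len(table)
    p_o = sum(table[k][k] for k in range(q)) / n                       # observed agreement
    row = [sum(table[k]) / n for k in range(q)]                        # rater 1 marginals
    col = [sum(table[i][k] for i in range(q)) / n for k in range(q)]   # rater 2 marginals
    p_e = sum(row[k] * col[k] for k in range(q))                       # expected agreement (formula (3))
    return (p_o - p_e) / (1 - p_e)

def gwet_ac1(table):
    """Gwet's AC1 for the same table; chance agreement is based on averaged prevalences."""
    n = sum(sum(row) for row in table)
    q = len(table)
    p_o = sum(table[k][k] for k in range(q)) / n
    row = [sum(table[k]) / n for k in range(q)]
    col = [sum(table[i][k] for i in range(q)) / n for k in range(q)]
    pi = [(row[k] + col[k]) / 2 for k in range(q)]                     # average category prevalence
    p_e = sum(p * (1 - p) for p in pi) / (q - 1)                       # Gwet's chance-agreement term
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of 100 items: the raters agree on 90 of them,
# but nearly all items fall into the first category.
table = [[90, 5],
         [5, 0]]

print(f"Cohen's kappa: {cohen_kappa(table):.3f}")   # about -0.053 despite 90% observed agreement
print(f"Gwet's AC1:    {gwet_ac1(table):.3f}")      # about  0.889
```

With 90 of 100 items rated identically, kappa comes out slightly negative because the expected agreement computed from the skewed marginals is already about 0.905, whereas AC1 stays close to the observed agreement; this is the distortion the paradox refers to.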