Kappa on a single item (Ksi) is proposed as a measure of interrater agreement when a single item or object is rated by multiple raters. A statistical test and Monte Carlo simulations are provided for testing the statistical significance of Ksi beyond chance agreement.
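The article's exact definition of Ksi is not given in this excerpt. As a rough, hypothetical illustration of the Monte Carlo approach described, the sketch below measures single-item agreement as the fraction of rater pairs assigning the same category, then estimates a p-value by simulating raters who choose categories uniformly at random (a simple chance model; the paper's chance model and statistic may differ).

```python
import random
from collections import Counter

def pairwise_agreement(ratings):
    """Fraction of rater pairs that assign the same category to the one item.

    Hypothetical agreement index, not the paper's Ksi formula.
    """
    n = len(ratings)
    pairs = n * (n - 1) / 2
    same = sum(c * (c - 1) / 2 for c in Counter(ratings).values())
    return same / pairs

def monte_carlo_p_value(ratings, categories, n_sim=10_000, seed=0):
    """Estimate P(agreement >= observed) under a uniform-chance model,
    i.e., each simulated rater picks a category independently at random."""
    rng = random.Random(seed)
    observed = pairwise_agreement(ratings)
    n = len(ratings)
    hits = sum(
        pairwise_agreement([rng.choice(categories) for _ in range(n)]) >= observed
        for _ in range(n_sim)
    )
    # Add-one correction keeps the estimated p-value strictly positive.
    return (hits + 1) / (n_sim + 1)

# Example: five raters all assign category "a" out of three categories.
p = monte_carlo_p_value(["a"] * 5, ["a", "b", "c"])
```

With unanimous agreement among five raters over three categories, the chance of that outcome under the uniform model is (1/3)^4 ≈ 0.012, so the simulated p-value falls well below 0.05.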