Abstract
We evaluated the interobserver reliability and intraobserver reproducibility of the Lichtman et al. classification for Kienböck’s disease by having four observers with different levels of experience independently stage 70 sets of wrist radiographs at different points in time. Paired comparisons of the observations showed agreement in 63% of cases, with a mean weighted kappa coefficient of 0.64, confirming interobserver reliability. The stage of the involved lunate was reproduced in 78% of the observations, with a mean weighted kappa coefficient of 0.81, demonstrating intraobserver reproducibility. This classification for Kienböck’s disease therefore has good reliability and reproducibility.
