Interrater Reliability

Method       Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects     51
Raters       2
Agreement %  59
Kappa        0.393
z            4.37
p-value      < .001

Expert1  Expert2   n
N        N        12
N        S         8
P        P         7
S        P        13
S        S        11
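
As a cross-check, the pair frequencies above can be expanded into per-subject rating vectors and passed to a standard library routine. A minimal sketch, assuming scikit-learn is available (the original report does not say which software produced the statistics):

    from sklearn.metrics import cohen_kappa_score

    # Pair frequencies from the table above: (Expert1, Expert2, count).
    pairs = [("N", "N", 12), ("N", "S", 8), ("P", "P", 7),
             ("S", "P", 13), ("S", "S", 11)]

    # Expand the counts into one rating per subject for each expert (51 total).
    expert1 = [r1 for r1, _, count in pairs for _ in range(count)]
    expert2 = [r2 for _, r2, count in pairs for _ in range(count)]

    # Unweighted Cohen's kappa; reproduces the reported value.
    print(round(cohen_kappa_score(expert1, expert2), 3))  # 0.393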

Contingency table of the two experts' ratings:

       Expert2
Expert1  N  P  S
      N 12  0  8
      P  0  7  0
      S  0 13 11
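
The summary statistics follow directly from the contingency table via kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance from the row and column marginals. A minimal sketch, assuming the reported z uses the large-sample standard error of kappa under the null hypothesis of chance agreement (Fleiss, Cohen & Everitt, 1969); the numbers below match the output above:

    import numpy as np
    from scipy.stats import norm

    # Contingency table (rows: Expert1, columns: Expert2), categories N, P, S.
    table = np.array([[12.0, 0.0, 8.0],
                      [0.0, 7.0, 0.0],
                      [0.0, 13.0, 11.0]])

    n = table.sum()              # 51 subjects
    p_o = np.trace(table) / n    # observed agreement: 30/51, i.e. 59%
    r = table.sum(axis=1) / n    # Expert1 marginal proportions
    c = table.sum(axis=0) / n    # Expert2 marginal proportions
    p_e = (r * c).sum()          # chance agreement, ~0.321

    kappa = (p_o - p_e) / (1 - p_e)  # ~0.393

    # Standard error of kappa under H0 (assumed estimator: Fleiss et al., 1969).
    se0 = np.sqrt(p_e + p_e**2 - (r * c * (r + c)).sum()) / ((1 - p_e) * np.sqrt(n))
    z = kappa / se0                  # ~4.37
    p = 2 * norm.sf(z)               # two-sided p-value, ~1e-5, i.e. < .001

    print(f"Agreement %: {100 * p_o:.0f}")    # 59
    print(f"Kappa: {kappa:.3f}  z: {z:.2f}")  # 0.393  4.37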