Interrater Reliability

Method      Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects    51
Raters      2
Agreement   51%
Kappa       0.318
z           4.27
p-value     < .001


Rating pairs and their frequencies:

   Expert3  Expert2    n
1     N        N      12
2     N        P      11
3     N        S      12
4     P        P       7
5     S        P       2
6     S        S       7


Contingency table (Expert3 in rows, Expert2 in columns):

            Expert2
Expert3      N   P   S
      N     12  11  12
      P      0   7   0
      S      0   2   7
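As a check on the reported statistics, the unweighted kappa can be recomputed directly from the contingency table above. The sketch below uses only the counts shown (the software that produced the original output is not stated here, so this is a plain-Python reconstruction; the z statistic and p-value are not recomputed).

```python
# Recompute observed agreement and unweighted Cohen's kappa from the
# Expert3 x Expert2 counts in the table above.
counts = {
    ("N", "N"): 12, ("N", "P"): 11, ("N", "S"): 12,
    ("P", "P"): 7,
    ("S", "P"): 2,  ("S", "S"): 7,
}

cats = ("N", "P", "S")
n = sum(counts.values())  # 51 subjects

# Observed agreement: proportion of subjects on the diagonal.
po = sum(counts.get((c, c), 0) for c in cats) / n

# Expected chance agreement from the row and column marginals.
row = {c: sum(v for (r, _), v in counts.items() if r == c) for c in cats}
col = {c: sum(v for (_, k), v in counts.items() if k == c) for c in cats}
pe = sum(row[c] * col[c] for c in cats) / n ** 2

kappa = (po - pe) / (1 - pe)
print(round(100 * po), round(kappa, 3))  # 51 0.318
```

This reproduces the 51% agreement and kappa = 0.318 reported above; by the usual Landis and Koch benchmarks, a kappa of 0.318 indicates fair agreement beyond chance.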