Interrater Reliability

Method       Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects     51
Raters       2
Agreement    71%
Kappa (κ)    0.532
z            5.99
p-value      < .001
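
For reference, Cohen's kappa corrects the raw agreement for agreement expected by chance (this is the standard formula, not part of the software output):

    \kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance from the two raters' marginal frequencies.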


Rating pairs (Expert1, Expert3) and their frequencies:

Expert1   Expert3    n
N         N         20
P         P          7
S         N         15
S         S          9
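
As a quick arithmetic check (mine, not part of the original output), the three matching pair types give

    p_o = \frac{20 + 7 + 9}{51} = \frac{36}{51} \approx 0.706,

which rounds to the reported 71% agreement; the 15 (S, N) pairs are the only disagreements.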


Contingency table (rows: Expert1, columns: Expert3)

          Expert3
Expert1    N   P   S
       N  20   0   0
       P   0   7   0
       S  15   0   9
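
The following sketch (a minimal reimplementation assuming NumPy, not the software that produced this output) reproduces the agreement and kappa values from the contingency table:

    import numpy as np

    # Contingency table above: rows = Expert1, columns = Expert3,
    # category order N, P, S.
    m = np.array([[20, 0, 0],
                  [ 0, 7, 0],
                  [15, 0, 9]], dtype=float)

    n = m.sum()              # 51 subjects
    p_o = np.trace(m) / n    # observed agreement: 36/51 ≈ 0.706
    # chance agreement: sum over categories of (row share × column share)
    p_e = (m.sum(axis=1) / n) @ (m.sum(axis=0) / n)  # ≈ 0.371
    kappa = (p_o - p_e) / (1 - p_e)                  # ≈ 0.532

    print(f"Agreement: {p_o:.0%}, kappa: {kappa:.3f}")
    # Agreement: 71%, kappa: 0.532

Given raw label vectors instead of a table, sklearn.metrics.cohen_kappa_score(expert1, expert3) computes the same unweighted statistic.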