Interrater Reliability

Method        Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects      274
Raters        2
Agreement %   95
Kappa         0.761
z             12.8
p-value       < .001
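
Output like this could be reproduced in R with the irr package; a minimal sketch follows, assuming the 274 paired ratings sit in columns ghost1 and ghost2 of a data frame named ratings (the data frame name is an assumption, the column names follow the counts below).

library(irr)

# Percent agreement between the two raters
agree(ratings[, c("ghost1", "ghost2")])

# Unweighted Cohen's kappa, with z statistic and p-value
kappa2(ratings[, c("ghost1", "ghost2")], weight = "unweighted")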


  ghost1 ghost2   n
1      0      0 237
2      0      1  11
3      1      0   2
4      1      1  24
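
A sketch of how these per-pattern counts might be tallied, using dplyr's count() on the same assumed ratings data frame:

library(dplyr)

ratings %>%
  count(ghost1, ghost2)   # one row per rating pattern, n = number of subjects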


Table: ghost1 (rows) by ghost2 (columns)

      ghost2
ghost1   0   1
     0 237  11
     1   2  24
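
The reported kappa can be checked by hand from this 2 x 2 table:

# Rebuild the table and recompute kappa from its margins
tab <- matrix(c(237, 2, 11, 24), nrow = 2,
              dimnames = list(ghost1 = 0:1, ghost2 = 0:1))
n  <- sum(tab)                                 # 274 subjects
po <- sum(diag(tab)) / n                       # observed agreement: 261/274 ~ 0.953
pe <- sum(rowSums(tab) * colSums(tab)) / n^2   # chance agreement ~ 0.802
(po - pe) / (1 - pe)                           # kappa ~ 0.761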