Interrater Reliability

Method        Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects      274
Raters        2
Agreement %   99.6
Kappa         0.975
z             16.1
p-value       < .001


Pair counts (drugo1 vs drugo2):

  drugo1  drugo2    n
       0       0  252
       1       0    1
       1       1   21


Contingency table (drugo1 x drugo2):

         drugo2
drugo1     0    1
     0   252    0
     1     1   21
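The reported statistics can be reproduced directly from the contingency table. A minimal sketch in Python (assuming only the counts above; the cell at row 1, column 0 is the single disagreement out of 274 subjects):

```python
# Cohen's kappa from the 2x2 contingency table of drugo1 vs drugo2.
table = [[252, 0],
         [1, 21]]

n = sum(sum(row) for row in table)                  # 274 subjects
po = sum(table[i][i] for i in range(2)) / n         # observed agreement
row = [sum(r) for r in table]                       # marginals of drugo1
col = [sum(c) for c in zip(*table)]                 # marginals of drugo2
pe = sum(row[i] * col[i] for i in range(2)) / n**2  # chance agreement
kappa = (po - pe) / (1 - pe)

print(round(po * 100, 1))   # 99.6 (% agreement)
print(round(kappa, 3))      # 0.975
```

With 273 of 274 pairs matching, observed agreement is 99.6%; the high chance agreement (about 0.855, driven by the dominant 0/0 cell) is what pulls kappa slightly below 1.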