Interrater Reliability

Method        Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects      274
Raters        2
Agreement %   97
Kappa         0.905
z             15.0
p-value       < .001


  kontrast1 kontrast2   n
1         0         0 218
2         0         1   3
3         1         0   5
4         1         1  48


Cross-tabulation of kontrast1 (rows) by kontrast2 (columns)

         kontrast2
kontrast1   0   1
        0 218   3
        1   5  48
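The summary statistics above can be reproduced directly from the 2x2 table. The analysis itself appears to come from R (the output format matches `irr::kappa2`); the following is a minimal stdlib-only Python re-implementation of unweighted Cohen's kappa for illustration:

```python
# 2x2 cross-tabulation: rows = kontrast1, columns = kontrast2
table = [[218, 3],
         [5,  48]]

n = sum(sum(row) for row in table)            # 274 subjects

# Observed agreement: proportion of subjects on the diagonal
p_o = sum(table[i][i] for i in range(2)) / n

# Chance agreement: product of marginal proportions, summed over categories
row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
p_e = sum(r * c for r, c in zip(row_totals, col_totals)) / n**2

# Cohen's kappa: agreement beyond chance, scaled to its maximum
kappa = (p_o - p_e) / (1 - p_e)

print(round(100 * p_o))   # 97   (Agreement %)
print(round(kappa, 3))    # 0.905 (Kappa)
```

A kappa of 0.905 indicates almost perfect agreement on the common Landis-Koch scale; note that raw agreement (97%) alone would overstate reliability, since both raters assign category 0 far more often than category 1, which inflates chance agreement.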