Interrater Reliability
Method       Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects     274
Raters       2
Agreement %  99
Kappa        0.329
z            7.35
p-value      < .001


pixels1 pixels2   n
      0       0 269
      1       0   4
      1       1   1


Contingency table (pixels1 rows, pixels2 columns)

       pixels2
pixels1   0   1
      0 269   0
      1   4   1
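As a sanity check, the unweighted kappa can be recomputed by hand from the contingency table above. The sketch below is illustrative Python (variable names are our own); it uses the standard Cohen's kappa formula, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected from the marginal totals. Note that the raw agreement is 270/274 ≈ 98.5%, which the summary rounds to 99%.

```python
# Contingency table from the output above: pixels1 rows, pixels2 columns.
counts = [[269, 0],
          [4,   1]]

n = sum(sum(row) for row in counts)                 # 274 subjects
po = (counts[0][0] + counts[1][1]) / n              # observed agreement (diagonal)

row_marg = [sum(row) for row in counts]             # pixels1 margins: [269, 5]
col_marg = [counts[0][j] + counts[1][j]
            for j in range(2)]                      # pixels2 margins: [273, 1]
pe = sum(row_marg[k] * col_marg[k]
         for k in range(2)) / n**2                  # chance agreement

kappa = (po - pe) / (1 - pe)
print(round(po * 100, 1))                           # 98.5 (percent agreement)
print(round(kappa, 3))                              # 0.329
```

The large gap between 98.5% raw agreement and kappa = 0.329 reflects the highly skewed marginals here (almost all ratings are 0), which drives the chance-expected agreement close to 1 and deflates kappa.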