Interrater Reliability
Method:       Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects:     274
Raters:       2
Agreement %:  99
Kappa:        0.329
z:            7.35
p-value:      < .001
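For reference, unweighted Cohen's kappa compares the observed agreement P_o with the agreement P_e expected by chance from the two raters' marginal distributions; the reported z is consistent with the usual large-sample test of H0: kappa = 0 using the null standard error (an assumption about the software used, but one that reproduces the figures above from the counts below):

\kappa = \frac{P_o - P_e}{1 - P_e}, \qquad
z = \frac{\kappa}{\mathrm{SE}_0}, \qquad
\mathrm{SE}_0 = \sqrt{\frac{P_e + P_e^{2} - \sum_i p_{i\cdot}\, p_{\cdot i}\,(p_{i\cdot} + p_{\cdot i})}{N\,(1 - P_e)^{2}}}

where p_{i\cdot} and p_{\cdot i} are the marginal proportions of category i for the two raters and N is the number of subjects.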
pixels1   pixels2     n
      0         0   269
      1         0     4
      1         1     1
Table: pixels1 (rows) by pixels2 (columns)

          pixels2
pixels1     0     1
      0   269     0
      1     4     1
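The reported statistics can be recomputed directly from this 2 x 2 table. The sketch below is a Python illustration (the variable names are ours, not from the original analysis, whose output layout resembles R's irr::kappa2); it derives the observed agreement, chance agreement, unweighted kappa, and the z statistic defined above:

import numpy as np

# 2 x 2 agreement table: rows = pixels1 (rater 1), columns = pixels2 (rater 2)
counts = np.array([[269.0, 0.0],
                   [  4.0, 1.0]])

n = counts.sum()                    # number of subjects (274)
props = counts / n                  # cell proportions
row_marg = props.sum(axis=1)        # marginal proportions for rater 1
col_marg = props.sum(axis=0)        # marginal proportions for rater 2

p_obs = np.trace(props)             # observed agreement (diagonal cells)
p_exp = float(row_marg @ col_marg)  # agreement expected by chance

kappa = (p_obs - p_exp) / (1.0 - p_exp)

# Null standard error and large-sample z test of H0: kappa = 0
se0 = np.sqrt(
    (p_exp + p_exp**2 - np.sum(row_marg * col_marg * (row_marg + col_marg)))
    / (n * (1.0 - p_exp) ** 2)
)
z = kappa / se0

print(f"Agreement % = {100 * p_obs:.1f}")  # ~98.5 (reported rounded as 99)
print(f"Kappa       = {kappa:.3f}")        # ~0.329
print(f"z           = {z:.2f}")            # ~7.35

This also makes the contrast between the high percent agreement and the modest kappa explicit: almost all subjects fall in the (0, 0) cell, so the chance-expected agreement is already about 0.978.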