| Interrater Reliability | |
|---|---|
| Method | Cohen's Kappa for 2 Raters (Weights: unweighted) |
| Subjects | 51 |
| Raters | 2 |
| Agreement % | 71 |
| Kappa | 0.532 |
| z | 5.99 |
| p-value | < .001 |
| | Expert1 | Expert3 | n |
|---|---|---|---|
| 1 | N | N | 20 |
| 2 | P | P | 7 |
| 3 | S | N | 15 |
| 4 | S | S | 9 |
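The summary statistics above follow directly from these four rating patterns. As a sanity check, here is a minimal Python sketch (assuming scikit-learn is available; the Method row's wording suggests the original output came from R's irr package, so this is a reconstruction, not the original analysis) that expands the counts into rating vectors and recomputes the agreement and unweighted kappa:

```python
from sklearn.metrics import cohen_kappa_score

# Rating patterns (Expert1, Expert3) with their counts, taken from the table above.
pairs = [("N", "N", 20), ("P", "P", 7), ("S", "N", 15), ("S", "S", 9)]

# Expand the counts into two parallel rating vectors, one label per subject.
expert1 = [a for a, _, n in pairs for _ in range(n)]
expert3 = [b for _, b, n in pairs for _ in range(n)]

# Raw percent agreement and unweighted Cohen's kappa.
agreement = sum(a == b for a, b in zip(expert1, expert3)) / len(expert1)
kappa = cohen_kappa_score(expert1, expert3)  # weights=None, i.e. unweighted

print(f"Subjects: {len(expert1)}")      # 51
print(f"Agreement %: {agreement:.0%}")  # 71%
print(f"Kappa: {kappa:.3f}")            # 0.532
```

Running this reproduces the reported values: 51 subjects, 71% agreement, and kappa = 0.532.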
| Expert1 \ Expert3 | N | P | S |
|---|---|---|---|
| N | 20 | 0 | 0 |
| P | 0 | 7 | 0 |
| S | 15 | 0 | 9 |
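Equivalently, kappa can be computed by hand from this contingency table: the observed agreement $p_o$ is the diagonal mass, and the chance agreement $p_e$ comes from the Expert1 row marginals (20, 7, 24) and the Expert3 column marginals (35, 7, 9):

$$
p_o = \frac{20 + 7 + 9}{51} \approx 0.706, \qquad
p_e = \frac{20 \cdot 35 + 7 \cdot 7 + 24 \cdot 9}{51^2} = \frac{965}{2601} \approx 0.371, \qquad
\kappa = \frac{p_o - p_e}{1 - p_e} \approx 0.532.
$$

Note that all 15 disagreements fall in a single cell (Expert1 = S, Expert3 = N), which is why kappa is only moderate despite 71% raw agreement.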