| Interrater Reliability | |
|---|---|
| Method | Cohen's Kappa for 2 Raters (Weights: unweighted) |
| Subjects | 51 |
| Raters | 2 |
| Agreement % | 59 |
| Kappa | 0.393 |
| z | 4.37 |
| p-value | < .001 |
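For reference, unweighted Cohen's κ compares the observed proportion of agreement $p_o$ with the agreement $p_e$ expected by chance from the two raters' marginal frequencies. As a worked check using the standard formula, the counts in the contingency table below reproduce the reported values:

$$
p_o = \frac{12 + 7 + 11}{51} \approx 0.588, \qquad
p_e = \frac{20 \cdot 12 + 7 \cdot 20 + 24 \cdot 19}{51^2} = \frac{836}{2601} \approx 0.321,
$$

$$
\kappa = \frac{p_o - p_e}{1 - p_e} \approx \frac{0.588 - 0.321}{1 - 0.321} \approx 0.393.
$$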
| Expert1 | Expert2 | n |
|---|---|---|
| N | N | 12 |
| N | S | 8 |
| P | P | 7 |
| S | P | 13 |
| S | S | 11 |
| Expert1 \ Expert2 | N | P | S |
|---|---|---|---|
| N | 12 | 0 | 8 |
| P | 0 | 7 | 0 |
| S | 0 | 13 | 11 |
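The same check can be run programmatically. Below is a minimal Python sketch (assuming NumPy is available; the variable names are illustrative, not the original analysis code) that recomputes the unweighted κ from this contingency table:

```python
import numpy as np

# Contingency table of the two experts' ratings, categories in the
# order N, P, S (rows: Expert1, columns: Expert2), as tabulated above.
m = np.array([[12,  0,  8],
              [ 0,  7,  0],
              [ 0, 13, 11]])

n = m.sum()                     # 51 subjects in total
p_o = np.trace(m) / n           # observed agreement: 30/51 ≈ 0.588
# Chance agreement from the marginal frequencies of each rater.
p_e = (m.sum(axis=1) * m.sum(axis=0)).sum() / n**2   # 836/2601 ≈ 0.321
kappa = (p_o - p_e) / (1 - p_e)

print(f"agreement = {p_o:.0%}, kappa = {kappa:.3f}")  # agreement = 59%, kappa = 0.393
```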