Interrater Reliability

Statistic | Value
---|---
Method | Cohen's Kappa for 2 Raters (weights: unweighted)
Subjects | 51
Raters | 2
Agreement (%) | 51
Kappa | 0.318
z | 4.27
p-value | < .001
Expert3 | Expert2 | n
---|---|---
N | N | 12
N | P | 11
N | S | 12
P | P | 7
S | P | 2
S | S | 7
Expert3 \ Expert2 | N | P | S
---|---|---|---
N | 12 | 11 | 12
P | 0 | 7 | 0
S | 0 | 2 | 7
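The summary statistics above can be recomputed directly from the contingency table (rows = Expert3, columns = Expert2). The sketch below is illustrative and not taken from the original analysis; variable names are hypothetical. It derives the observed agreement p_o from the diagonal, the chance agreement p_e from the marginal totals, and Cohen's kappa as (p_o − p_e) / (1 − p_e).

```python
# Minimal sketch: unweighted Cohen's kappa from the Expert3 x Expert2 table.
# All names here are illustrative, not from the original report.
from fractions import Fraction

# counts[row][col]: rows = Expert3 (N, P, S), cols = Expert2 (N, P, S)
counts = [
    [12, 11, 12],
    [0,  7,  0],
    [0,  2,  7],
]

n = sum(sum(row) for row in counts)                     # 51 subjects
p_o = Fraction(sum(counts[i][i] for i in range(3)), n)  # observed agreement
row_tot = [sum(row) for row in counts]                  # Expert3 marginals
col_tot = [sum(col) for col in zip(*counts)]            # Expert2 marginals
p_e = sum(Fraction(row_tot[i] * col_tot[i], n * n) for i in range(3))
kappa = (p_o - p_e) / (1 - p_e)

print(n)                        # 51
print(round(float(p_o) * 100))  # 51  (agreement %)
print(round(float(kappa), 3))   # 0.318
```

With 26 of 51 paired ratings on the diagonal, p_o ≈ 0.51 and p_e ≈ 0.28, reproducing the reported kappa of 0.318.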