| Interrater Reliability | |
| --- | --- |
| Method | Cohen's Kappa for 2 Raters (Weights: unweighted) |
| Subjects | 274 |
| Raters | 2 |
| Agreement (%) | 95 |
| Kappa | 0.761 |
| z | 12.8 |
| p-value | < .001 |
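For readers less familiar with the statistic: kappa adjusts the raw percent agreement for the agreement expected by chance from each rater's marginal rates,

$$\kappa = \frac{p_o - p_e}{1 - p_e},$$

where $p_o$ is the observed proportion of agreement and $p_e$ the chance-expected proportion. A value of 0.761 falls in the range conventionally read as substantial agreement (Landis & Koch, 1977).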
| ghost1 | ghost2 | n |
| --- | --- | --- |
| 0 | 0 | 237 |
| 0 | 1 | 11 |
| 1 | 0 | 2 |
| 1 | 1 | 24 |
The same counts as a 2 × 2 cross-tabulation (rows: ghost1, columns: ghost2):

| ghost1 \ ghost2 | 0 | 1 |
| --- | --- | --- |
| 0 | 237 | 11 |
| 1 | 2 | 24 |
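The reported statistics can be recomputed directly from these counts. The following is a minimal Python sketch, not the original analysis script (the summary's output format suggests R's `irr::kappa2`); the z statistic is reproduced using the null-hypothesis standard error of Fleiss, Cohen & Everitt (1969), which matches the reported 12.8.

```python
# Minimal sketch: recompute the reported kappa statistics from the
# 2x2 contingency counts above. Plain Python; illustrative only.

n00, n01 = 237, 11   # ghost1 = 0: ghost2 = 0 / ghost2 = 1
n10, n11 = 2, 24     # ghost1 = 1: ghost2 = 0 / ghost2 = 1
n = n00 + n01 + n10 + n11                 # 274 subjects

# Observed agreement: proportion of subjects coded identically
p_o = (n00 + n11) / n                     # (237 + 24) / 274 ~ 0.953

# Chance agreement from each rater's marginal rates
row0, row1 = (n00 + n01) / n, (n10 + n11) / n   # ghost1 codes 0 / 1
col0, col1 = (n00 + n10) / n, (n01 + n11) / n   # ghost2 codes 0 / 1
p_e = row0 * col0 + row1 * col1           # ~ 0.802

kappa = (p_o - p_e) / (1 - p_e)           # ~ 0.761

# z test of kappa = 0 with the Fleiss, Cohen & Everitt (1969)
# null standard error
term = row0 * col0 * (row0 + col0) + row1 * col1 * (row1 + col1)
se0 = ((p_e + p_e**2 - term) / (n * (1 - p_e) ** 2)) ** 0.5

print(f"Agreement % : {100 * p_o:.0f}")    # 95
print(f"Kappa       : {kappa:.3f}")        # 0.761
print(f"z           : {kappa / se0:.1f}")  # 12.8
```

Under the normal approximation, z ≈ 12.8 corresponds to a p-value far below .001, consistent with the summary table.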