**Interrater Reliability**

| Method | Cohen's Kappa for 2 Raters (unweighted) |
|---|---|
| Subjects | 274 |
| Raters | 2 |
| Agreement % | 95 |
| Kappa | 0.757 |
| z | 12.5 |
| p-value | < .001 |
| | linije1 | linije2 | n |
|---|---|---|---|
| 1 | 0 | 0 | 231 |
| 2 | 0 | 1 | 8 |
| 3 | 1 | 0 | 7 |
| 4 | 1 | 1 | 28 |
Cross-tabulation (rows: linije1; columns: linije2):

| linije1 \ linije2 | 0 | 1 |
|---|---|---|
| 0 | 231 | 8 |
| 1 | 7 | 28 |
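The reported kappa can be re-derived from the 2×2 cross-tabulation of linije1 against linije2. The sketch below is a minimal, assumed re-computation (the function name `cohens_kappa` is ours, not from the original analysis); it verifies the arithmetic behind the unweighted kappa of 0.757.

```python
def cohens_kappa(table):
    """Unweighted Cohen's kappa for a square contingency table of counts."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of subjects on the diagonal.
    p_o = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement under independence: product of row/column marginals.
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    p_e = sum(r * c for r, c in zip(row_tot, col_tot)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Counts from the cross-tabulation: rows = linije1 (0, 1), columns = linije2 (0, 1).
table = [[231, 8],
         [7, 28]]
print(round(cohens_kappa(table), 3))  # 0.757
```

Observed agreement is (231 + 28) / 274 ≈ 0.945, which matches the rounded "Agreement % = 95" in the summary table.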