Interrater Reliability
Method        Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects      274
Raters        2
Agreement %   99.6
Kappa         0.975
z             16.1
p-value       < .001
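For reproducibility, a minimal sketch of how a statistic like this can be recomputed; it is not the original analysis script. The rating vectors drugo1 and drugo2 are assumed to be the 0/1 codes summarised in the contingency table below, and scikit-learn's cohen_kappa_score is used here in place of whatever package produced the output above.

```python
# Sketch: recompute unweighted Cohen's kappa for two raters.
# The two rating vectors are reconstructed from the cross-tabulation
# below: (drugo1, drugo2) = (0, 0) x 252, (1, 0) x 1, (1, 1) x 21.
import numpy as np
from sklearn.metrics import cohen_kappa_score

drugo1 = np.array([0] * 252 + [1] * 1 + [1] * 21)
drugo2 = np.array([0] * 252 + [0] * 1 + [1] * 21)

kappa = cohen_kappa_score(drugo1, drugo2, weights=None)  # unweighted kappa
agreement = np.mean(drugo1 == drugo2)                    # raw agreement

print(f"Subjects    = {len(drugo1)}")          # 274
print(f"Agreement % = {agreement * 100:.1f}")  # 99.6
print(f"Kappa       = {kappa:.3f}")            # 0.975
```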
Frequencies

drugo1   drugo2      n
     0        0    252
     1        0      1
     1        1     21

Table

              drugo2 = 0   drugo2 = 1
drugo1 = 0           252            0
drugo1 = 1             1           21
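As a check on the reported values, the table implies the observed proportion of agreement p_o and the chance-expected proportion p_e (from the row and column marginals), which reproduce the reported kappa (rounded):

\[
p_o = \frac{252 + 21}{274} \approx 0.996, \qquad
p_e = \frac{252 \cdot 253 + 22 \cdot 21}{274^2} \approx 0.855, \qquad
\kappa = \frac{p_o - p_e}{1 - p_e} \approx \frac{0.996 - 0.855}{1 - 0.855} \approx 0.975
\]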