
Cohen's Kappa Calculator

Enter a 2x2 agreement matrix for two raters to calculate Cohen's kappa, which measures agreement beyond what would be expected by chance. The calculator also reports the standard error and the Landis & Koch strength-of-agreement category.

What is Cohen's Kappa?

Cohen's kappa (κ) quantifies the level of agreement between two raters who each classify items into two categories. Unlike simple percent agreement, kappa accounts for the possibility that raters agree by chance.

Formula:

κ = (Po - Pe) / (1 - Pe)

where, for a 2x2 table with cells a (both raters choose category 1), b (rater 1 chooses category 1, rater 2 chooses category 2), c (the reverse), and d (both choose category 2), with N = a + b + c + d:

  • Po = observed proportion of agreement = (a + d) / N
  • Pe = expected proportion of agreement by chance = [(a + b)(a + c) + (c + d)(b + d)] / N²
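To make the computation concrete, here is a minimal Python sketch; the function name cohens_kappa and the cell-ordering convention are assumptions for illustration, not the calculator's actual code.

    from math import sqrt

    def cohens_kappa(a: int, b: int, c: int, d: int) -> tuple[float, float]:
        """Return (kappa, standard error) for a 2x2 agreement table.

        a = both raters choose category 1
        b = rater 1 chooses category 1, rater 2 chooses category 2
        c = rater 1 chooses category 2, rater 2 chooses category 1
        d = both raters choose category 2
        """
        n = a + b + c + d
        po = (a + d) / n  # observed agreement
        pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
        kappa = (po - pe) / (1 - pe)
        # Large-sample approximate standard error
        se = sqrt(po * (1 - po) / (n * (1 - pe) ** 2))
        return kappa, se

    # Example: the raters agree on 45 + 30 = 75 of 100 items
    kappa, se = cohens_kappa(a=45, b=15, c=10, d=30)
    print(f"kappa = {kappa:.3f}, SE = {se:.3f}")

For these counts Po = 0.75 and Pe = 0.51, giving κ ≈ 0.49, which falls in the "moderate" band of the scale below.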

Interpreting Kappa (Landis & Koch, 1977):

Kappa         Strength of Agreement
< 0.00        Less than chance
0.00 - 0.20   Slight
0.21 - 0.40   Fair
0.41 - 0.60   Moderate
0.61 - 0.80   Substantial
0.81 - 1.00   Almost perfect
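If you want to apply this scale in code, a small threshold lookup like the one below works; the boundary handling shown (upper bounds inclusive) is one common reading of the ranges in the table.

    def landis_koch(kappa: float) -> str:
        """Map a kappa value to its Landis & Koch (1977) label."""
        if kappa < 0.00:
            return "Less than chance"
        if kappa <= 0.20:
            return "Slight"
        if kappa <= 0.40:
            return "Fair"
        if kappa <= 0.60:
            return "Moderate"
        if kappa <= 0.80:
            return "Substantial"
        return "Almost perfect"

    print(landis_koch(0.49))  # Moderate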

When to use Cohen's Kappa:

  • Two raters classifying the same subjects into two categories
  • Assessing diagnostic agreement between clinicians
  • Evaluating coding reliability in content analysis
  • Quality control where two inspectors rate pass/fail

For more than two raters, use Fleiss' kappa. For ordinal categories, consider weighted kappa.
