Hands-on Tutorial

Cohen’s Kappa and Fleiss’ Kappa— How to Measure the Agreement Between Raters

A deep dive into Cohen’s Kappa and Fleiss’ Kappa and how they measure the agreement between raters

Audhi Aprilliant
5 min read · Jul 10, 2021


After reading this short tutorial, you will understand how Cohen’s Kappa and Fleiss’ Kappa are calculated. You will also be able to interpret the results and assign a level of agreement between raters.

Cohen’s Kappa

Cohen’s Kappa is a metric used to measure the agreement between two raters. For instance, two raters are asked to assign one of three labels (A, B, or C) to 10 participants based on each participant’s skills. Using Cohen’s Kappa, we can measure their level of agreement. In practice, Cohen’s Kappa is often used:

  • To measure the level of agreement between two raters when classifying objects into a given set of groups or labels
  • To measure the agreement of a new rating method with an existing one

Cohen’s Kappa is only used for two raters
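
To make the two-rater scenario above concrete, here is a minimal sketch using scikit-learn’s cohen_kappa_score. The ratings are hypothetical example data, not taken from the article.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical labels given by two raters to 10 participants (example data)
rater_1 = ["A", "A", "B", "C", "A", "B", "B", "C", "A", "C"]
rater_2 = ["A", "A", "B", "C", "A", "B", "C", "C", "A", "B"]

# Cohen's Kappa corrects the raw agreement for agreement expected by chance
kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's Kappa: {kappa:.3f}")  # ≈ 0.70 for this example data
```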

To compute Cohen’s Kappa, the following formula is used:

κ = (p_o − p_e) / (1 − p_e)

where p_o is the observed proportion of agreement between the two raters and p_e is the proportion of agreement expected by chance. In other words, Cohen’s Kappa is the observed agreement corrected for the agreement that would occur by chance alone.
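
To see how p_o and p_e arise from the data, the sketch below computes them directly from the contingency matrix of the two raters’ labels. The function name and the ratings are my own illustration, assuming NumPy is available.

```python
import numpy as np

def cohens_kappa(rater_1, rater_2):
    """Compute Cohen's Kappa from two lists of labels."""
    labels = sorted(set(rater_1) | set(rater_2))
    index = {label: i for i, label in enumerate(labels)}

    # Build the k x k contingency matrix of label pairs
    matrix = np.zeros((len(labels), len(labels)))
    for a, b in zip(rater_1, rater_2):
        matrix[index[a], index[b]] += 1

    n = matrix.sum()
    p_o = np.trace(matrix) / n  # observed agreement (diagonal of the matrix)
    # chance agreement: product of the two raters' marginal proportions, summed over labels
    p_e = (matrix.sum(axis=1) * matrix.sum(axis=0)).sum() / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings for 10 participants
rater_1 = ["A", "A", "B", "C", "A", "B", "B", "C", "A", "C"]
rater_2 = ["A", "A", "B", "C", "A", "B", "C", "C", "A", "B"]
print(cohens_kappa(rater_1, rater_2))
```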

Cohen’s Kappa scores are commonly interpreted using the following benchmarks:

  • ≤ 0.00 : No agreement
  • 0.01–0.20 : Slight agreement
  • 0.21–0.40 : Fair agreement
  • 0.41–0.60 : Moderate agreement
  • 0.61–0.80 : Substantial agreement
  • 0.81–1.00 : Almost perfect agreement
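
If you want to assign these qualitative levels programmatically, a small helper like the one below is enough. The function name is my own; the thresholds follow the list above.

```python
def agreement_level(kappa):
    """Map a kappa score to a qualitative agreement level."""
    if kappa <= 0:
        return "No agreement"
    if kappa <= 0.20:
        return "Slight agreement"
    if kappa <= 0.40:
        return "Fair agreement"
    if kappa <= 0.60:
        return "Moderate agreement"
    if kappa <= 0.80:
        return "Substantial agreement"
    return "Almost perfect agreement"

print(agreement_level(0.65))  # Substantial agreement
```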
