Cohen's kappa
Cohen's kappa is a statistic that measures agreement between two raters on categorical data while correcting for agreement expected by chance. It is defined as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance alone, computed from each rater's marginal category frequencies. A kappa of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.
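
The following is a minimal sketch of how the statistic can be computed from two lists of labels; the function name and the toy rater data are illustrative assumptions, not part of the original text.

    # A minimal sketch of Cohen's kappa for two raters' category labels.
    # The rater lists below are illustrative toy data.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Compute Cohen's kappa from two equal-length lists of labels."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)

        # Observed agreement p_o: fraction of items labeled identically.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

        # Expected chance agreement p_e: for each category, the product of
        # the two raters' marginal proportions, summed over all categories.
        counts_a = Counter(rater_a)
        counts_b = Counter(rater_b)
        p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
                  for c in set(rater_a) | set(rater_b))

        return (p_o - p_e) / (1 - p_e)

    # Toy example: two raters labeling ten items as "yes" or "no".
    rater_1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
    rater_2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
    print(cohens_kappa(rater_1, rater_2))  # p_o = 0.8, p_e = 0.52, kappa ≈ 0.58

For real data, an established implementation such as sklearn.metrics.cohen_kappa_score gives the same statistic without hand-rolled code.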
