Measuring inter-rater reliability for nominal data – which coefficients and confidence intervals are appropriate? | BMC Medical Research Methodology | Full Text
Cohen's kappa - Wikipedia
(PDF) Guidelines of the minimum sample size requirements for Cohen's Kappa
Cohen's Kappa: Learn It, Use It, Judge It | KNIME
Agree or Disagree? A Demonstration of An Alternative Statistic to Cohen's Kappa for Measuring the Extent and Reliability of Agreement
Interpretation of Kappa Values. The kappa statistic is frequently used… | by Yingting Sherry Chen | Towards Data Science
Measuring Inter-coder Agreement - ATLAS.ti
Symmetry | Free Full-Text | An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters
Krippendorff's Alpha - We ask and you answer! The best answer wins! - Benchmark Six Sigma Forum
The Equivalence of Weighted Kappa and the Intraclass Correlation Coefficient as Measures of Reliability - Joseph L. Fleiss, Jacob Cohen, 1973
Cohen's Kappa | Real Statistics Using Excel
Why Cohen's Kappa should be avoided as performance measure in classification | PLOS ONE
Inter-Coder Agreement in One-to-Many Classification: Fuzzy Kappa | PLOS ONE
ReCal3: Reliability for 3+ Coders – Deen Freelon, Ph.D.