![Metrics to evaluate classification models with R codes: Confusion Matrix, Sensitivity, Specificity, Cohen's Kappa Value, Mcnemar's Test - DATA SCIENCE VIDHYA](http://datasciencevidhya.com/storage/post/Metrics%20to%20evaluate%20classification%20models%20with%20R%20codes%20Confusion%20Matrix,%20Sensitivity,%20Specificity,%20Cohen%E2%80%99s%20Kappa%20Value,%20Mcnemar's%20Test_1572255179.png)
Metrics to evaluate classification models with R codes: Confusion Matrix, Sensitivity, Specificity, Cohen's Kappa Value, Mcnemar's Test - DATA SCIENCE VIDHYA
![Multi-Class Metrics Made Simple, Part III: the Kappa Score (aka Cohen's Kappa Coefficient) | by Boaz Shmueli | Towards Data Science](https://miro.medium.com/max/1258/0*xoNLU_pV4uLzpAWp.png)
Multi-Class Metrics Made Simple, Part III: the Kappa Score (aka Cohen's Kappa Coefficient) | by Boaz Shmueli | Towards Data Science
Cohen's Kappa: What it is, when to use it, and how to avoid its pitfalls | by Rosaria Silipo | Towards Data Science
![Matthews Correlation Coefficient is The Best Classification Metric You've Never Heard Of | by Boaz Shmueli | Towards Data Science](https://miro.medium.com/max/1220/1*4JzF9DBtUNt5sacyVUp8Dg.png)
Matthews Correlation Coefficient is The Best Classification Metric You've Never Heard Of | by Boaz Shmueli | Towards Data Science
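The articles above cover Cohen's Kappa and the Matthews Correlation Coefficient (MCC); as a quick reference, both can be computed directly from the binary confusion matrix. A minimal pure-Python sketch (the example labels are made up for illustration):

```python
import math

def confusion_counts(y_true, y_pred):
    # Binary confusion matrix counts, treating class 1 as positive.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def cohens_kappa(y_true, y_pred):
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    n = tp + tn + fp + fn
    p_o = (tp + tn) / n  # observed agreement (accuracy)
    # Chance agreement: probability both raters pick the same class at random.
    p_e = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

def mcc(y_true, y_pred):
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
print(cohens_kappa(y_true, y_pred))  # 0.5
print(mcc(y_true, y_pred))           # 0.5
```

Both metrics range over [-1, 1] and correct for chance agreement, which is why (as the linked posts argue) they are more informative than raw accuracy on imbalanced data.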