
Include Cohen's kappa in confusion matrices #26

@cimentadaj

Description


When talking about confusion matrices you should add Cohen's kappa. You can refer to this paper: https://www.nature.com/articles/s41467-020-20206-z.pdf, which uses it as a measure to check that two coders classified search queries the same way. The difference between Cohen's kappa and simple accuracy is that Cohen's kappa is a quantitative measure of reliability for two raters rating the same thing, corrected for how often the raters may agree by chance. See how they use it here: https://github.com/stefanjwojcik/ms_flu/blob/master/Survey%20and%20MRP/Main_Replication_Code.R#L98 and an explanation here: https://towardsdatascience.com/cohens-kappa-9786ceceab58
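
A minimal sketch of the chance correction, in R to match the replication code linked above. The two coder vectors and their labels are made up for illustration; the point is just that kappa rescales raw agreement by the agreement you would expect from the marginal label frequencies alone:

```r
# Hypothetical labels from two coders rating the same search queries
coder_a <- c("flu", "flu", "other", "flu", "other", "other")
coder_b <- c("flu", "other", "other", "flu", "other", "flu")

tab <- table(coder_a, coder_b)   # 2x2 agreement (confusion) matrix between coders
n   <- sum(tab)

p_observed <- sum(diag(tab)) / n                       # raw agreement, i.e. accuracy
p_expected <- sum(rowSums(tab) * colSums(tab)) / n^2   # agreement expected by chance

kappa <- (p_observed - p_expected) / (1 - p_expected)  # Cohen's kappa
kappa
```

If the confusion-matrix section already uses yardstick or caret, both report kappa next to accuracy (yardstick::kap(), caret::confusionMatrix()), so the from-scratch version above would only be needed to show where the chance correction comes from.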
