Naive Bayes Classifier #1237

Have you Googled “Naive Bayes Kappa”?

https://www.hindawi.com/journals/mpe/2014/383671/

Kappa measures the agreement between two raters who each classify items into mutually exclusive categories. It is computed as

Kappa = (P_o − P_e) / (1 − P_e)

where P_o is the observed agreement among the raters and P_e is the expected agreement; that is, P_e represents the probability that the raters agree by chance. The values of Kappa are constrained to the interval [−1, 1]: Kappa = 1 means perfect agreement, Kappa = 0 means that agreement is equal to chance, and Kappa = −1 means "perfect" disagreement.
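
As a minimal sketch of that formula in Python (the 2×2 confusion matrix below is made up purely for illustration; P_o and P_e follow the definitions above):

```python
import numpy as np

# Hypothetical 2x2 confusion matrix (counts invented for illustration).
# Rows: rater A's labels, columns: rater B's labels (classes -1 and +1).
confusion = np.array([[90.0, 5.0],
                      [3.0, 2.0]])
total = confusion.sum()

# P_o: observed agreement, the fraction of items on which the raters agree.
p_o = np.trace(confusion) / total

# P_e: expected agreement by chance, from each rater's marginal label frequencies.
p_e = np.sum(confusion.sum(axis=1) * confusion.sum(axis=0)) / total ** 2

kappa = (p_o - p_e) / (1.0 - p_e)
print(f"P_o = {p_o:.3f}, P_e = {p_e:.3f}, Kappa = {kappa:.3f}")
```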

For data with such a skewed distribution, if you simply answer "-1" for everything, the accuracy will be high. Even if the accuracy is hig…
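
To make that concrete, here is a small sketch (assuming scikit-learn is available; the 95/5 class ratio is invented for illustration): a classifier that always answers -1 reaches roughly 95% accuracy, yet its Kappa is 0, i.e. no better than chance.

```python
import numpy as np
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
# Heavily skewed labels: about 95% of the samples are -1 (ratio invented for illustration).
y_true = rng.choice([-1, 1], size=1000, p=[0.95, 0.05])
# A "classifier" that always answers -1.
y_pred = np.full_like(y_true, -1)

print("accuracy:", accuracy_score(y_true, y_pred))     # high, around 0.95
print("kappa:   ", cohen_kappa_score(y_true, y_pred))  # 0.0 -- agreement is only by chance
```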

Answer selected by ko-ichi-h
This discussion was converted from issue #1236 on March 12, 2024 13:10.