Adjusted probability naive Bayesian induction

Geoffrey I. Webb, Michael J. Pazzani

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

52 Citations (Scopus)

Abstract

Naive Bayesian classifiers utilise a simple mathematical model for induction. While it is known that the assumptions on which this model is based are frequently violated, the predictive accuracy obtained in discriminate classification tasks is surprisingly competitive in comparison to more complex induction techniques. Adjusted probability naive Bayesian induction adds a simple extension to the naive Bayesian classifier. A numeric weight is inferred for each class. During discriminate classification, the naive Bayesian probability of a class is multiplied by its weight to obtain an adjusted value. The use of this adjusted value in place of the naive Bayesian probability is shown to significantly improve predictive accuracy.
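The mechanism described in the abstract can be illustrated with a minimal sketch: a standard categorical naive Bayes model whose class scores are each multiplied by a per-class weight before the maximising class is chosen. All function names below, the Laplace smoothing, and the toy data are illustrative assumptions, not the paper's implementation; the paper's own procedure for inferring the weights is not reproduced here.

```python
def train_naive_bayes(X, y, n_values):
    """Train a categorical naive Bayes model with Laplace smoothing.

    X: list of feature vectors (lists of ints in range(n_values[j]))
    y: list of class labels
    n_values: number of distinct values for each attribute
    """
    classes = sorted(set(y))
    n = len(y)
    priors = {c: y.count(c) / n for c in classes}
    # cond[c][j][v] = estimated P(attribute j takes value v | class c)
    cond = {}
    for c in classes:
        rows = [x for x, label in zip(X, y) if label == c]
        cond[c] = []
        for j, k in enumerate(n_values):
            counts = [1] * k  # Laplace smoothing: start every count at 1
            for x in rows:
                counts[x[j]] += 1
            total = sum(counts)
            cond[c].append([cnt / total for cnt in counts])
    return classes, priors, cond


def nb_score(x, c, priors, cond):
    """Unnormalised naive Bayesian probability of class c for instance x."""
    p = priors[c]
    for j, v in enumerate(x):
        p *= cond[c][j][v]
    return p


def classify(x, classes, priors, cond, weights=None):
    """Predict a class for x.

    With weights=None this is plain naive Bayes. Otherwise each class
    score is multiplied by its numeric weight before the argmax, which
    is the adjustment the abstract describes.
    """
    if weights is None:
        weights = {c: 1.0 for c in classes}
    return max(classes, key=lambda c: weights[c] * nb_score(x, c, priors, cond))
```

A small usage example on hypothetical data: with equal weights the classifier returns the plain naive Bayes decision; inflating one class's weight can flip a borderline prediction toward that class.

```python
X = [[0, 0], [0, 1], [1, 0], [1, 1], [0, 0], [1, 1]]
y = [0, 0, 0, 1, 0, 1]
classes, priors, cond = train_naive_bayes(X, y, n_values=[2, 2])
classify([1, 1], classes, priors, cond)                          # plain NB
classify([1, 1], classes, priors, cond, weights={0: 5.0, 1: 1.0})  # adjusted
```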

Original language: English
Title of host publication: Advanced Topics in Artificial Intelligence - 11th Australian Joint Conference on Artificial Intelligence, AI 1998, Selected Papers
Editors: Grigoris Antoniou, John Slaney
Publisher: Springer
Pages: 285-295
Number of pages: 11
ISBN (Print): 3540651381, 9783540651383
Publication status: Published - 1 Jan 1998
Externally published: Yes
Event: Australasian Joint Conference on Artificial Intelligence 1998 - Brisbane, Australia
Duration: 13 Jul 1998 - 17 Jul 1998
Conference number: 11th
https://link.springer.com/book/10.1007/BFb0095035 (Proceedings)

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 1502
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Australasian Joint Conference on Artificial Intelligence 1998
Abbreviated title: AI 1998
Country/Territory: Australia
City: Brisbane
Period: 13/07/98 - 17/07/98