Abstract
Lazy Bayesian Rules modifies naive Bayesian classification to undo elements of the harmful attribute independence assumption. It has been shown to provide classification error comparable to that of boosted decision trees. This paper explores alternatives to the candidate elimination criterion employed within Lazy Bayesian Rules. Improvements over naive Bayes are consistent so long as the candidate elimination criterion ensures there is sufficient data for accurate probability estimation. However, the original candidate elimination criterion is demonstrated to provide better overall error reduction than the use of a minimum data subset size criterion.
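The following is a minimal, illustrative sketch (not the paper's implementation) of the lazy rule-growing idea the abstract describes: for each test instance, attribute-value conditions taken from that instance are greedily added to a rule antecedent, and a local naive Bayes model is built from the training subset matching the antecedent. The candidate elimination step shown here uses the minimum data subset size criterion mentioned above rather than the original error-based criterion; names such as `MIN_SUBSET` and `lbr_predict`, and the threshold value, are assumptions made for this example.

```python
# Illustrative sketch of a Lazy-Bayesian-Rules-style classifier in which
# candidate conditions are eliminated by a minimum-subset-size criterion.
# All names and the threshold value are assumptions made for this example.
import math
from collections import Counter

MIN_SUBSET = 3  # assumed threshold; the paper evaluates criteria of this kind


def naive_bayes_predict(rows, labels, instance, exclude=frozenset()):
    """Naive Bayes with Laplace smoothing over the given training subset.
    Attributes in `exclude` (already fixed by the rule antecedent) are
    constant within the subset and therefore skipped."""
    class_counts = Counter(labels)
    n = len(labels)
    best_class, best_score = None, float("-inf")
    for c, c_count in class_counts.items():
        score = math.log((c_count + 1) / (n + len(class_counts)))
        for a, v in enumerate(instance):
            if a in exclude:
                continue
            match = sum(1 for r, l in zip(rows, labels) if l == c and r[a] == v)
            n_values = len({r[a] for r in rows})
            score += math.log((match + 1) / (c_count + n_values))
        if score > best_score:
            best_class, best_score = c, score
    return best_class


def lbr_predict(rows, labels, instance):
    """Lazily grow a rule antecedent for this one test instance, then
    classify it with a naive Bayes model local to the matching subset."""
    antecedent = set()  # attribute indices fixed to the instance's values
    sub_rows, sub_labels = list(rows), list(labels)
    grew = True
    while grew:
        grew = False
        for a in range(len(instance)):
            if a in antecedent:
                continue
            keep = [i for i, r in enumerate(sub_rows) if r[a] == instance[a]]
            # Candidate elimination: discard conditions whose matching
            # subset is too small for reliable probability estimates.
            if len(keep) < MIN_SUBSET:
                continue
            # (The original criterion would additionally require an
            # estimated error reduction before accepting the candidate.)
            antecedent.add(a)
            sub_rows = [sub_rows[i] for i in keep]
            sub_labels = [sub_labels[i] for i in keep]
            grew = True
            break
    return naive_bayes_predict(sub_rows, sub_labels, instance, exclude=antecedent)


if __name__ == "__main__":
    # Tiny categorical toy data: (outlook, windy) -> play
    train = [("sunny", "no"), ("sunny", "yes"), ("rain", "yes"),
             ("rain", "no"), ("overcast", "no"), ("overcast", "yes")]
    train_labels = ["yes", "no", "no", "yes", "yes", "yes"]
    print(lbr_predict(train, train_labels, ("rain", "no")))
```

In this sketch a condition is accepted as soon as it survives the size test; the paper's finding is that the original, error-based elimination criterion yields better overall error reduction than a size threshold alone.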
Original language | English |
---|---|
Title of host publication | AI 2001: Advances in Artificial Intelligence |
Subtitle of host publication | 14th Australian Joint Conference on Artificial Intelligence Adelaide, Australia, December 10-14, 2001 Proceedings |
Editors | Markus Stumptner, Dan Corbett, Mike Brooks |
Place of Publication | Berlin Germany |
Publisher | Springer |
Pages | 545-556 |
Number of pages | 12 |
ISBN (Print) | 3540429603 |
DOIs | |
Publication status | Published - 2001 |
Event | Australasian Joint Conference on Artificial Intelligence 2001 - Adelaide, Australia; Duration: 10 Dec 2001 → 14 Dec 2001; Conference number: 14th; https://link.springer.com/book/10.1007/3-540-45656-2 (Proceedings) |
Publication series
Name | Lecture Notes in Computer Science |
---|---|
Publisher | Springer |
Volume | 2256 |
ISSN (Print) | 0302-9743 |
Conference
Conference | Australasian Joint Conference on Artificial Intelligence 2001 |
---|---|
Abbreviated title | AI 2001 |
Country/Territory | Australia |
City | Adelaide |
Period | 10/12/01 → 14/12/01 |
Internet address | https://link.springer.com/book/10.1007/3-540-45656-2 (Proceedings) |