Abstract
Naive Bayes (NB) is a popular classification technique in data mining and machine learning. Many methods have been proposed to improve the performance of NB by overcoming its primary weakness: the assumption that attributes are conditionally independent given the class. Examples include backwards sequential elimination and lazy elimination. Weighting techniques, including attribute weighting and instance weighting, have also been proposed to improve the accuracy of NB. In this paper, we propose a dual weighted model, namely DWNB, for NB classification. In DWNB, we first employ an instance-similarity-based method to weight each training instance. We then build an attribute weighted model on the weighted training data, where the probability estimates are computed from the embedded instance weights. The combination of instance and attribute weighting allows DWNB to relax the conditional independence assumption and classify more accurately. Experiments and comparisons on 36 benchmark data sets demonstrate that DWNB outperforms existing weighted NB algorithms.
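The abstract does not give the paper's exact weighting formulas, so the sketch below only illustrates the general dual-weighting scheme it describes, under stated assumptions: instance weights taken as average attribute-value agreement with same-class instances, attribute weights taken as mutual information with the class, instance-weighted frequency counts with Laplace smoothing, and attribute weights applied as exponents in the NB decision rule. All names and formula choices here are illustrative, not DWNB as published.

```python
# Minimal sketch of a dual-weighted naive Bayes classifier for nominal attributes.
# The weighting choices (Hamming-style instance similarity, mutual-information
# attribute weights) are assumptions for illustration, not the paper's formulas.
import math
from collections import defaultdict

def instance_weights(X, y):
    """Weight each instance by its average attribute agreement with same-class instances (assumption)."""
    w = []
    for i in range(len(X)):
        sims = [sum(a == b for a, b in zip(X[i], X[j])) / len(X[i])
                for j in range(len(X)) if j != i and y[j] == y[i]]
        w.append(sum(sims) / len(sims) if sims else 1.0)
    return w

def attribute_weights(X, y):
    """Weight each attribute by its mutual information with the class (assumption)."""
    n, d, weights = len(X), len(X[0]), []
    for j in range(d):
        joint, pa, pc = defaultdict(int), defaultdict(int), defaultdict(int)
        for xi, yi in zip(X, y):
            joint[(xi[j], yi)] += 1
            pa[xi[j]] += 1
            pc[yi] += 1
        mi = sum((c / n) * math.log((c * n) / (pa[a] * pc[cl]))
                 for (a, cl), c in joint.items())
        weights.append(max(mi, 1e-6))
    return weights

class DualWeightedNB:
    """Instance-weighted counts combined with attribute weights as exponents on the conditionals."""
    def fit(self, X, y):
        self.iw, self.aw = instance_weights(X, y), attribute_weights(X, y)
        self.classes, self.d = sorted(set(y)), len(X[0])
        self.class_w, self.cond_w = defaultdict(float), defaultdict(float)
        self.values = [set() for _ in range(self.d)]
        for xi, yi, wi in zip(X, y, self.iw):
            self.class_w[yi] += wi
            for j, v in enumerate(xi):
                self.cond_w[(yi, j, v)] += wi
                self.values[j].add(v)
        self.total_w = sum(self.class_w.values())
        return self

    def predict(self, x):
        best, best_score = None, -math.inf
        for c in self.classes:
            # Laplace-smoothed, instance-weighted prior and conditionals.
            score = math.log((self.class_w[c] + 1.0) / (self.total_w + len(self.classes)))
            for j, v in enumerate(x):
                p = (self.cond_w[(c, j, v)] + 1.0) / (self.class_w[c] + len(self.values[j]))
                score += self.aw[j] * math.log(p)   # attribute weight acts as an exponent
            if score > best_score:
                best, best_score = c, score
        return best

# Toy usage on a tiny nominal data set.
X = [["sunny", "hot"], ["sunny", "mild"], ["rain", "mild"], ["rain", "cool"]]
y = ["no", "no", "yes", "yes"]
print(DualWeightedNB().fit(X, y).predict(["rain", "mild"]))   # expected: "yes"
```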
Original language | English |
---|---|
Title of host publication | Proceedings of the 2014 International Joint Conference on Neural Networks |
Editors | Fakhri Karray, Cesare Alippi |
Place of Publication | Piscataway NJ USA |
Publisher | IEEE, Institute of Electrical and Electronics Engineers |
Pages | 1675-1679 |
Number of pages | 5 |
ISBN (Electronic) | 9781479914845, 9781479966271 |
ISBN (Print) | 9781479914821 |
DOIs | |
Publication status | Published - 2014 |
Externally published | Yes |
Event | IEEE International Joint Conference on Neural Networks 2014 - Beijing, China; 6 Jul 2014 → 11 Jul 2014; https://ieeexplore.ieee.org/xpl/conhome/6880678/proceeding (Proceedings) |
Conference
Conference | IEEE International Joint Conference on Neural Networks 2014 |
---|---|
Abbreviated title | IJCNN 2014 |
Country/Territory | China |
City | Beijing |
Period | 6/07/14 → 11/07/14 |
Internet address | https://ieeexplore.ieee.org/xpl/conhome/6880678/proceeding |