Dual instance and attribute weighting for Naive Bayes classification

Jia Wu, Shirui Pan, Zhihua Cai, Xingquan Zhu, Chengqi Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

7 Citations (Scopus)


Naive Bayes (NB) is a popular classification technique for data mining and machine learning. Many methods exist to improve the performance of NB by overcoming its primary weakness, the assumption that attributes are conditionally independent given the class, using techniques such as backwards sequential elimination and lazy elimination. Weighting techniques, including attribute weighting and instance weighting, have also been proposed to improve the accuracy of NB. In this paper, we propose a dual weighted model, namely DWNB, for NB classification. In DWNB, we first employ an instance-similarity-based method to weight each training instance. We then build an attribute weighted model on the reweighted training data, where the probability estimates incorporate the embedded instance weights. The dual instance and attribute weighting allows DWNB to relax the conditional independence assumption for accurate classification. Experiments and comparisons on 36 benchmark data sets demonstrate that DWNB outperforms existing weighted NB algorithms.
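The two-stage idea in the abstract can be illustrated with a minimal sketch. This is not the paper's DWNB: the instance-similarity measure here (agreement with the per-class attribute mode) and the uniform attribute weights are placeholder assumptions chosen only to show how instance weights flow into the weighted probability estimates and how attribute weights act as exponents on the conditionals.

```python
# Illustrative sketch of a dual-weighted Naive Bayes classifier.
# NOT the paper's DWNB: the instance-similarity proxy and attribute
# weights below are simplified assumptions for demonstration only.
import numpy as np

def train_dwnb(X, y, attr_weights=None):
    """Estimate instance-weighted priors and conditional probabilities
    from categorical data X (n x d, non-negative ints) and labels y."""
    n, d = X.shape
    classes = np.unique(y)

    # Stage 1 (assumed similarity measure): weight each instance by the
    # fraction of attributes matching its class's per-attribute mode.
    inst_w = np.ones(n)
    for c in classes:
        idx = np.where(y == c)[0]
        modes = [np.bincount(X[idx, j]).argmax() for j in range(d)]
        inst_w[idx] = (X[idx] == modes).mean(axis=1) + 0.1  # avoid zeros

    # Stage 2: weighted priors and conditionals with Laplace smoothing;
    # counts are sums of instance weights rather than raw counts.
    priors, cond = {}, {}
    total_w = inst_w.sum()
    for c in classes:
        idx = np.where(y == c)[0]
        w = inst_w[idx]
        priors[c] = (w.sum() + 1) / (total_w + len(classes))
        cond[c] = []
        for j in range(d):
            vals = np.unique(X[:, j])
            counts = {v: (w[X[idx, j] == v].sum() + 1) / (w.sum() + len(vals))
                      for v in vals}
            cond[c].append(counts)

    if attr_weights is None:
        attr_weights = np.ones(d)  # uniform; the paper learns these
    return classes, priors, cond, attr_weights

def predict_dwnb(x, model):
    """Classify one instance; attribute weights act as exponents,
    i.e. multipliers on the log conditional probabilities."""
    classes, priors, cond, aw = model
    best, best_lp = None, -np.inf
    for c in classes:
        lp = np.log(priors[c])
        for j, v in enumerate(x):
            lp += aw[j] * np.log(cond[c][j].get(v, 1e-9))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Toy usage: two binary attributes, two classes.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 1, 1])
model = train_dwnb(X, y)
print(predict_dwnb([0, 0], model))  # class 0
print(predict_dwnb([1, 1], model))  # class 1
```

Instances that look atypical for their class (low agreement with the class mode) contribute less to the probability estimates, which is the intuition behind the instance-weighting stage.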

Original language: English
Title of host publication: Proceedings of the 2014 International Joint Conference on Neural Networks
Editors: Fakhri Karray, Cesare Alippi
Place of Publication: Piscataway NJ USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 5
ISBN (Electronic): 9781479914845, 9781479966271
ISBN (Print): 9781479914821
Publication status: Published - 2014
Externally published: Yes
Event: IEEE International Joint Conference on Neural Networks 2014 - Beijing, China
Duration: 6 Jul 2014 - 11 Jul 2014
https://ieeexplore.ieee.org/xpl/conhome/6880678/proceeding (Proceedings)


Conference: IEEE International Joint Conference on Neural Networks 2014
Abbreviated title: IJCNN 2014
