Locally weighted learning

how and when does it work in Bayesian networks?

Jia Wu, Bi Wu, Shirui Pan, Haishuai Wang, Zhihua Cai

Research output: Contribution to journal › Article › Research › peer-review

Abstract

A Bayesian network (BN), a simple graphical notation for conditional independence assertions, is well suited to representing probabilistic relationships such as those between diseases and symptoms. Learning the structure of a Bayesian network classifier (BNC) encodes conditional independence assumptions between attributes, and these assumptions may deteriorate classification performance. One major approach to mitigating the BNC's primary weakness, the attribute independence assumption, is locally weighted learning. This approach has been shown to achieve good performance for naive Bayes (NB), a BNC with a simple structure; however, it is not known whether, or how effectively, it improves BNCs with complex structures. In this paper, we first survey complex-structure BNC models and their improvements, and then carry out a systematic experimental analysis of the effectiveness of locally weighted learning for complex BNCs, namely tree-augmented naive Bayes (TAN), averaged one-dependence estimators (AODE) and hidden naive Bayes (HNB), measured by classification accuracy (ACC) and the area under the ROC curve (AUC), a ranking measure. Experiments and comparisons on 36 benchmark data sets from the University of California, Irvine (UCI) repository, run in the Weka system, show that locally weighted variants only slightly outperform the unweighted complex BNCs on both ACC and AUC. In other words, although local weighting can significantly improve the performance of NB, it does not work well on BNCs with complex structures, because the performance improvements of these BNCs are attributable to their structures rather than to local weighting.
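
For readers unfamiliar with the technique, the sketch below illustrates the locally weighted naive Bayes idea the abstract refers to: for each test instance, a naive Bayes model is estimated from its nearest training instances, with each neighbour weighted by its distance to the test instance. This is a minimal illustration only, not the paper's implementation (the paper runs its experiments in the Weka system); the choice of k, the Hamming distance and the linear weighting kernel are assumptions made for this example, and attributes are assumed to be discrete and integer-coded.

```python
import numpy as np

def locally_weighted_nb_predict(X_train, y_train, x_test, k=50):
    """Predict the class of one test instance with locally weighted naive Bayes.

    A naive Bayes model is estimated from the k training instances nearest to
    x_test, each weighted by its distance (closer neighbours count more).
    Attributes are assumed to be discrete and integer-coded; Laplace smoothing
    keeps every probability non-zero. Illustrative sketch, not the paper's code.
    """
    # Hamming distance: number of attributes on which two instances disagree.
    dists = np.sum(X_train != x_test, axis=1).astype(float)
    order = np.argsort(dists)[:k]
    neigh_X, neigh_y, neigh_d = X_train[order], y_train[order], dists[order]

    # Linear kernel (an assumption): closest neighbour ~1, the k-th ~0.
    weights = 1.0 - neigh_d / (neigh_d.max() + 1e-9)

    classes = np.unique(y_train)
    log_scores = {}
    for c in classes:
        in_c = neigh_y == c
        w_c, X_c = weights[in_c], neigh_X[in_c]
        # Weighted, smoothed class prior P(c).
        log_score = np.log((w_c.sum() + 1.0) / (weights.sum() + len(classes)))
        # Weighted, smoothed conditionals P(x_j | c), combined in log space.
        for j in range(X_train.shape[1]):
            n_values = len(np.unique(X_train[:, j]))
            match = w_c[X_c[:, j] == x_test[j]].sum()
            log_score += np.log((match + 1.0) / (w_c.sum() + n_values))
        log_scores[c] = log_score
    return max(log_scores, key=log_scores.get)
```

With X_train an (n, d) integer array and y_train a length-n label array, locally_weighted_nb_predict(X_train, y_train, X_test[0]) returns the predicted label for the first test instance. The complex BNCs studied in the paper (TAN, AODE, HNB) instead enrich the model structure itself, which is why, as the abstract reports, adding local weighting on top of them yields only slight gains.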

Original language: English
Pages (from-to): 63-74
Number of pages: 12
Journal: International Journal of Computational Intelligence Systems
Volume: 8
ISSN: 1875-6891
Publisher: Taylor & Francis
DOI: 10.1080/18756891.2015.1129579
Publication status: Published - Dec 2015
Externally published: Yes

Keywords

  • Bayesian network
  • Classification
  • Locally weighted learning
  • Ranking
