Bayesian network classifiers using ensembles and smoothing

He Zhang, François Petitjean, Wray Buntine

Research output: Contribution to journal › Article › Research › peer-review

10 Citations (Scopus)


Bayesian network classifiers are, functionally, an interesting class of models, because they can be learnt out-of-core, i.e. without needing to hold the whole training data in main memory. The selective K-dependence Bayesian network classifier (SKDB) is state of the art in this class of models and has been shown to rival random forest (RF) on problems with categorical data. In this paper, we introduce an ensembling technique for SKDB, called ensemble of SKDB (ESKDB). We show that ESKDB significantly outperforms RF on categorical and numerical data, and rivals XGBoost. ESKDB combines three main components: (1) an effective strategy for varying the networks built by the single classifiers (to make it an ensemble), (2) a stochastic discretization method that both handles numerical data and further increases the variance between the components of the ensemble, and (3) a superior smoothing technique that ensures proper calibration of ESKDB's probabilities. We conduct a large set of experiments with 72 datasets to study the properties of ESKDB (through a sensitivity analysis) and to show its competitiveness with the state of the art.
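The ensembling-via-stochastic-discretization idea can be illustrated with a deliberately minimal sketch. This is not the authors' implementation: plain naive Bayes stands in for SKDB, Laplace smoothing stands in for the paper's hierarchical Dirichlet process smoothing, and all names and parameters below are hypothetical. The key point it demonstrates is that each ensemble member draws its own random discretization of the numeric features, which diversifies the members before their class posteriors are averaged.

```python
# Minimal sketch (assumed, not the authors' code): an ensemble of naive
# Bayes members, each with its own random discretization of numeric data.
import random
from collections import defaultdict

def random_bins(values, n_bins, rng):
    """Pick random cut points from the observed feature values."""
    return sorted(rng.sample(sorted(values), min(n_bins - 1, len(values) - 1)))

def discretize(x, cuts):
    """Map a numeric value to a bin index given sorted cut points."""
    for i, c in enumerate(cuts):
        if x <= c:
            return i
    return len(cuts)

class StochasticNB:
    """Naive Bayes over a randomly discretized view of the features."""
    def fit(self, X, y, rng):
        # Each member samples its own cut points: this is the source of
        # ensemble variance that stochastic discretization provides.
        self.cuts = [random_bins([row[j] for row in X], rng.randint(2, 4), rng)
                     for j in range(len(X[0]))]
        self.class_counts = defaultdict(int)
        self.feat_counts = defaultdict(int)
        for row, label in zip(X, y):
            self.class_counts[label] += 1
            for j, v in enumerate(row):
                self.feat_counts[(label, j, discretize(v, self.cuts[j]))] += 1
        return self

    def predict_proba(self, row):
        n = sum(self.class_counts.values())
        probs = {}
        for c, cc in self.class_counts.items():
            p = cc / n
            for j, v in enumerate(row):
                b = discretize(v, self.cuts[j])
                # Laplace smoothing here; the paper uses HDP smoothing.
                p *= (self.feat_counts[(c, j, b)] + 1) / (cc + len(self.cuts[j]) + 1)
            probs[c] = p
        z = sum(probs.values())
        return {c: p / z for c, p in probs.items()}

def ensemble_predict(models, row):
    """Average the members' class posteriors and pick the arg max."""
    agg = defaultdict(float)
    for m in models:
        for c, p in m.predict_proba(row).items():
            agg[c] += p / len(models)
    return max(agg, key=agg.get)

rng = random.Random(0)
X = [[i / 100] for i in range(10)] + [[0.9 + i / 100] for i in range(10)]
y = [0] * 10 + [1] * 10
models = [StochasticNB().fit(X, y, rng) for _ in range(5)]
print(ensemble_predict(models, [0.0]))  # → 0 (query lies in the low cluster)
```

Averaging the members' posteriors, rather than majority-voting their labels, is what makes the smoothing component matter: poorly calibrated member probabilities would degrade the combined estimate.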

Original language: English
Pages (from-to): 3457-3480
Number of pages: 24
Journal: Knowledge and Information Systems
Publication status: Published - 30 Mar 2020


Keywords:

  • Attribute discretization
  • Bayesian network classifier
  • Ensemble learning
  • Hierarchical Dirichlet process
  • Probability smoothing
