Abstract
Bayesian network classifiers are, functionally, an interesting class of models because they can be learnt out-of-core, i.e. without needing to hold the whole training data in main memory. The selective K-dependence Bayesian network classifier (SKDB) is the state of the art in this class of models and has been shown to rival random forest (RF) on problems with categorical data. In this paper, we introduce an ensembling technique for SKDB, called ensemble of SKDB (ESKDB). We show that ESKDB significantly outperforms RF on categorical and numerical data, and that it rivals XGBoost. ESKDB combines three main components: (1) an effective strategy to vary the networks built by the single classifiers (turning them into an ensemble), (2) a stochastic discretization method that both handles numerical data and further increases the variance between the components of the ensemble, and (3) a superior smoothing technique that ensures proper calibration of ESKDB's probabilities. We conduct a large set of experiments on 72 datasets to study the properties of ESKDB (through a sensitivity analysis) and to show its competitiveness with the state of the art.
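The three ingredients named in the abstract can be illustrated with a minimal sketch. This is not the authors' SKDB implementation: it uses a plain naive Bayes base learner as a stand-in for SKDB, randomized quantile bin edges as a stand-in for the paper's stochastic discretization, and Laplace smoothing as a stand-in for hierarchical-Dirichlet-process smoothing. Member probabilities are averaged, as is typical for such ensembles.

```python
import numpy as np

def fit_member(X, y, n_bins, alpha, rng):
    """Fit one ensemble member on a randomized discretization of X."""
    n, d = X.shape
    edges, Xd = [], np.empty((n, d), dtype=int)
    for j in range(d):
        # Random quantile cut points: each member sees a different
        # categorical view of the numeric data (stochastic discretization).
        qs = np.sort(rng.uniform(0.0, 1.0, size=n_bins - 1))
        e = np.quantile(X[:, j], qs)
        edges.append(e)
        Xd[:, j] = np.searchsorted(e, X[:, j])
    classes = np.unique(y)
    priors = np.array([(y == c).mean() for c in classes])
    # Laplace-smoothed conditional tables P(x_j = v | y = c),
    # a simple proxy for the HDP smoothing used in the paper.
    tables = np.zeros((len(classes), d, n_bins))
    for ci, c in enumerate(classes):
        Xc = Xd[y == c]
        for j in range(d):
            counts = np.bincount(Xc[:, j], minlength=n_bins)
            tables[ci, j] = (counts + alpha) / (counts.sum() + alpha * n_bins)
    return edges, classes, priors, tables

def predict_proba_member(X, model):
    """Posterior class probabilities for one member (naive Bayes)."""
    edges, classes, priors, tables = model
    logp = np.tile(np.log(priors), (len(X), 1))
    for j, e in enumerate(edges):
        b = np.searchsorted(e, X[:, j])
        logp += np.log(tables[:, j, b]).T
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    return p / p.sum(axis=1, keepdims=True)

def ensemble_proba(X, models):
    # Average the members' probabilities to form the ensemble estimate.
    return np.mean([predict_proba_member(X, m) for m in models], axis=0)

rng = np.random.default_rng(0)
# Toy two-class data: the label is driven mainly by the first feature.
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)
models = [fit_member(X, y, n_bins=4, alpha=1.0, rng=rng) for _ in range(10)]
proba = ensemble_proba(X, models)
acc = (proba.argmax(axis=1) == y).mean()
```

Each member differs only in its random discretization; the averaging step is what turns the variance injected in step (2) into an accuracy gain.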
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 3457-3480 |
| Number of pages | 24 |
| Journal | Knowledge and Information Systems |
| Volume | 62 |
| DOIs | |
| Publication status | Published - 30 Mar 2020 |
Keywords
- Attribute discretization
- Bayesian network classifier
- Ensemble learning
- Hierarchical Dirichlet process
- Probability smoothing
Projects
Two finished projects are associated with this work:
- Target-agnostic analytics: Building agile predictive models for big data
  Webb, G. (Primary Chief Investigator (PCI)) & Buntine, W. (Chief Investigator (CI))
  1/04/19 → 30/06/22
  Project: Research
- Time series classification for new-generation Earth observation satellites
  Petitjean, F. (Primary Chief Investigator (PCI))
  1/06/17 → 31/12/20
  Project: Research