Sunil Aryal, Kai Ming Ting
Research output: Contribution to journal › Article › Research › peer-review
In Bayesian classifier learning, estimating the joint probability distribution p(x, y) or the likelihood p(x|y) directly from training data is considered difficult, especially in large multidimensional data sets. To circumvent this difficulty, existing Bayesian classifiers such as Naive Bayes, BayesNet, and AnDE estimate simplified surrogates of p(x, y) built from different forms of one-dimensional likelihoods. Contrary to the perceived difficulty of multidimensional likelihood estimation, we present a simple generic ensemble approach that estimates the multidimensional likelihood directly from data. The idea is to aggregate the estimates p_i(x|y) computed from random subsamples of data D_i (i = 1, 2, ..., t). This article presents two ways to estimate multidimensional likelihoods using the proposed generic approach and introduces two new Bayesian classifiers, ENNBayes and MassBayes, which estimate p_i(x|y) using nearest-neighbor density estimation and probability estimation through feature-space partitioning, respectively. Unlike existing Bayesian classifiers, ENNBayes and MassBayes have constant training time and space complexities, and they scale better than existing Bayesian classifiers in very large data sets. Our empirical evaluation shows that ENNBayes and MassBayes yield better predictive accuracy than existing Bayesian classifiers on benchmark data sets.
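The generic ensemble idea in the abstract — estimate p_i(x|y) on each random subsample D_i and aggregate over i = 1, ..., t — can be sketched as follows. This is a minimal illustrative sketch, not the paper's ENNBayes or MassBayes estimator: the 1-nearest-neighbor density proxy, the subsample size `psi`, and all function names are assumptions introduced here for illustration.

```python
# Sketch of the generic ensemble-likelihood idea: average class-conditional
# likelihood estimates p_i(x|y) computed on t random subsamples.
# The 1-NN density proxy below is an illustrative stand-in, not the
# paper's exact estimator.
import numpy as np

rng = np.random.default_rng(0)

def nn_density(x, sample):
    # Crude density proxy: inverse of (distance to nearest neighbor)^d,
    # up to a constant -- enough to rank classes, not a calibrated density.
    d = np.min(np.linalg.norm(sample - x, axis=1))
    return 1.0 / (d ** sample.shape[1] + 1e-12)

def ensemble_likelihood(x, X, y, label, t=10, psi=8):
    # Average p_i(x | y = label) over t random subsamples of fixed size psi.
    idx = np.flatnonzero(y == label)
    estimates = []
    for _ in range(t):
        sub = X[rng.choice(idx, size=min(psi, idx.size), replace=False)]
        estimates.append(nn_density(x, sub))
    return float(np.mean(estimates))

# Toy two-class data: class 0 around (0, 0), class 1 around (3, 3).
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

x = np.array([2.8, 3.1])  # query point near the class-1 cluster
scores = {c: ensemble_likelihood(x, X, y, c) for c in (0, 1)}
pred = max(scores, key=scores.get)
```

Because each per-subsample estimate is built from a fixed-size subsample rather than the full data set, the per-model training cost is constant in the number of training instances — the property the abstract attributes to ENNBayes and MassBayes.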
Original language | English |
---|---|
Pages (from-to) | 458-479 |
Number of pages | 22 |
Journal | Computational Intelligence |
Volume | 32 |
Issue number | 3 |
DOIs | |
Publication status | Published - 1 Aug 2016 |