Efficient parameter learning of Bayesian network classifiers

Nayyar A Zaidi, Geoffrey I. Webb, Mark Carman, François Petitjean, Wray Buntine, Mike Hynes, Hans De Sterck

Research output: Contribution to journal › Article › Research › peer-review

Abstract

Recent advances have demonstrated substantial benefits from learning with both generative and discriminative parameters. On the one hand, generative approaches address the estimation of the parameters of the joint distribution P(y, x), which for most network types is very computationally efficient (a notable exception being Markov networks). On the other hand, discriminative approaches address the estimation of the parameters of the posterior distribution P(y | x) and are more effective for classification, since they fit P(y | x) directly. However, discriminative approaches are less computationally efficient, as the normalization factor in the conditional log-likelihood precludes closed-form parameter estimation. This paper introduces a new discriminative parameter learning method for Bayesian network classifiers that elegantly combines parameters learned using both generative and discriminative methods. The proposed method is discriminative in nature, but uses estimates of generative probabilities to speed up the optimization process. A second contribution is a simple framework to characterize the parameter learning task for Bayesian network classifiers. We conduct an extensive set of experiments on 72 standard datasets and demonstrate that our proposed discriminative parameterization provides an efficient alternative to other state-of-the-art parameterizations.
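The core idea in the abstract, using cheap closed-form generative estimates to initialize and accelerate a discriminative (conditional log-likelihood) optimization, can be illustrated with a toy naive Bayes classifier. This is a hedged sketch, not the authors' implementation: the paper learns weights over the network's parameters, while for brevity this example learns a single scalar weight `w` that scales the generative log-probabilities, starting from `w = 1` (the pure generative solution).

```python
import math

# Toy two-class, binary-feature dataset: (features, label) pairs.
data = [((1, 0), 0), ((1, 1), 0), ((1, 0), 0),
        ((0, 1), 1), ((0, 0), 1), ((0, 1), 1)]
n_feat, n_cls = 2, 2

# Generative step: closed-form, Laplace-smoothed naive Bayes estimates.
prior = [1.0] * n_cls
cond = [[[1.0, 1.0] for _ in range(n_feat)] for _ in range(n_cls)]  # cond[y][j][v]
for x, y in data:
    prior[y] += 1
    for j, v in enumerate(x):
        cond[y][j][v] += 1
log_prior = [math.log(p / sum(prior)) for p in prior]
log_cond = [[[math.log(c / sum(cv)) for c in cv] for cv in cy] for cy in cond]

def scores(x, w):
    # Discriminative parameterization: w scales the generative log-probabilities.
    return [w * (log_prior[y] + sum(log_cond[y][j][v] for j, v in enumerate(x)))
            for y in range(n_cls)]

def posterior(x, w):
    # Softmax over class scores: P(y | x) up to the CLL normalization factor.
    s = scores(x, w)
    m = max(s)
    z = [math.exp(v - m) for v in s]
    return [v / sum(z) for v in z]

# Discriminative step: gradient ascent on the conditional log-likelihood (CLL)
# in w, initialized at w = 1, which is exactly the generative solution, so the
# optimizer starts in a good region instead of from scratch.
w, lr = 1.0, 0.1
for _ in range(200):
    grad = 0.0
    for x, y in data:
        p = posterior(x, w)
        raw = scores(x, 1.0)  # precomputed generative log-scores, reused each step
        grad += raw[y] - sum(p[c] * raw[c] for c in range(n_cls))
    w += lr * grad / len(data)

cll = sum(math.log(posterior(x, w)[y]) for x, y in data)
```

On this small separable dataset the discriminative step sharpens the posteriors (the conditional log-likelihood improves over the purely generative `w = 1`) without changing the classification decisions; the paper's method generalizes this by learning many parameters rather than one scalar.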

Original language: English
Pages (from-to): 1289-1329
Number of pages: 41
Journal: Machine Learning
Volume: 106
Issue number: 9-10
DOI: 10.1007/s10994-016-5619-z
Publication status: Published - 2017

Cite this

Zaidi, Nayyar A; Webb, Geoffrey I.; Carman, Mark; Petitjean, François; Buntine, Wray; Hynes, Mike; De Sterck, Hans. Efficient parameter learning of Bayesian network classifiers. In: Machine Learning. 2017; Vol. 106, No. 9-10. pp. 1289-1329.
@article{8fd9853fb8d54b15a3ceaaff3d4d15fa,
title = "Efficient parameter learning of Bayesian network classifiers",
abstract = "Recent advances have demonstrated substantial benefits from learning with both generative and discriminative parameters. On the one hand, generative approaches address the estimation of the parameters of the joint distribution P(y, x), which for most network types is very computationally efficient (a notable exception being Markov networks). On the other hand, discriminative approaches address the estimation of the parameters of the posterior distribution P(y | x) and are more effective for classification, since they fit P(y | x) directly. However, discriminative approaches are less computationally efficient, as the normalization factor in the conditional log-likelihood precludes closed-form parameter estimation. This paper introduces a new discriminative parameter learning method for Bayesian network classifiers that elegantly combines parameters learned using both generative and discriminative methods. The proposed method is discriminative in nature, but uses estimates of generative probabilities to speed up the optimization process. A second contribution is a simple framework to characterize the parameter learning task for Bayesian network classifiers. We conduct an extensive set of experiments on 72 standard datasets and demonstrate that our proposed discriminative parameterization provides an efficient alternative to other state-of-the-art parameterizations.",
author = "Zaidi, {Nayyar A} and Webb, {Geoffrey I.} and Mark Carman and Fran{\cc}ois Petitjean and Wray Buntine and Mike Hynes and {De Sterck}, Hans",
year = "2017",
doi = "10.1007/s10994-016-5619-z",
language = "English",
volume = "106",
pages = "1289--1329",
journal = "Machine Learning",
issn = "0885-6125",
publisher = "Springer",
number = "9-10",
}

Efficient parameter learning of Bayesian network classifiers. / Zaidi, Nayyar A; Webb, Geoffrey I.; Carman, Mark; Petitjean, François; Buntine, Wray; Hynes, Mike; De Sterck, Hans.

In: Machine Learning, Vol. 106, No. 9-10, 2017, p. 1289-1329.


TY - JOUR

T1 - Efficient parameter learning of Bayesian network classifiers

AU - Zaidi, Nayyar A

AU - Webb, Geoffrey I.

AU - Carman, Mark

AU - Petitjean, François

AU - Buntine, Wray

AU - Hynes, Mike

AU - De Sterck, Hans

PY - 2017

Y1 - 2017

N2 - Recent advances have demonstrated substantial benefits from learning with both generative and discriminative parameters. On the one hand, generative approaches address the estimation of the parameters of the joint distribution P(y, x), which for most network types is very computationally efficient (a notable exception being Markov networks). On the other hand, discriminative approaches address the estimation of the parameters of the posterior distribution P(y | x) and are more effective for classification, since they fit P(y | x) directly. However, discriminative approaches are less computationally efficient, as the normalization factor in the conditional log-likelihood precludes closed-form parameter estimation. This paper introduces a new discriminative parameter learning method for Bayesian network classifiers that elegantly combines parameters learned using both generative and discriminative methods. The proposed method is discriminative in nature, but uses estimates of generative probabilities to speed up the optimization process. A second contribution is a simple framework to characterize the parameter learning task for Bayesian network classifiers. We conduct an extensive set of experiments on 72 standard datasets and demonstrate that our proposed discriminative parameterization provides an efficient alternative to other state-of-the-art parameterizations.

AB - Recent advances have demonstrated substantial benefits from learning with both generative and discriminative parameters. On the one hand, generative approaches address the estimation of the parameters of the joint distribution P(y, x), which for most network types is very computationally efficient (a notable exception being Markov networks). On the other hand, discriminative approaches address the estimation of the parameters of the posterior distribution P(y | x) and are more effective for classification, since they fit P(y | x) directly. However, discriminative approaches are less computationally efficient, as the normalization factor in the conditional log-likelihood precludes closed-form parameter estimation. This paper introduces a new discriminative parameter learning method for Bayesian network classifiers that elegantly combines parameters learned using both generative and discriminative methods. The proposed method is discriminative in nature, but uses estimates of generative probabilities to speed up the optimization process. A second contribution is a simple framework to characterize the parameter learning task for Bayesian network classifiers. We conduct an extensive set of experiments on 72 standard datasets and demonstrate that our proposed discriminative parameterization provides an efficient alternative to other state-of-the-art parameterizations.

UR - http://www.scopus.com/inward/record.url?scp=85010789888&partnerID=8YFLogxK

U2 - 10.1007/s10994-016-5619-z

DO - 10.1007/s10994-016-5619-z

M3 - Article

VL - 106

SP - 1289

EP - 1329

JO - Machine Learning

JF - Machine Learning

SN - 0885-6125

IS - 9-10

ER -