Comparing score-based methods for estimating Bayesian networks using the Kullback-Leibler divergence

Jessica Kasza, Patty Solomon

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


We recently proposed two methods for estimating Bayesian networks from high-dimensional non-independent and identically distributed data containing exogenous variables and random effects (Kasza, J.E., Glonek, G., Solomon, P. (2012). Estimating Bayesian networks for high-dimensional data with complex mean structure. Aust. New Zealand J. Stat. 54(2): 169–187). The first method is fully Bayesian, and the second is "residual"-based, accounting for the effects of the exogenous variables by utilizing the notion of restricted maximum likelihood. We describe the methods and compare their performance using the Kullback–Leibler divergence, which provides a natural framework for comparing posterior distributions. In applications where the exogenous variables are not of primary interest, we show that the potential loss of information about parameters of interest is typically small.
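As an illustration of the comparison framework described in the abstract: when two posterior distributions can be approximated as multivariate Gaussians, their Kullback–Leibler divergence has a closed form. The sketch below is not the authors' method, just a minimal illustration of that standard formula; the function name and test inputs are hypothetical.

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """KL(N(mu0, S0) || N(mu1, S1)) between two multivariate normals
    (standard closed-form expression; illustrative only)."""
    k = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)      # trace term
                  + diff @ S1_inv @ diff     # mean-shift term
                  - k                        # dimension
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Identical distributions have zero divergence.
print(kl_mvn(np.zeros(2), np.eye(2), np.zeros(2), np.eye(2)))  # 0.0
```

Note that the KL divergence is asymmetric, so the order of the two posteriors matters when making such comparisons.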
Original language: English
Pages (from-to): 135–152
Number of pages: 18
Journal: Communications in Statistics: Theory and Methods
Issue number: 1
Publication status: Published - 2015
Externally published: Yes


  • Bayesian network
  • Exogenous variables
  • High-dimensional data
  • Kullback–Leibler divergence
  • Gene regulatory networks
  • Variance components
