Computing divergences between discrete decomposable models

Loong Kuan Lee, Nico Piatkowski, François Petitjean, Geoffrey I. Webb

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

1 Citation (Scopus)


Many applications, including in machine learning, benefit from computing the exact divergence between two discrete probability measures. Unfortunately, in the absence of any assumptions on the structure of or independencies within these distributions, computing the divergence between them is intractable in high dimensions. We show that a wide family of functionals and divergences, such as the alpha-beta divergence, can be computed between two decomposable models, i.e. chordal Markov networks, in time exponential in the treewidth of these models. The alpha-beta divergence is a family of divergences that includes popular divergences such as the Kullback-Leibler divergence, the Hellinger distance, and the chi-squared divergence. Thus, we can compute the exact value of any divergence in this broad class to the extent to which we can accurately model the two distributions using decomposable models.
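To illustrate why assumption-free exact computation is intractable, the following minimal sketch (not the paper's algorithm; function and variable names are hypothetical) computes the Kullback-Leibler divergence between two arbitrary discrete distributions over n binary variables by summing over the full joint table, which has 2**n entries and therefore grows exponentially with dimension:

```python
import numpy as np

def kl_divergence(p, q):
    """KL(P || Q) for dense probability tables p, q of the same shape.

    Naive exact computation: sums over every joint configuration, so
    for n binary variables the tables hold 2**n entries each. This is
    the exponential blow-up that structural assumptions (e.g. the
    decomposable models in the paper) are needed to avoid.
    """
    mask = p > 0  # terms with p(x) = 0 contribute 0 to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Two random distributions over n = 3 binary variables (8 joint states).
n = 3
rng = np.random.default_rng(0)
p = rng.random(2**n)
p /= p.sum()
q = rng.random(2**n)
q /= q.sum()

print(kl_divergence(p, q))  # non-negative by Gibbs' inequality
```

At n = 3 the table is tiny, but at n = 100 it would have 2**100 entries, which is why the paper's treewidth-exponential (rather than dimension-exponential) complexity matters.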

Original language: English
Title of host publication: Proceedings of the 37th AAAI Conference on Artificial Intelligence
Editors: Brian Williams, Yiling Chen, Jennifer Neville
Place of Publication: Washington DC USA
Publisher: Association for the Advancement of Artificial Intelligence (AAAI)
Number of pages: 9
ISBN (Electronic): 9781577358800
Publication status: Published - 2023
Event: AAAI Conference on Artificial Intelligence 2023 - Washington, United States of America
Duration: 7 Feb 2023 – 14 Feb 2023
Conference number: 37th

Publication series

Name: Proceedings of the AAAI Conference on Artificial Intelligence
Publisher: AAAI Press
ISSN (Print): 2159-5399
ISSN (Electronic): 2374-3468


Conference: AAAI Conference on Artificial Intelligence 2023
Abbreviated title: AAAI 2023
Country/Territory: United States of America
