Large incomplete sample robustness in Bayesian networks

Jim Q. Smith, Alireza Daneshkhah

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding › peer-review

Abstract

Under local DeRobertis (LDR) separation measures, the posterior distance between two densities is the same as the distance between the corresponding prior densities. Like the Kullback–Leibler separation, LDR separations are also additive under factorization. These two properties allow us to prove that the precise specification of the prior is not critical, with respect to the variation distance on the posteriors, under the following conditions: the genuine and approximating priors are similarly rough; the approximating prior concentrates on a small ball on the margin of interest, not on the boundary of the probability space; and the approximating prior has tails similar to, or fatter than, those of the genuine prior. Robustness then follows for all likelihoods, even misspecified ones. Furthermore, the variation distances can be bounded explicitly by an easy-to-calculate function of the prior LDR separation measures and simple summary statistics of the functioning posterior. In this paper we apply these results to study the robustness of prior specification when learning Bayesian networks.
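
The invariance claimed in the abstract's first sentence holds because any likelihood cancels inside the LDR ratio. Below is a minimal numerical sketch of that cancellation, assuming the standard local DeRobertis form d_A(f, g) = sup_{x,y∈A} log[f(x)g(y) / (f(y)g(x))]; the beta priors, binomial likelihood, and ball A used here are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.stats import beta, binom

# Parameter grid and a small "ball" A on the margin of interest.
theta = np.linspace(0.01, 0.99, 981)
ball = (theta > 0.4) & (theta < 0.6)

# Genuine prior f and a slightly perturbed approximating prior g
# (illustrative beta densities, not from the paper).
f = beta.pdf(theta, 2.0, 2.0)
g = beta.pdf(theta, 2.2, 1.9)

# Any likelihood works; here binomial, 7 successes in 10 trials.
lik = binom.pmf(7, 10, theta)

def ldr_separation(p, q, mask):
    """d_A(p, q) = sup_{x,y in A} log[p(x)q(y) / (p(y)q(x))] on a grid.

    Equivalently max minus min of log(p/q) over A, so multiplicative
    constants (normalisers, and any shared likelihood) cancel.
    """
    r = np.log(p[mask]) - np.log(q[mask])
    return r.max() - r.min()

# Posteriors: prior times likelihood, renormalised on the grid.
f_post = f * lik / np.trapz(f * lik, theta)
g_post = g * lik / np.trapz(g * lik, theta)

# Prior and posterior LDR separations agree (up to float error),
# because the likelihood cancels in the ratio.
print(ldr_separation(f, g, ball))
print(ldr_separation(f_post, g_post, ball))
```

Because the likelihood cancels exactly, the two printed values coincide up to floating-point error for any choice of data or sampling model, which is why the paper's robustness bounds hold even under likelihood misspecification.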

Original language: English
Title of host publication: Proceedings of the 4th European Workshop on Probabilistic Graphical Models, PGM 2008
Pages: 265-272
Number of pages: 8
Publication status: Published - 1 Dec 2008
Externally published: Yes
Event: 4th European Workshop on Probabilistic Graphical Models, PGM 2008 - Hirtshals, Denmark
Duration: 17 Sept 2008 - 19 Sept 2008

Conference

Conference: 4th European Workshop on Probabilistic Graphical Models, PGM 2008
Country/Territory: Denmark
City: Hirtshals
Period: 17/09/08 - 19/09/08

ASJC Scopus subject areas

  • Statistics and Probability
