Abstract
Under local DeRobertis (LDR) separation measures, the distance between two posterior densities is the same as the distance between the corresponding prior densities. Like the Kullback-Leibler separation, these measures are also additive under factorization. These two properties allow us to prove that the precise specification of the prior is not critical with respect to the variation distance between the posteriors, provided the following conditions hold: the genuine and approximating priors are similarly rough; the approximating prior concentrates on a small ball on the margin of interest, away from the boundary of the probability space; and the approximating prior has tails similar to, or fatter than, those of the genuine prior. Robustness then follows for all likelihoods, even misspecified ones. Furthermore, the variation distances can be bounded explicitly by an easily calculated function of the prior LDR separation measures and simple summary statistics of the functioning posterior. In this paper we apply these results to study the robustness of prior specification when learning Bayesian networks.
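The abstract's argument rests on the two stated properties of LDR separations. The display below is a minimal sketch, assuming the standard form of a local De Robertis-type separation over a subset $A$ of the parameter space (the definition is not spelled out in this abstract); here $f$ and $g$ denote the genuine and approximating priors and $\ell$ the (possibly misspecified) likelihood.

```latex
% Assumed form of the LDR separation over a subset A of the
% parameter space (not given explicitly in the abstract):
\[
  d_A^L(f,g) \;=\; \sup_{\phi,\psi \in A}
    \log \frac{f(\phi)\, g(\psi)}{f(\psi)\, g(\phi)}.
\]
% Prior-posterior invariance: Bayes' rule multiplies both densities
% by the same likelihood \ell and renormalises; the likelihood values
% and the normalising constants both cancel in the ratio, so
\[
  d_A^L\bigl(f(\cdot \mid x),\, g(\cdot \mid x)\bigr)
  \;=\; \sup_{\phi,\psi \in A}
    \log \frac{f(\phi)\ell(\phi)\, g(\psi)\ell(\psi)}
              {f(\psi)\ell(\psi)\, g(\phi)\ell(\phi)}
  \;=\; d_A^L(f,g).
\]
% Additivity under factorization: if f = \prod_i f_i and
% g = \prod_i g_i, and A is the product of component sets A_i,
% the suprema decouple coordinate by coordinate:
\[
  d_A^L(f,g) \;=\; \sum_i d_{A_i}^L(f_i, g_i).
\]
```

Together these give the mechanism the abstract invokes: the prior LDR separations pass through to the posteriors unchanged for any likelihood, and in a factored model (such as a Bayesian network) they decompose into per-component terms, which is what allows explicit variation-distance bounds.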
Original language | English |
---|---
Title of host publication | Proceedings of the 4th European Workshop on Probabilistic Graphical Models, PGM 2008 |
Pages | 265-272 |
Number of pages | 8 |
Publication status | Published - 1 Dec 2008 |
Externally published | Yes |
Event | 4th European Workshop on Probabilistic Graphical Models, PGM 2008 - Hirtshals, Denmark (17 Sept 2008 → 19 Sept 2008)
Conference
Conference | 4th European Workshop on Probabilistic Graphical Models, PGM 2008 |
---|---
Country/Territory | Denmark |
City | Hirtshals |
Period | 17/09/08 → 19/09/08 |
ASJC Scopus subject areas
- Statistics and Probability