
Abstract Details

Activity Number: 258
Type: Contributed
Date/Time: Monday, August 1, 2016, 2:00 PM to 3:50 PM
Sponsor: Section on Bayesian Statistical Science
Abstract #321153
Title: Bayesian Statistics and Information Theory
Author(s): Jose Guardiola*
Companies: Texas A&M University - Corpus Christi
Keywords: Kullback-Leibler; entropy; Fisher's information; information theory; Bayesian learning; cross entropy
Abstract:

This paper discusses an example of Bayesian statistical inference and its relationship with entropy, the Kullback-Leibler divergence, and Fisher's information. Using an example from science, it shows how the information gain in Bayesian updating can be measured using the Kullback-Leibler divergence or cross entropy, the change in entropy, and Fisher's information. The example examines the relationships among these measurements and provides a geometric interpretation in terms of Riemannian distances and pseudo-distances. A numerical example is developed, and detailed results are discussed from both information-theoretic and statistical points of view by comparing the related quantities. Bayesian inference results and theory are interpreted using information concepts, entropy, and statistical measurements; finally, some conclusions are drawn regarding the information gain and its relationship with other statistical procedures.
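
The abstract names three ways to quantify what is learned in a Bayesian update. As a minimal sketch (not the paper's own code, and the paper's actual example is not specified here), the quantities can be computed in closed form for an assumed conjugate Beta-Binomial model: prior Beta(a, b), data of k successes in n trials, posterior Beta(a + k, b + n - k). The hypothetical numbers below are illustrative only.

    # Sketch: information gain in a Beta-Binomial Bayesian update.
    # Assumed setting, not the author's method.
    from scipy.special import betaln, digamma

    def beta_entropy(a, b):
        """Differential entropy of a Beta(a, b) distribution (nats)."""
        return (betaln(a, b)
                - (a - 1) * digamma(a)
                - (b - 1) * digamma(b)
                + (a + b - 2) * digamma(a + b))

    def beta_kl(a1, b1, a2, b2):
        """KL divergence KL(Beta(a1, b1) || Beta(a2, b2)) in nats."""
        return (betaln(a2, b2) - betaln(a1, b1)
                + (a1 - a2) * digamma(a1)
                + (b1 - b2) * digamma(b1)
                + (a2 - a1 + b2 - b1) * digamma(a1 + b1))

    # Hypothetical data: uniform prior, 7 successes in 10 trials.
    a0, b0, k, n = 1.0, 1.0, 7, 10
    a1, b1 = a0 + k, b0 + (n - k)        # conjugate posterior update

    info_gain = beta_kl(a1, b1, a0, b0)  # KL(posterior || prior)
    d_entropy = beta_entropy(a0, b0) - beta_entropy(a1, b1)  # entropy reduction

    theta_hat = a1 / (a1 + b1)           # posterior mean of the proportion
    fisher = n / (theta_hat * (1 - theta_hat))  # binomial Fisher information
                                                # evaluated at theta_hat

    print(f"Information gain (KL): {info_gain:.4f} nats")
    print(f"Change in entropy:     {d_entropy:.4f} nats")
    print(f"Fisher information:    {fisher:.4f}")

The KL term is taken in the KL(posterior || prior) direction, the usual convention for information gain; note that the KL divergence and the entropy change generally differ, which is one of the comparisons the abstract describes.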


Authors who are presenting talks have a * after their name.

