656 – Introducing Bayesian Statistics at Courses of Various Levels
Maximum Entropy and Bayesian Learning
Jose H. Guardiola
Texas A&M University Corpus Christi, Department of Mathematics and Statistics
Hassan Elsalloukh
University of Arkansas at Little Rock, Department of Mathematics and Statistics
This paper discusses the relationships among the maximum entropy approach, Bayesian statistical inference, the Kullback-Leibler divergence, and Fisher information. Using an example from science, it shows how the information gain in Bayesian updating can be measured by the Kullback-Leibler divergence (cross entropy), the change in entropy, and Fisher information, and it examines how these measurements relate to one another. A numerical example is developed, and detailed results are discussed from both information-theoretic and statistical points of view by comparing the related quantities. Bayesian inference results and theory are interpreted in terms of information concepts, entropy, and statistical measurements, and conclusions are drawn regarding the information gain and its relationships with other statistical procedures.
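As a minimal illustration of the quantities named above (this is a sketch, not the paper's own example), suppose a Beta(1, 1) uniform prior on a coin's success probability is updated with 7 successes in 10 Bernoulli trials, giving a Beta(8, 4) posterior under the usual conjugate update. The information gain KL(posterior || prior) and the change in differential entropy can then be approximated on a grid using only the standard library:

```python
import math

# Hypothetical Beta-Binomial example: Beta(1, 1) prior, data = 7 successes
# and 3 failures, so the conjugate posterior is Beta(8, 4).
a0, b0 = 1.0, 1.0
a1, b1 = a0 + 7, b0 + 3

def log_beta(a, b):
    # log of the Beta function B(a, b) via log-gamma
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def beta_pdf(x, a, b):
    return math.exp((a - 1) * math.log(x)
                    + (b - 1) * math.log(1 - x)
                    - log_beta(a, b))

# Midpoint-rule grid on (0, 1) for the two integrals:
#   KL(post || prior) = integral of p(x) * log(p(x) / q(x)) dx
#   h(post)           = -integral of p(x) * log(p(x)) dx
n = 100_000
kl, h_post = 0.0, 0.0
for i in range(n):
    x = (i + 0.5) / n
    p = beta_pdf(x, a1, b1)   # posterior density
    q = beta_pdf(x, a0, b0)   # prior density (uniform, q = 1)
    kl += p * math.log(p / q) / n
    h_post -= p * math.log(p) / n

h_prior = 0.0  # differential entropy of the uniform prior on (0, 1)
print(f"KL(posterior || prior) = {kl:.4f} nats")
print(f"entropy change h(post) - h(prior) = {h_post - h_prior:.4f} nats")
```

Because the prior here is uniform, the entropy drop equals the KL divergence exactly (both about 0.637 nats); for a non-uniform prior the two measures generally differ, which is one of the comparisons the paper develops.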