
Abstract Details

Activity Number: 509 - Statistical Methodology
Type: Contributed
Date/Time: Wednesday, July 31, 2019, 10:30 AM to 12:20 PM
Sponsor: IMS
Abstract #305103
Title: Analysis of Variance Models Through Information Theory
Author(s): Chathurangi Heshani Pathiravasan* and Bhaskar Bhattacharya
Companies: Southern Illinois University
Keywords: Kullback-Leibler divergence; Analysis of Variance (ANOVA); Asymptotic normal distribution; Information Theory
Abstract:

Information theory is the mathematical study of coding and of the quantification, storage, and communication of information. It is widely used in many scientific fields, including statistics, biology, artificial intelligence, and statistical physics. In particular, it is a vital part of probability theory, with deep connections to statistical inference through relative entropy, or Kullback-Leibler (KL) divergence, which measures the difference between two probability distributions. In this study, we show that a minimization problem involving KL divergence plays a key role in one-way Analysis of Variance (ANOVA) when comparing the means of different populations. As an immediate generalization, a new semi-parametric approach is introduced that relaxes the assumptions of ANOVA. The proposed method compares not only the means but also the variances of distributions of any type. Simulation studies show that our method performs more favorably than classical ANOVA. The method is demonstrated on meteorological radar data and credit limit data. The asymptotic normal distribution of the proposed estimators is established in order to test the hypothesis of equality of distributions.
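To make the KL-ANOVA connection concrete, here is a minimal Python sketch that measures how far each group's fitted Gaussian sits from a Gaussian fitted to the pooled sample, using the closed-form KL divergence between normals. This is an illustrative example only, not the authors' semi-parametric estimator: the Gaussian parametric assumption, the helper name kl_normal, and the sample-size weighting are assumptions introduced here. Under a common-variance Gaussian model, a statistic of this form mirrors the likelihood-ratio comparison that underlies classical one-way ANOVA; the abstract's method generalizes beyond this parametric setting.

import numpy as np

def kl_normal(mu1, var1, mu2, var2):
    # Closed-form KL( N(mu1, var1) || N(mu2, var2) ):
    # 0.5 * [ log(var2/var1) + (var1 + (mu1 - mu2)^2) / var2 - 1 ]
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

rng = np.random.default_rng(0)
# Three hypothetical groups; the third differs in variance, not mean.
groups = [rng.normal(loc=m, scale=s, size=100)
          for m, s in [(0.0, 1.0), (0.5, 1.0), (0.0, 2.0)]]

# Fit a Gaussian to the pooled sample (the "equal distributions" null)
# and to each group, then sum sample-size-weighted KL divergences.
pooled = np.concatenate(groups)
mu0, var0 = pooled.mean(), pooled.var()
stat = sum(len(g) * kl_normal(g.mean(), g.var(), mu0, var0) for g in groups)
print(stat)  # larger values indicate groups differing in mean and/or variance

Because KL divergence responds to both location and scale, this statistic is sensitive to variance differences that a classical mean-only F-test would miss, which is the kind of extension the abstract describes.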


Authors who are presenting talks have a * after their name.
