Information theory is the mathematical study of coding and of the quantification, storage, and communication of information. It is widely used in many scientific fields, such as statistics, biology, artificial intelligence, and statistical physics. In particular, it is a vital part of probability theory, with deep connections to statistical inference through relative entropy, or Kullback-Leibler (KL) divergence, which measures the difference between two probability distributions. In this study, we show that a minimization problem involving the KL divergence plays a key role in one-way Analysis of Variance (ANOVA) when comparing the means of different populations. As an immediate generalization, a new semiparametric approach is introduced that relaxes the assumptions of ANOVA. The proposed method compares not only the means but also the variances of distributions of any type. Simulation studies show that our method performs favorably compared with classical ANOVA. The method is demonstrated on meteorological radar data and credit limit data. The asymptotic normality of the proposed estimators is established in order to test the hypothesis of equality of distributions.
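As a minimal illustration of the quantity at the heart of this work (not the authors' method itself), the following sketch computes the KL divergence between two discrete probability distributions; the function name and the example distributions are assumptions for illustration only.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete
    probability distributions given as arrays of probabilities.
    Assumes p and q are strictly positive and each sums to 1."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # D(p || q) = sum_i p_i * log(p_i / q_i); zero iff p == q
    return float(np.sum(p * np.log(p / q)))

# Identical distributions give divergence 0; differing ones give a positive value.
print(kl_divergence([0.4, 0.6], [0.4, 0.6]))  # 0.0
print(kl_divergence([0.4, 0.6], [0.5, 0.5]))  # positive
```

The non-negativity of the divergence, with equality only when the two distributions coincide, is what makes it usable as an objective in a minimization problem such as the one the abstract describes.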