Abstract:
|
It has long been known that statistical analysis using standard linear models theory is often vitiated because the underlying assumptions of normality and homoscedasticity are not met. In the case of minor departures from these assumptions, the well-known Box-Cox transformations may remedy the problem. However, if the data are very skewed as well as heteroscedastic, then more drastic alternatives, such as the use of inverse Gaussian (IG) models, a family of positively skewed distributions, may be more appropriate and convenient. The inference methods associated with the inverse Gaussian family are strikingly analogous to the normal theory methods. For example, in both cases the optimal tests for the homogeneity of several means use the F distribution. Many such similarities have recently been discovered and are tabulated in Mudholkar and Natarajan (2002). However, inverse Gaussian models may also require data massaging in order to improve their model conformity. In this paper, we develop and demonstrate the remarkable analogy of the Box-Cox methodology between the normal and inverse Gaussian cases. The one-way classification model is used as an illustration.
|