Abstract:
|
Modern graphical tools have greatly enhanced our ability to learn directly from data, but with a high-dimensional data set the question becomes what to plot. Feature extraction therefore becomes important, and one interesting tool for it is dimension reduction. Here we develop a dimension reduction method, based on information theory, for extracting important information. Because it can be viewed as inverse regression, the method can in principle be applied in any area involving information extraction. In particular, we compare it with other dimension reduction methods in regression, such as SIR (Li 1991), PHD (Li 1992), SAVE (Cook and Weisberg 1991), and covk (Yin and Cook 2000). From another viewpoint, however, the method need not be regarded as inverse regression at all. It involves density estimation, which can be carried out nonparametrically and is feasible with the help of fast computing techniques. We provide theoretical justification connecting the method with the central subspace, and we present useful examples.
|