Abstract Details

Activity Number: 48
Type: Invited
Date/Time: Sunday, August 4, 2013, 4:00 PM to 5:50 PM
Sponsor: IMS
Abstract - #310465
Title: Sparse Mixture of Experts Learning: Algorithms and Properties
Author(s): Yu Zhu*+ and Han Wu
Companies: Purdue University and Purdue University
Keywords: mixture of experts model; regression; classification; sparsity; L1 penalty
Abstract:

The mixture of experts (ME) and hierarchical mixture of experts (HME) models provide a flexible framework for solving general regression and classification problems. Over the past 20 years, a variety of algorithms have been developed for training these models, their statistical properties have been partially understood, and the models have been used in a wide range of applications. A limitation of the existing models and algorithms is that they do not perform well when the number of variables is extremely large. Under a sparsity assumption, we propose incorporating an L1 penalty into the ME and HME models to cope with high dimensionality. New learning algorithms are developed, and the theoretical properties of the proposed methods are investigated. The performance of the proposed methods will be demonstrated on a real-life application.
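
The abstract leaves the algorithmic details to the talk; as a concrete illustration, below is a minimal sketch in Python of one natural instantiation: an EM loop for a K-expert mixture of lasso-penalized linear regressions with an L1-penalized logistic gate. The function fit_sparse_moe and the penalty knobs lam_expert and lam_gate are hypothetical names introduced for this sketch, and scikit-learn's Lasso and LogisticRegression stand in for whatever solvers the authors actually use.

import numpy as np
from sklearn.linear_model import Lasso, LogisticRegression

def fit_sparse_moe(X, y, K=2, lam_expert=0.05, lam_gate=0.5,
                   n_iter=50, seed=0):
    """Sketch: EM for a K-expert mixture of L1-penalized linear
    regressions with an L1-penalized logistic gate (not the
    authors' implementation)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    R = rng.dirichlet(np.ones(K), size=n)      # soft responsibilities
    experts = [Lasso(alpha=lam_expert) for _ in range(K)]
    sigma2 = np.ones(K)
    gate = LogisticRegression(penalty="l1", C=1.0 / lam_gate,
                              solver="liblinear")
    for _ in range(n_iter):
        # M-step, experts: weighted lasso shrinks irrelevant
        # coefficients to exactly zero, the source of sparsity.
        for k in range(K):
            w = R[:, k] + 1e-8
            experts[k].fit(X, y, sample_weight=w)
            resid = y - experts[k].predict(X)
            sigma2[k] = np.average(resid ** 2, weights=w) + 1e-8
        # M-step, gate: replicate the rows K times and use the
        # responsibilities as sample weights, a standard trick for
        # fitting a logistic model against soft EM labels.
        Xg = np.tile(X, (K, 1))
        zg = np.repeat(np.arange(K), n)
        wg = R.T.reshape(-1) + 1e-8
        gate.fit(Xg, zg, sample_weight=wg)
        # E-step: posterior responsibilities from gate priors times
        # Gaussian expert likelihoods.
        prior = gate.predict_proba(X)          # shape (n, K)
        lik = np.column_stack([
            np.exp(-(y - experts[k].predict(X)) ** 2 / (2 * sigma2[k]))
            / np.sqrt(2 * np.pi * sigma2[k]) for k in range(K)
        ])
        R = prior * lik + 1e-300               # guard against 0/0
        R /= R.sum(axis=1, keepdims=True)
    return gate, experts

# Toy usage: 100 features, gate driven by feature 0, each expert
# depending on a single (different) feature.
rng = np.random.default_rng(1)
X = rng.standard_normal((400, 100))
z = (X[:, 0] > 0).astype(int)
beta = np.zeros((2, 100))
beta[0, 1] = 3.0
beta[1, 2] = -3.0
y = np.einsum("ij,ij->i", X, beta[z]) + 0.1 * rng.standard_normal(400)
gate, experts = fit_sparse_moe(X, y, K=2)

The row-replication step is only one convenient way to update the gate with off-the-shelf solvers; the algorithms and theory presented in the talk may take a different route, e.g. penalized M-steps with dedicated optimization routines.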


Authors who are presenting talks have a * after their name.

