

Abstract Details

Activity Number: 496 - Dimension Reduction
Type: Contributed
Date/Time: Thursday, August 6, 2020, 10:00 AM to 2:00 PM EDT
Sponsor: Section on Statistical Learning and Data Science
Abstract #313217
Title: Learning Hierarchical Structures in Latent Attribute Models
Author(s): Chenchen Ma* and Gongjun Xu
Companies: University of Michigan (both authors)
Keywords: latent variable models; mixture models; hierarchical structures; regularization; sparse estimation
Abstract:

Hierarchical Latent Attribute Models (HLAMs) are a special family of restricted discrete latent variable models widely used in the social and biomedical sciences. In many applications, hierarchical constraints are imposed on the allowable patterns of the latent attributes; for instance, certain lower-level attributes may be prerequisites for higher-level attributes. This paper considers the problem of learning such latent hierarchical structures from noisy observations under minimal model assumptions. A regularized latent attribute model is proposed, and an EM-type algorithm is developed for efficient and scalable computation. We further show that the proposed approach enjoys desirable theoretical properties. The good performance of the proposed methodology is illustrated through extensive simulation studies and an application to a real dataset in educational assessment.
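
To make the prerequisite constraint concrete, below is a minimal sketch in Python (not the authors' implementation; the four-attribute hierarchy and the helper function are hypothetical) showing how a prerequisite relation shrinks the set of allowable binary attribute patterns:

    from itertools import product

    # prereq[j] lists the direct prerequisites of attribute j
    # (hypothetical 4-attribute hierarchy: A1 -> A2 -> A3, A1 -> A4)
    prereq = {0: [], 1: [0], 2: [1], 3: [0]}
    K = len(prereq)

    def respects_hierarchy(pattern):
        # a pattern is allowable iff every mastered attribute
        # has all of its prerequisites mastered as well
        return all(pattern[p] == 1
                   for j in range(K) if pattern[j] == 1
                   for p in prereq[j])

    allowable = [p for p in product([0, 1], repeat=K)
                 if respects_hierarchy(p)]
    print(f"{len(allowable)} of {2**K} patterns are allowable")  # 7 of 16

Under this hypothetical hierarchy, only 7 of the 16 binary patterns are allowable; a restriction of this kind is what the proposed regularized model aims to learn from data rather than assume in advance.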


Authors who are presenting talks have a * after their name.
