

Abstract Details

Activity Number: 215 - Contributed Poster Presentations: Section on Statistical Learning and Data Science
Type: Contributed
Date/Time: Tuesday, August 4, 2020, 10:00 AM to 2:00 PM (EDT)
Sponsor: Section on Statistical Learning and Data Science
Abstract #313555
Title: Adjusting Factor Models for Concomitant Variables by Adversarial Learning
Author(s): Austin Talbot* and David Carlson and David Dunson
Companies: Duke University
Keywords: Fair machine learning; Factor analysis; Domain adaptation; Adversarial learning
Abstract:

Using factor models for dimensionality reduction is common when analyzing high-dimensional data. Unfortunately, data often come with concomitant variables that can dominate the estimated latent representation, as in fair machine learning and domain adaptation. We modify the objective function of dimensionality reduction methods to penalize the predictability of the concomitant variables. This yields a minimax formulation that finds a latent representation which simultaneously encodes the primary data while remaining uninformative about the concomitant variables. We present three different minimax or adversarial solutions to this type of objective function, highlighting key differences between the formulations. Remarkably, using a PCA-like objective yields an analytic solution computed by eigendecompositions on an augmented space. For general factor models, we show how neural networks can be used to efficiently approximate the objectives. We apply these techniques to both synthetic and real datasets, including electrophysiological recordings and survey data, to demonstrate that the estimated factors can yield better representations for common objectives.
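For intuition, here is a minimal NumPy sketch of one way such a PCA-like adjusted objective could look: maximize the variance of X retained by the loadings while subtracting, with weight gamma, the variance of the latent scores that is linearly predictable from the concomitant variables C. The specific objective, the gamma penalty weight, and the adversarial_pca helper are illustrative assumptions, not the paper's formulation (which works via eigendecompositions on an augmented space); here the adjustment enters the scatter matrix directly, but the solution is likewise a single analytic eigendecomposition.

    import numpy as np

    def adversarial_pca(X, C, k, gamma=2.0):
        """Toy 'adjusted PCA' (hypothetical objective): find k loading
        directions that capture variance in X while penalizing variance
        that is linearly predictable from the concomitant variables C.

        X     : (n, p) centered primary data
        C     : (n, q) centered concomitant variables
        gamma : assumed penalty weight; larger values repel the loadings
                from directions whose scores align with col(C)
        """
        n = X.shape[0]
        # Projection onto col(C): the linearly C-predictable part of any score
        P_C = C @ np.linalg.pinv(C.T @ C) @ C.T
        # Penalized scatter matrix: total variance minus gamma times the
        # variance of the scores explained by the concomitant variables
        M = X.T @ (np.eye(n) - gamma * P_C) @ X
        # Analytic solution: top-k eigenvectors of the symmetric matrix M
        _, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
        return eigvecs[:, ::-1][:, :k]   # top-k loadings

    rng = np.random.default_rng(0)
    C = rng.normal(size=(200, 2))                     # concomitant variables
    X = C @ rng.normal(size=(2, 10)) + rng.normal(size=(200, 10))
    X, C = X - X.mean(0), C - C.mean(0)
    W = adversarial_pca(X, C, k=3)
    Z = X @ W                                         # adjusted latent scores
    # Cross-correlation between scores and C shrinks as gamma grows
    print(np.abs(np.corrcoef(Z.T, C.T)[:3, 3:]).max())

With gamma = 0 this reduces to ordinary PCA; gamma > 1 makes directions aligned with col(C) actively costly, which is the minimax trade-off the abstract describes in its simplest linear form.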


Authors who are presenting talks have a * after their name.
