
All Times EDT

Abstract Details

Activity Number: 20 - Applications of Text Analysis
Type: Topic Contributed
Date/Time: Sunday, August 7, 2022 : 2:00 PM to 3:50 PM
Sponsor: Text Analysis Interest Group
Abstract #320877
Title: Representation Learning: A Causal Perspective
Author(s): Yixin Wang* and Michael Jordan
Companies: University of Michigan and UC Berkeley
Keywords: representation learning; causality

Representation learning constructs low-dimensional representations that summarize the essential features of high-dimensional data such as images and text. Ideally, such a representation captures the non-spurious features of the data efficiently. It should also be disentangled, so that each of its dimensions can be manipulated independently. However, these desiderata are often defined only intuitively and are challenging to quantify or enforce.
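As a minimal illustration of the "low-dimensional summary of high-dimensional data" idea (a toy linear sketch using PCA, not the authors' method), the following generates 50-dimensional data driven by 3 latent factors and recovers a 3-dimensional representation that captures nearly all of the variance:

```python
import numpy as np

# Toy sketch (assumption: PCA stands in for a generic representation learner).
rng = np.random.default_rng(0)

# 200 samples of 50-dimensional data generated from 3 latent factors,
# plus a small amount of noise.
latent = rng.normal(size=(200, 3))
mixing = rng.normal(size=(3, 50))
X = latent @ mixing + 0.01 * rng.normal(size=(200, 50))

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

Z = Xc @ Vt[:3].T                          # 3-dimensional representation
explained = (S[:3] ** 2).sum() / (S ** 2).sum()

print(Z.shape)           # (200, 3)
print(explained > 0.99)  # True: 3 dimensions summarize the data
```

The abstract's point is that such a summary alone is not enough: the principal components here are neither guaranteed to be non-spurious nor disentangled, which is what the counterfactual formalization below targets.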

In this work, we adopt a causal perspective on representation learning. We show how the desiderata of representation learning can be formalized using counterfactual notions, which in turn enables algorithms that target efficient, non-spurious, and disentangled representations of the data. We discuss the theoretical underpinnings of these algorithms and illustrate their empirical performance in both supervised and unsupervised representation learning.

This is joint work with Michael Jordan.


Authors who are presenting talks have a * after their name.

Back to the full JSM 2022 program