JSM 2012 Online Program

The views expressed here are those of the individual authors and not necessarily those of the JSM sponsors, their officers, or their staff.

Abstract Details

Activity Number: 458
Type: Topic Contributed
Date/Time: Wednesday, August 1, 2012 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Computing
Abstract - #304927
Title: Expectation Maximization for Distributed Computing
Author(s): Glen DePalma*+ and Sanvesh Srivastava and Chuanhai Liu
Companies: Purdue University (all authors)
Address: Department of Statistics, West Lafayette, IN, 47907, United States
Keywords: EM ; Distributed Computing ; Parallel Computing ; Computational ; R

The Expectation-Maximization (EM) algorithm (Dempster, Laird, and Rubin, 1977) remains a hallmark achievement in the history of statistics and optimization. Most research on implementing the EM algorithm addresses the fundamental issue of efficiency by speeding up its rate of convergence. Despite widespread attention and substantial effort over the past 30+ years, however, there has been no systematic attempt to take advantage of distributed computing. This gap stems from the inherently iterative nature of the algorithm, which makes EM difficult to parallelize. We extend the basic EM algorithm to a parallel framework that exploits cluster-computing and multiprocessing environments. Furthermore, we provide an R (R Development Core Team, 2012) implementation, based on the developmental DISC package, that makes the parallel-EM framework easy to use.
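The abstract does not give details of the DISC package's parallel-EM framework, but the usual way to distribute EM is to note that the E-step decomposes over independent observations: each worker computes sufficient statistics on its local data shard, and only those few numbers are combined for a central M-step. The sketch below illustrates this map/reduce structure for a two-component 1-D Gaussian mixture; it is an illustrative assumption on our part, not the authors' implementation, and the chunked loop stands in for calls that would be dispatched to workers on a real cluster.

```python
import math
import random


def norm_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)


def estep_chunk(chunk, pi, mu1, mu2, v1, v2):
    """E-step on one data chunk: return that chunk's sufficient statistics.

    In a distributed setting each worker would run this on its local shard
    and ship back only these seven numbers.
    """
    n1 = sx1 = sxx1 = n2 = sx2 = sxx2 = ll = 0.0
    for x in chunk:
        p1 = pi * norm_pdf(x, mu1, v1)
        p2 = (1 - pi) * norm_pdf(x, mu2, v2)
        r = p1 / (p1 + p2)  # responsibility of component 1
        ll += math.log(p1 + p2)
        n1 += r
        sx1 += r * x
        sxx1 += r * x * x
        n2 += 1 - r
        sx2 += (1 - r) * x
        sxx2 += (1 - r) * x * x
    return (n1, sx1, sxx1, n2, sx2, sxx2, ll)


def parallel_em(data, n_chunks=4, iters=50):
    """EM for a two-component 1-D Gaussian mixture with a chunked E-step."""
    chunks = [data[i::n_chunks] for i in range(n_chunks)]
    m = sum(data) / len(data)
    v = sum((x - m) ** 2 for x in data) / len(data)
    # Crude initialization: extremes for the means, pooled variance for both.
    pi, mu1, mu2, v1, v2 = 0.5, min(data), max(data), v, v
    for _ in range(iters):
        # "Map": the per-chunk E-steps are independent, so this list
        # comprehension could be replaced by multiprocessing.Pool.map
        # or dispatched to cluster workers.
        stats = [estep_chunk(c, pi, mu1, mu2, v1, v2) for c in chunks]
        # "Reduce": sum the per-chunk sufficient statistics.
        n1, sx1, sxx1, n2, sx2, sxx2, ll = (
            sum(s[i] for s in stats) for i in range(7))
        # M-step on the pooled statistics (cheap, done centrally).
        pi = n1 / (n1 + n2)
        mu1, mu2 = sx1 / n1, sx2 / n2
        v1 = max(sxx1 / n1 - mu1 * mu1, 1e-6)
        v2 = max(sxx2 / n2 - mu2 * mu2, 1e-6)
    return pi, mu1, mu2, v1, v2
```

Because each iteration exchanges only a fixed-size vector of sufficient statistics, communication cost is independent of the data size, which is what makes this decomposition attractive on clusters; the iterative dependence between E- and M-steps, noted in the abstract, is what prevents parallelizing *across* iterations.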

The address information is for the authors who have a + after their name.
Authors who are presenting talks have a * after their name.

For information, contact jsm@amstat.org or phone (888) 231-3473.