The views expressed here are those of the individual authors and not necessarily those of the JSM sponsors, their officers, or their staff.
Abstract Details
Activity Number: 63
Type: Topic Contributed
Date/Time: Sunday, July 31, 2011, 4:00 PM to 5:50 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract: #302962
Title: Reduced Rank Ridge Regression and Its Kernel Extension
Author(s): Ashin Mukherjee*+ and Ji Zhu
Companies: University of Michigan and University of Michigan
Keywords: statistical learning; Reduced Rank Regression; RKHS
Abstract:
In multivariate linear regression, the response matrix is often assumed to be intrinsically of lower rank. This could be due to the correlation structure among the predictor variables or the coefficient matrix being of lower rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with a reduced rank constraint on the coefficient matrix, which leads to a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. We also develop a novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) setting.
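The abstract does not give the algorithm itself, but one common way to combine a ridge penalty with a rank constraint is to compute the ridge solution and then project its fitted values onto the top singular directions. The sketch below illustrates that idea only; the function name and the projection-based construction are assumptions for illustration, not the authors' code.

```python
import numpy as np

def reduced_rank_ridge(X, Y, lam, rank):
    """Illustrative sketch (not the authors' implementation):
    fit multivariate ridge regression, then truncate the fitted
    values to a rank-`rank` subspace via SVD.

    X: (n, p) predictor matrix, Y: (n, q) response matrix,
    lam: ridge penalty, rank: target rank of the coefficient matrix.
    """
    n, p = X.shape
    # Ridge solution: B_ridge = (X'X + lam * I)^{-1} X'Y
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    # SVD of the ridge fitted values; keep the top right singular vectors
    Y_hat = X @ B_ridge
    _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)
    P = Vt[:rank].T          # (q, rank) orthonormal basis
    # Project the ridge coefficients onto the rank-constrained subspace
    return B_ridge @ P @ P.T
```

The returned coefficient matrix has rank at most `rank` by construction, since it is a projection of the ridge solution onto a `rank`-dimensional column space of right singular vectors.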
Address information is given for authors who have a + after their name.
Authors who are presenting talks have a * after their name.