Abstract Details

Activity Number: 253 - Contributed Poster Presentations: Section on Statistical Computing
Type: Contributed
Date/Time: Monday, July 30, 2018 : 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing
Abstract #329532
Title: A "Divide and Conquer" Approach to Estimating the Optimal Penalty Parameter in a Functional Tikhonov-Regularized Regression Model via Leave-One-Out Cross-Validation
Author(s): Yichuan Wang* and Wolfgang Polonik and Alexander Aue
Companies: University of California, Davis and University of California, Davis and University of California, Davis
Keywords: Functional regression; Big data; Tikhonov regularization; Divide and conquer

Many methods of functional regression, especially for the model in which both predictor and response are functions, have resorted to a multivariate approach via dimension reduction. A functional Tikhonov-regularized regression algorithm has been proposed that avoids the need for dimension reduction. However, the search for the optimal regularization parameter via leave-one-out cross-validation is highly inefficient for large datasets. We present a "divide and conquer" approach that obtains a weighted average of the optimal Tikhonov regularization parameters estimated on multiple subsets. The weights are derived from the leave-one-out cross-validation criterion, with undesirable parameter estimates removed. This greatly reduces the computational complexity while achieving similar predictive performance. Additionally, we fit a regression model relating the regularization parameter to the sample size and extrapolate the optimal parameter, exploiting the asymptotic properties of the mean squared error. The latter method proves comparable in performance and efficiency while providing more reliable results. A simulation study demonstrates the properties of both methods.
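The divide-and-conquer step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses ordinary ridge (Tikhonov) regression with scalar data rather than the functional model, the closed-form hat-matrix shortcut for leave-one-out cross-validation, and a grid-boundary check as a stand-in for the paper's removal of "undesirable" parameter estimates. All function names and the inverse-LOOCV-error weighting are illustrative assumptions.

```python
import numpy as np

def loocv_score(X, y, lam):
    """Closed-form leave-one-out CV error for ridge (Tikhonov) regression.

    Uses the hat-matrix identity e_i / (1 - H_ii), avoiding n refits.
    """
    n, p = X.shape
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

def subset_optimal_lambda(X, y, grid):
    """Grid-search the LOOCV-optimal penalty on a single subset."""
    scores = np.array([loocv_score(X, y, lam) for lam in grid])
    best = scores.argmin()
    return grid[best], scores[best]

def divide_and_conquer_lambda(X, y, n_subsets, grid, seed=None):
    """Weighted average of per-subset optimal penalties.

    Weights are inverse LOOCV errors; optima that land on the grid
    boundary are treated as unreliable and dropped when possible
    (an assumed proxy for removing 'undesirable' estimates).
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    lams, errs = [], []
    for part in np.array_split(idx, n_subsets):
        lam, err = subset_optimal_lambda(X[part], y[part], grid)
        lams.append(lam)
        errs.append(err)
    interior = [(l, e) for l, e in zip(lams, errs)
                if grid[0] < l < grid[-1]]
    if interior:  # keep only non-boundary estimates when any exist
        lams, errs = zip(*interior)
    weights = 1.0 / np.asarray(errs)  # better subsets count more
    return float(np.average(lams, weights=weights))

# Usage on synthetic data: each subset is cross-validated independently,
# so the n refits of full-data LOOCV shrink to n_subsets grid searches.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 5))
beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta + rng.standard_normal(300)
grid = np.logspace(-3, 2, 20)
lam_hat = divide_and_conquer_lambda(X, y, n_subsets=5, grid=grid, seed=1)
```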

Authors who are presenting talks have a * after their name.

Back to the full JSM 2018 program