
Abstract Details

Activity Number: 40 - Statistical Learning: Theory and Methods
Type: Contributed
Date/Time: Sunday, July 30, 2017 : 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #322439
Title: A Smoothed Monotonic Regression via L2 Regularization
Author(s): Oleg Sysoev* and Oleg Burdakov
Companies: Linköping University and Linköping University
Keywords: monotonic regression ; regularization ; kernel methods ; smoothing ; big data
Abstract:

Monotonic Regression (MR) is a standard method for extracting a monotonic function from non-monotonic data, and it is used in many applications. However, a known drawback of this method is that its fitted response is a piecewise constant function, while practical response functions are often required to be continuous. We propose a method that achieves both monotonicity and smoothness of the regression by introducing an L2 regularization term, and we show that the complexity of this method is O(n²). In addition, our simulations demonstrate that the proposed method has higher predictive power than many existing methods when large data sets are involved and the expected response is a complex function (which is typical in data science), or when there is a change point in the data. Our approach is probabilistically motivated and has connections to Bayesian modeling.
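The idea in the abstract can be sketched in a few lines. The following is a minimal illustration (not the authors' algorithm, and not their O(n²) solver): it parametrizes a non-decreasing fit as an intercept plus non-negative increments, f_i = b + Σ_{j≤i} d_j with d_j ≥ 0, and adds an L2 penalty λ·Σ d_j² on the increments so the fit is smoothed rather than piecewise constant. The function name `smoothed_monotonic_fit` and the parameter `lam` are illustrative choices, and the problem is solved here with a generic bounded least-squares routine.

```python
import numpy as np
from scipy.optimize import lsq_linear

def smoothed_monotonic_fit(y, lam=1.0):
    """Sketch of a smoothed monotone (non-decreasing) least-squares fit.

    Solves  min_x ||y - A x||^2 + lam * sum_j d_j^2
    where x = [b, d_1, ..., d_{n-1}] and d_j >= 0, so that
    f = A x is non-decreasing by construction.
    """
    n = len(y)
    # Lower-triangular ones: f_i = b + d_1 + ... + d_i  (column 0 = intercept b)
    A = np.tril(np.ones((n, n)))
    # Penalty rows select only the increments d_j (not the intercept)
    P = np.hstack([np.zeros((n - 1, 1)), np.eye(n - 1)])
    A_aug = np.vstack([A, np.sqrt(lam) * P])
    b_aug = np.concatenate([y, np.zeros(n - 1)])
    # Intercept is free; increments must be non-negative (monotonicity)
    lb = np.concatenate([[-np.inf], np.zeros(n - 1)])
    ub = np.full(n, np.inf)
    res = lsq_linear(A_aug, b_aug, bounds=(lb, ub))
    return A @ res.x  # the smoothed, monotone fitted response
```

Larger `lam` shrinks the increments toward zero, trading fidelity for smoothness; `lam = 0` recovers an ordinary (unsmoothed) monotone least-squares fit.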


Authors who are presenting talks have a * after their name.

Back to the full JSM 2017 program

Copyright © American Statistical Association