
Abstract Details

Activity Number: 214606 - Improve Your Regression with Modern Regression Analysis Techniques: Linear, Logistic, Nonlinear, Regularized, GPS, LARS, LASSO, Elastic Net, MARS, TreeNet Gradient Boosting, Random Forests (ADDED FEE)
Type: Professional Development
Date/Time: Wednesday, August 2, 2017 : 10:00 AM to 11:45 AM
Sponsor: ASA
Abstract #325500
Title: Improve Your Regression with Modern Regression Analysis Techniques: Linear, Logistic, Nonlinear, Regularized, GPS, LARS, LASSO, Elastic Net, MARS, TreeNet Gradient Boosting, Random Forests (ADDED FEE)
Author(s): Mikhail Golovnya*, Charles Harrison*, and Dan Steinberg*
Companies: Salford Systems
Keywords:
Abstract:

Linear regression plays a large part in the everyday life of a data analyst, but the results aren't always satisfactory. What if you could drastically improve prediction accuracy with a model that handles missing values, interactions, and nonlinearities in your data? Instead of settling for a mediocre analysis, join us for this presentation, which will show how Modern Regression Analysis Techniques can take your regression model to the next level and expertly handle your modeling woes. Using real-world data sets, we will demonstrate advances in nonlinear, regularized linear, and logistic regression. This workshop will introduce the main concepts behind Leo Breiman's Random Forests and Jerome Friedman's GPS (Generalized Path Seeker), MARS (Multivariate Adaptive Regression Splines), and Gradient Boosting. With these state-of-the-art techniques, you'll boost model performance without stumbling over confusing coefficients or problematic p-values. Prerequisites: basic knowledge of classical linear regression and logistic regression is recommended.
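
For readers who want a quick preview of the kind of comparison described above, here is a minimal sketch in Python using scikit-learn. It is illustrative only: the workshop itself uses Salford Systems' tools, and the synthetic data, model settings, and class names below are assumptions for the sketch, not course material.

# Illustrative sketch: compare ordinary least squares against regularized and
# ensemble regressors (the families of methods named in the abstract) using
# scikit-learn on synthetic data. Not the Salford Systems SPM software.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression, LassoCV, ElasticNetCV
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_squared_error

# Synthetic data standing in for a real-world data set
X, y = make_regression(n_samples=500, n_features=20, n_informative=8,
                       noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "ordinary least squares": LinearRegression(),
    "LASSO (cross-validated)": LassoCV(cv=5),
    "Elastic Net (cross-validated)": ElasticNetCV(cv=5),
    "Random Forest": RandomForestRegressor(n_estimators=300, random_state=0),
    "Gradient Boosting": GradientBoostingRegressor(random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:30s} test MSE: {mse:.1f}")

On data with strong nonlinearities or interactions, the ensemble models typically outperform plain least squares, which is the point the abstract is making; on purely linear, noisy data the regularized linear models may win instead.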


Authors who are presenting talks have a * after their name.

