Abstract Details

Activity Number: 190
Type: Contributed
Date/Time: Monday, August 5, 2013, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract: #310199
Title: Variable Length Markov Chains for Sequential Prediction in Dependent Time Series
Author(s): Abraham J Wyner and Joshua Magarick*+
Companies: The Wharton School, University of Pennsylvania and University of Pennsylvania
Keywords: Variable Length Markov Chains; Dimension Reduction; Sequential Prediction; Machine Learning; Time Series; Classification

Abstract:
Variable Length Markov Chains (VLMCs) have proved useful as parsimonious models of discrete sequential data in which the next-state distribution can depend on a large number of lagged values. Despite their name, however, VLMCs lack the Markov property, so they cannot replace higher-order Markov models when that property is required, for example when representing the hidden states of a Hidden Markov Model. In this paper, we develop an extension of VLMCs that imbues them with the Markov property while retaining their parsimony. This permits their use when high-dimensional models are computationally infeasible and allows us to represent a discrete time series as a sequence of variable-order states that can be viewed as arising from a first-order Markov chain. We then demonstrate their use in modeling the sleep states of mice, combining them with non-sequential prediction methods to augment predictions from the observed covariates alone.
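For readers unfamiliar with the model class the abstract builds on: a VLMC predicts the next symbol from the longest relevant suffix of the history, falling back to shorter contexts when a long one has too little support. The sketch below is only an illustration of that idea, not the authors' method; it uses an ad-hoc minimum-count pruning rule in place of a principled context-tree selection criterion, and the function names are hypothetical.

```python
from collections import Counter, defaultdict

def fit_vlmc(seq, max_depth=4, min_count=2):
    """Count next-symbol frequencies for every context (suffix) up to
    max_depth. Contexts seen fewer than min_count times are dropped:
    this crude pruning gives the 'variable length' behaviour, since
    rare long contexts fall back to their shorter suffixes."""
    counts = defaultdict(Counter)
    for i in range(len(seq)):
        for d in range(max_depth + 1):
            if d > i:
                break
            ctx = tuple(seq[i - d:i])  # the d symbols preceding seq[i]
            counts[ctx][seq[i]] += 1
    # keep the empty context unconditionally so prediction never fails
    return {c: nxt for c, nxt in counts.items()
            if c == () or sum(nxt.values()) >= min_count}

def predict(model, history, max_depth=4):
    """Predict the next symbol from the longest retained suffix of history."""
    for d in range(min(max_depth, len(history)), -1, -1):
        ctx = tuple(history[len(history) - d:])
        if ctx in model:
            dist = model[ctx]
            return max(dist, key=dist.get)
    raise ValueError("model has no usable context")

model = fit_vlmc(list("abababababba"), max_depth=3)
print(predict(model, list("ab")))  # most frequent successor of context 'ab'
```

The longest-suffix lookup in `predict` is what makes the order of the chain variable: the effective order at each step is the depth of the deepest retained context, not a fixed lag.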
Authors who are presenting talks have a * after their name.
Copyright © American Statistical Association.