
Abstract Details

Activity Number: 436 - Modern Change Point Problems
Type: Invited
Date/Time: Wednesday, August 2, 2017, 8:30 AM to 10:20 AM
Sponsor: IMS
Abstract #322078
Title: Total Variation Classes Beyond 1d: Minimax Rates, and the Limitations of Linear Smoothers
Author(s): Veeranjaneyulu Sadhanala, Yu-Xiang Wang, and Ryan Tibshirani*
Companies: Carnegie Mellon University (all authors)
Keywords: Fused lasso; Total variation; Image denoising; Minimax rate; Minimax linear rate
Abstract:

We consider the problem of estimating a function defined over n locations on a d-dimensional grid (with all side lengths equal to n^{1/d}). When the function is constrained to have discrete total variation upper bounded by a sequence C_n, we derive the minimax optimal squared \ell_2 estimation error rate, parametrized by n and C_n. Total variation denoising, also known as the fused lasso, is seen to be rate optimal. Several simpler estimators exist, such as Laplacian smoothing and Laplacian eigenmaps. A natural question is: can these simpler estimators perform just as well? We prove that these estimators, and more broadly all estimators given by linear transformations of the input data, are suboptimal over the class of functions with bounded variation. This extends fundamental findings of Donoho and Johnstone [1998] on 1-dimensional total variation spaces to higher dimensions. The implication is that the computationally simpler methods cannot be used for such sophisticated denoising tasks without sacrificing statistical accuracy.
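
To make the contrast concrete, the following is a minimal sketch (not the authors' code) comparing a linear smoother with total variation denoising on a piecewise-constant 2d signal. The grid_laplacian helper, the noise level, and the tuning values lam and weight are illustrative assumptions; scikit-image's denoise_tv_chambolle, an isotropic TV solver, stands in here for a graph fused-lasso solver.

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla
    from skimage.restoration import denoise_tv_chambolle

    def grid_laplacian(m):
        """Graph Laplacian of an m x m grid with 4-neighbor edges (n = m^2 nodes)."""
        path = sp.diags([np.ones(m - 1), np.ones(m - 1)], [-1, 1])  # path-graph adjacency
        adj = sp.kron(sp.eye(m), path) + sp.kron(path, sp.eye(m))   # 2d grid adjacency
        deg = np.asarray(adj.sum(axis=1)).ravel()
        return sp.diags(deg) - adj

    m = 64                                  # grid side length, so n = m^2
    rng = np.random.default_rng(0)
    truth = np.zeros((m, m))
    truth[16:48, 16:48] = 1.0               # piecewise-constant truth: low total variation
    y = truth + 0.5 * rng.standard_normal((m, m))

    # Laplacian smoothing: theta_hat = (I + lam * L)^{-1} y, a fixed linear map of y.
    lam = 5.0
    L = grid_laplacian(m)
    theta_lin = spla.spsolve((sp.eye(m * m) + lam * L).tocsc(), y.ravel()).reshape(m, m)

    # Total variation denoising (2d analogue of the fused lasso), a nonlinear estimator;
    # weight=0.3 is an arbitrary illustrative tuning value.
    theta_tv = denoise_tv_chambolle(y, weight=0.3)

    for name, est in [("Laplacian smoothing", theta_lin), ("TV denoising", theta_tv)]:
        print(f"{name:20s} MSE = {np.mean((est - truth) ** 2):.4f}")

Note that Laplacian smoothing applies a fixed linear transformation to y, which is exactly the class of estimators the abstract shows to be suboptimal over total variation balls, while the TV estimator adapts to the locations of the jumps in the signal.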


Authors who are presenting talks have a * after their name.
