Abstract Details
Activity Number: 76
Type: Contributed
Date/Time: Sunday, August 3, 2014, 4:00 PM to 5:50 PM
Sponsor: Section on Nonparametric Statistics
Abstract #312698
Title: Kernel-based Kullback-Leibler Divergence on Nonparametric Density Alternatives
Author(s): Han Yu*+
Companies:
Keywords: Goodness-of-Fit; density estimator; Kullback-Leibler; kernel smoothing
Abstract: A kernel-based Kullback-Leibler divergence is proposed. The proposed divergence is used to construct tests against nonparametric density alternatives; the tests are developed to be asymptotically distribution-free. The procedure can be viewed as a nonparametric extension of the traditional parametric likelihood ratio test. Simulations illustrate the small-sample performance of the proposed tests.
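A plug-in version of such a statistic might be sketched as follows: estimate the density with a kernel smoother and numerically integrate the KL divergence between the estimate and the hypothesized null density. The function names, the Silverman rule-of-thumb bandwidth, and the trapezoidal integration below are all my assumptions for illustration, not the authors' exact construction.

```python
import numpy as np

def gaussian_kde(sample, grid, bandwidth):
    """Gaussian kernel density estimate of `sample`, evaluated on `grid`."""
    z = (grid[:, None] - sample[None, :]) / bandwidth  # shape (grid, n)
    return np.exp(-0.5 * z**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

def kl_statistic(sample, null_pdf, grid_size=512):
    """Plug-in KL divergence KL(f_hat || f0) between a KDE and a null density.

    A hypothetical goodness-of-fit statistic: large values suggest the data
    were not drawn from the null density `null_pdf`.
    """
    sample = np.asarray(sample, dtype=float)
    n = sample.size
    # Silverman's rule-of-thumb bandwidth (an assumed choice, not the paper's)
    h = 1.06 * sample.std(ddof=1) * n ** (-0.2)
    grid = np.linspace(sample.min() - 3 * h, sample.max() + 3 * h, grid_size)
    f_hat = gaussian_kde(sample, grid, h)
    f0 = null_pdf(grid)
    integrand = f_hat * np.log(f_hat / f0)
    # trapezoidal integration of f_hat * log(f_hat / f0) over the grid
    return np.sum((integrand[1:] + integrand[:-1]) / 2 * np.diff(grid))

# Toy check: N(0,1) data against a standard-normal null should give a
# statistic near zero, since the KDE is close to the null density.
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
phi = lambda t: np.exp(-0.5 * t**2) / np.sqrt(2 * np.pi)
stat = kl_statistic(x, phi)
```

In practice the null distribution of such a statistic would be obtained from the asymptotic theory or by simulation; the snippet only computes the divergence itself.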
Authors who are presenting talks have a * after their name.
Back to the full JSM 2014 program