Abstract:
|
In our work, we propose a bootstrap procedure for classification problems that accounts for the possible presence of irrelevant feature variables. Each feature variable is assigned a weight that may reflect its relative importance. In the bagging stage, not only are the observations resampled, but the feature variables are also bootstrapped according to their assigned weights. Meanwhile, a bagged version of the likelihood function is constructed as the criterion for variable selection. We study the asymptotic properties of this procedure using logistic regression as the classifier. The procedure can be further extended to unsupervised learning problems and even to sequential settings.
|