Abstract:
|
We study minimax convergence rates of nonparametric density estimation under a large class of loss functions called adversarial losses, which, besides classical Lp losses, include maximum mean discrepancy (MMD), Wasserstein distance, total variation distance, and Kolmogorov-Smirnov distance. In a general framework, we study how the choice of loss and the assumed smoothness of the underlying density together determine the minimax rate. Adversarial losses are also closely related to the losses encoded by discriminator networks in generative adversarial networks (GANs), and we discuss implications for training and evaluating GANs.
|