Abstract:
|
When implementing the multitaper method for spectrum estimation of time series, choosing the bandwidth parameter appropriately is critical for resolving fine-scale details, especially when only a small sample is available. In this talk we give a method for determining the optimal bandwidth based on a mean squared error criterion, which has two components: within-band bias and variance. When the true spectrum admits a Taylor series expansion, the within-band bias can be expressed as a function of the curvature of the spectrum, which can be estimated using a simple spline approximation. The variance estimate is obtained by jackknifing over the individual spectrum estimates. We give a simple simulation to illustrate the method.
|
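
The jackknife variance component mentioned in the abstract can be sketched as a delete-one jackknife over the individual tapered spectrum estimates (eigenspectra). The sketch below is illustrative only, not the authors' implementation: the Slepian tapers come from `scipy.signal.windows.dpss`, and the time-bandwidth product `NW` and taper count `K` are assumed example values.

```python
import numpy as np
from scipy.signal.windows import dpss


def multitaper_jackknife(x, NW=4.0, K=7):
    """Multitaper spectrum estimate plus a delete-one jackknife
    variance of the log-spectrum.

    Illustrative sketch: NW and K are example choices, not values
    prescribed in the abstract.
    """
    N = len(x)
    tapers = dpss(N, NW, K)                              # (K, N) Slepian tapers
    eig = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2   # K eigenspectra
    S_hat = eig.mean(axis=0)                             # multitaper estimate

    # Delete-one jackknife: leave-one-out means over the K eigenspectra,
    # then the usual (K-1)/K scaled sum of squared deviations on the
    # log scale, where the spectrum variance is approximately stabilized.
    loo = (eig.sum(axis=0) - eig) / (K - 1)
    log_loo = np.log(loo)
    var_log = (K - 1) / K * ((log_loo - log_loo.mean(axis=0)) ** 2).sum(axis=0)
    return S_hat, var_log


# Example: white-noise input of length 512
rng = np.random.default_rng(0)
S_hat, var_log = multitaper_jackknife(rng.standard_normal(512))
```

The jackknife variance is computed on the log-spectrum because spectrum estimates are approximately multiplicative in their errors, so the log transform makes the variance roughly independent of the spectrum level.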