The estimation of the auto- and cross-covariance matrices of a stationary random process plays a central role in prediction theory and time series analysis. In the univariate framework, we proposed an estimator based on regularizing the sample partial autocorrelation function via a modified Durbin-Levinson algorithm: it takes the banded and tapered sample partial autocorrelations as input and returns a consistent and positive definite estimator of the autocovariance matrix. We discuss multivariate generalizations based on a regularized Whittle algorithm that, on the one hand, shrinks the lag structure towards a finite-order vector autoregressive system (by penalizing the partial canonical correlations) and, on the other, shrinks the cross-sectional covariance towards a diagonal target. As the shrinkage intensity increases, the multivariate system converges to a set of unrelated univariate processes. We illustrate the merits of the proposal for out-of-sample prediction and for the estimation of the spectral density of high-dimensional time series.