In this paper, we propose a new empirical Bayesian inference and variable selection method for high-dimensional linear models. The chief novelty is our use of a correlation-adaptive prior that exploits information in the observed predictor variable matrix to adaptively address high collinearity, determining whether parameters associated with correlated predictors should be shrunk together or kept apart. We also investigate asymptotic posterior convergence rate properties of our method. A simplified version of the shotgun stochastic search algorithm is employed to implement the variable selection procedure, and the performance of our method is assessed against existing methods, such as the lasso, via simulation studies across different experimental settings and a real-data problem. In both real- and simulated-data examples, our method demonstrates a significant advantage over the competing methods.