
Ledoit and Wolf ("A Well-Conditioned Estimator for Large-Dimensional Covariance Matrices", 2004) proposed an estimator for the covariance matrix of a data set, $S^* = p I_d + (1 - p) \hat{S}$ with $p \in (0,1)$ (they gave a specific $p$ that I won't repeat here), $I_d$ the identity matrix, and $\hat{S}$ the sample covariance matrix. I believe this is a fine estimator for serially uncorrelated data, but I want to use it on data that may be serially correlated. I like this particular approach because once $\hat{S}$ is in hand the adjustment is easy and cheap to compute, and for my purposes speed matters.
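To make the estimator concrete, here is a minimal sketch of the shrinkage step with an arbitrary intensity $p$; the function name is mine, and the data-driven optimal $p$ from the paper is deliberately not reproduced:

```python
import numpy as np

def shrink_covariance(X, p):
    """Shrink the sample covariance of X (n observations x d variables)
    toward the identity: S* = p*I + (1-p)*S_hat.

    p is an arbitrary shrinkage intensity in (0, 1); Ledoit-Wolf (2004)
    give a specific data-driven choice that is not implemented here.
    """
    S_hat = np.cov(X, rowvar=False)          # sample covariance matrix
    d = S_hat.shape[0]
    return p * np.eye(d) + (1.0 - p) * S_hat
```

Note that since $\hat{S}$ is positive semidefinite, every eigenvalue of $S^*$ is at least $p > 0$, which is what makes the estimator well-conditioned.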

I want an estimator of this type with $\hat{S}$ replaced by a kernel-based covariance matrix estimator, in particular the HAC estimators discussed in the classic Andrews (1991) and Newey-West (1994) papers. Of course, I could simply take a HAC estimator and repeat Ledoit and Wolf's procedure for choosing $p$, but I would like theoretical justification for doing so. I searched the Web of Science for papers citing both Ledoit-Wolf (2004) and either Andrews (1991) or Newey-West (1994); I found nine, and none of them appears to address what I want to do.
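For clarity, this is the ad-hoc combination I have in mind, sketched with a Bartlett-kernel (Newey-West) long-run covariance estimator and, again, an arbitrary $p$ and a fixed truncation lag; the automatic bandwidth rules of Andrews (1991) and Newey-West (1994) are not implemented, and the function names are mine:

```python
import numpy as np

def newey_west(X, lags):
    """Bartlett-kernel (Newey-West) long-run covariance of the rows of X.

    lags is a fixed truncation lag; the data-driven bandwidth selection
    of Andrews (1991) / Newey-West (1994) is omitted for brevity.
    """
    U = X - X.mean(axis=0)              # demeaned observations
    n = U.shape[0]
    omega = U.T @ U / n                 # Gamma_0: contemporaneous term
    for j in range(1, lags + 1):
        w = 1.0 - j / (lags + 1.0)      # Bartlett weight
        gamma_j = U[j:].T @ U[:-j] / n  # lag-j autocovariance
        omega += w * (gamma_j + gamma_j.T)
    return omega

def shrunk_hac(X, lags, p):
    """Shrinkage applied to a HAC estimate: p*I + (1-p)*Omega_hat."""
    omega = newey_west(X, lags)
    d = omega.shape[0]
    return p * np.eye(d) + (1.0 - p) * omega
```

Since the Bartlett kernel yields a positive semidefinite $\hat{\Omega}$, the shrunk version is again well-conditioned; what I lack is a justification for carrying over the Ledoit-Wolf choice of $p$.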

Before concluding that no reference justifies replacing $\hat{S}$ with a different estimator, I would like to know whether anyone here has a reference to suggest, or at least an argument for why this substitution is theoretically justified.