Shrinkage of Variance for Minimum Distance Based Tests
This paper promotes information-theoretic inference in the context of minimum distance estimation. Various score test statistics differ only through the embedded estimator of the variance of the estimating functions. We resort to implied probabilities, obtained from the constrained maximization of generalized entropy, to get a more accurate variance estimator under the null. We document, both by theoretical higher-order expansions and by Monte Carlo evidence, that our improved score tests have better finite-sample size properties. The competitiveness of our non-simulation-based method with respect to the bootstrap is confirmed in the example of inference on covariance structures previously studied by Horowitz (1998).