Abstract
The use of batch means is a well-known technique for estimating the variance of point estimators computed from simulation experiments. The batch means variance estimator is simply the (appropriately scaled) sample variance of the estimator computed on subsets of consecutive observations. For the method to be practical, a good choice of batch length is necessary. We propose a method for estimating the optimal batch length using only the observed data. In contrast to approaches that model the unknown underlying dependence structure in terms of a few unknown parameters (e.g., an autoregression), the method is completely model-free. The proposed algorithm uses previous asymptotic results on the order of the optimal batch length as a function of the simulation length to calibrate an empirical estimate of the batch length obtained at a shorter simulation length. We describe the algorithm, present three numerical studies that demonstrate its efficacy, and discuss large-sample consistency.
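To make the two ingredients concrete, below is a minimal Python sketch of the standard nonoverlapping batch means variance estimator and of the kind of batch-length extrapolation the abstract alludes to. The function names, the pilot-run scheme, and the n^(1/3) growth rate (the classical rate for the MSE-optimal batch length of the sample mean) are illustrative assumptions, not the authors' calibrated procedure.

```python
import numpy as np

def batch_means_variance(x, b):
    """Batch means estimate of Var(mean(x)) for batch length b.

    A sketch of the standard nonoverlapping batch means estimator:
    the scaled sample variance of the means of consecutive batches.
    """
    x = np.asarray(x, dtype=float)
    m = len(x) // b                 # number of complete batches
    if m < 2:
        raise ValueError("need at least two complete batches")
    # Means of consecutive, nonoverlapping batches of length b.
    batch_means = x[:m * b].reshape(m, b).mean(axis=1)
    # Sample variance of the batch means, scaled down by m,
    # estimates the variance of the overall sample mean.
    return batch_means.var(ddof=1) / m

def extrapolate_batch_length(b_pilot, n_pilot, n, rate=1/3):
    """Scale a batch length estimated from a short pilot run up to
    the full simulation length, assuming the optimal batch length
    grows like O(n^rate); rate = 1/3 is the classical MSE-optimal
    order for the sample mean. Illustrative only.
    """
    return max(1, round(b_pilot * (n / n_pilot) ** rate))
```

For example, a batch length of 25 found empirically on a pilot run of 10,000 observations would be scaled to roughly 25 * 10^(1/3) ≈ 54 for a full run of 100,000 observations under the assumed n^(1/3) rate.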
