Author(s): David Tufte
Correct lag length selection is a common problem in applying time series analysis to economics and finance. Misspecification of the true trend process is another problem that can arise in any work with time series data, since the alternative trend processes are nearly observationally equivalent in finite samples. This study compares the performance of lag selection methodologies under both correct and mildly incorrect specifications of a univariate stochastic trend. The situation is exceptionally common in applied work: lag length must be selected, for example, before a unit root test can be run or any conclusion drawn from it, and most applied researchers rely on a selection criterion to justify their choice. In our simulations, lag selection criteria perform best when a correct unit root restriction is not imposed, and indeed when the specification is mildly incorrect.
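As an illustration of the kind of selection step the abstract refers to (not the paper's own simulation design), the sketch below simulates an AR(2) series and picks a lag length by minimizing the Akaike Information Criterion over candidate AR(p) fits estimated by OLS; this is the sort of data-driven choice an applied researcher makes before running a unit root test.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process: y_t = 0.5*y_{t-1} + 0.3*y_{t-2} + e_t
# (illustrative parameters, not taken from the study)
n = 500
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + rng.standard_normal()

def aic_for_lag(y, p):
    """Fit AR(p) with an intercept by OLS and return the AIC."""
    T = len(y) - p
    # Regressors: y_{t-1}, ..., y_{t-p} for t = p, ..., n-1
    lags = np.column_stack([y[p - k - 1 : len(y) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(T), lags])
    target = y[p:]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    sigma2 = resid @ resid / T
    # AIC = T*log(residual variance) + 2*(number of parameters)
    return T * np.log(sigma2) + 2 * (p + 1)

max_lag = 6
aics = {p: aic_for_lag(y, p) for p in range(1, max_lag + 1)}
best_lag = min(aics, key=aics.get)
print("AIC-selected lag length:", best_lag)
```

Other common criteria (e.g. the Schwarz/Bayesian criterion) differ only in the penalty term; the selection logic is the same.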