Abstract
In this article, we consider the unit root test based on the ordinary least squares (OLS) estimator of the coefficient on a lagged dependent variable when the error terms are serially correlated. Using Imhof's (1961) method, we show how to numerically evaluate the exact distribution function of the unit root test statistic under serially correlated errors. Our numerical results show that when the error terms are serially correlated, the size distortion is not small even when the sample size is considerably large. Based on this distribution function, we also numerically evaluate exact critical values for small and moderate sample sizes.
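For orientation, Imhof's (1961) method computes the distribution function of a quadratic form in normal variables by numerically inverting its characteristic function; since the OLS estimator of the lagged-coefficient is a ratio of quadratic forms in the errors, an event such as {statistic ≤ c} reduces to {ε'(A − cB)ε ≤ 0}, to which the inversion applies. The following is a minimal Python sketch of that inversion for the central case, not code from the article; the function name imhof_cdf and its arguments lam (the eigenvalues λ_r) and h (the multiplicities) are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import quad

def imhof_cdf(x, lam, h=None):
    """P(Q <= x) for Q = sum_r lam[r] * chi2(h[r]) (central case),
    via Imhof's (1961) inversion formula:
        P(Q <= x) = 1/2 - (1/pi) * int_0^inf sin(theta(u)) / (u * rho(u)) du,
    with theta(u) = (1/2) sum_r h_r * arctan(lam_r u) - x u / 2
    and  rho(u)   = prod_r (1 + lam_r^2 u^2)^(h_r / 4)."""
    lam = np.asarray(lam, dtype=float)
    h = np.ones_like(lam) if h is None else np.asarray(h, dtype=float)

    def integrand(u):
        # The u -> 0 limit is finite; quad evaluates only interior nodes.
        theta = 0.5 * np.sum(h * np.arctan(lam * u)) - 0.5 * x * u
        rho = np.prod((1.0 + (lam * u) ** 2) ** (h / 4.0))
        return np.sin(theta) / (u * rho)

    val, _ = quad(integrand, 0.0, np.inf, limit=200)
    return 0.5 - val / np.pi

# Sanity check: three unit eigenvalues give Q ~ chi2(3), so
# imhof_cdf(2.0, [1.0, 1.0, 1.0]) should be close to 0.4276.
print(imhof_cdf(2.0, [1.0, 1.0, 1.0]))
```

In the unit root setting sketched above, one would take lam to be the eigenvalues of Σ^{1/2}(A − cB)Σ^{1/2}, where Σ is the covariance matrix of the serially correlated errors, and evaluate imhof_cdf(0.0, lam) to obtain P(statistic ≤ c); exact critical values then follow by solving this probability for the desired size.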