Hi,
I have a question about the uradf.src procedure, which performs Augmented Dickey-Fuller tests. I opened the procedure and noticed that when it selects the optimal lag length using BIC, it runs the regressions on the level of the time series rather than on the difference, i.e. y(t) = c + b*y(t-1) + a(1)*dy(t-1) + ... + a(maxlag)*dy(t-maxlag).
Is it more efficient to choose the optimal lag length by regressing on levels instead of on differences?
Part of the procedure is shown below:
* Search over candidate lag lengths: regress y(t) on y(t-1) and lagged differences
DO lagnum = 1,maxlag
   LINREG(cmom,noprint) series
   # series{1} delseries{1 to lagnum}
   * Information criteria computed from the residual sum of squares
   COMPUTE aic(lagnum) = log(%rss/%nobs) + 2.0*%nreg/%nobs
   COMPUTE bic(lagnum) = log(%rss/%nobs) + (1.0*%nreg/%nobs)*log(%nobs)
END DO
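What I expected instead was the same loop with the difference on the left-hand side, along the lines of the sketch below (it reuses the procedure's series and delseries names and assumes delseries, the first difference of series, is also included in the cross-moment matrix used by the cmom option):

* Same search, but with dy(t) rather than y(t) as the dependent variable
DO lagnum = 1,maxlag
   LINREG(cmom,noprint) delseries
   # series{1} delseries{1 to lagnum}
   COMPUTE aic(lagnum) = log(%rss/%nobs) + 2.0*%nreg/%nobs
   COMPUTE bic(lagnum) = log(%rss/%nobs) + (1.0*%nreg/%nobs)*log(%nobs)
END DO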
Re: ADF test
The two are identical. The only difference is the coefficient on the lagged y, which is 1.0 higher when you run the regression with levels on the left-hand side. Since y(t) = y(t-1) + dy(t) and y(t-1) is already one of the regressors, moving it across to the left leaves the residuals, and therefore %rss and the information criteria, unchanged.
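You can check this directly by running the regression both ways for a fixed lag length (four here, purely as an example). This is just a sketch: it drops the cmom option and adds an explicit constant, and assumes series and delseries are defined as in the procedure.

* Levels on the left-hand side
LINREG(noprint) series
# constant series{1} delseries{1 to 4}
DISPLAY "Levels LHS: coeff on y{1} =" %beta(2) " RSS =" %rss
* First difference on the left-hand side: same RSS (and hence same
* AIC/BIC), with the coefficient on series{1} exactly 1.0 lower
LINREG(noprint) delseries
# constant series{1} delseries{1 to 4}
DISPLAY "Diff LHS:   coeff on y{1} =" %beta(2) " RSS =" %rss

Because the right-hand sides are identical and the left-hand sides differ only by y(t-1), which is itself a regressor, the fitted residuals coincide, so any lag length chosen by AIC or BIC is the same either way.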