
Re: Recursive VECM - Johansen ML technique

Posted: Wed Mar 20, 2024 3:40 am
by ac_1
TomDoan wrote: Mon Mar 18, 2024 9:23 am
ac_1 wrote: Mon Mar 18, 2024 2:59 am (ii) To aid my understanding, given a VAR-in-levels with DET=NONE or DET=CONSTANT I can manually calculate the PI matrix

PI = - (I - GAMMA(1) - GAMMA(2) - ... - GAMMA(p))
where
I is the identity matrix
the GAMMA's are the VAR coefficient matrices on lags 1 through p.

How to 'by-hand' calculate PI including the deterministic terms: DET=RC, DET=TREND, and SEASONAL?
DET=RC and DET=RTREND are the only ones that are more complicated, and that is only because they anticipate a restriction on the intercept (or trend). The adjusted "PI" matrix obtained by rearranging the unrestricted VAR simply adds a column with the estimates on the constant (DET=RC) or the trend term (DET=RTREND).
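
So, if I understand correctly, the rearrangement is as follows (a sketch in standard notation; the A_i here are the levels-VAR coefficient matrices, i.e. the GAMMA's above, and DET=RTREND is analogous with a trend term in place of the 1):

Code: Select all

% Rearranging the levels VAR  y_t = A_1 y_{t-1} + ... + A_p y_{t-p} + \mu + \varepsilon_t :
\Delta y_t = \Pi y_{t-1} + \sum_{i=1}^{p-1} \Gamma_i \Delta y_{t-i} + \mu + \varepsilon_t,
\qquad
\Pi = -\Bigl(I - \sum_{i=1}^{p} A_i\Bigr), \qquad
\Gamma_i = -\sum_{j=i+1}^{p} A_j .

% With DET=RC the intercept is restricted to the cointegrating space, so the
% "adjusted" PI matrix picks up an extra column for the constant:
\Delta y_t = \bigl[\Pi \;\; \mu\bigr]
             \begin{pmatrix} y_{t-1} \\ 1 \end{pmatrix}
           + \sum_{i=1}^{p-1} \Gamma_i \Delta y_{t-i} + \varepsilon_t,
\qquad
\bigl[\Pi \;\; \mu\bigr] = \alpha \bigl[\beta' \;\; \rho\bigr].
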
It would probably be easier to include
comp PI = loadings*tr(vectors)
in JohMLE.src to calculate PI.
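
Or, outside the procedure, something along these lines; this is only a sketch, and it assumes @JOHMLE exposes the loadings and cointegrating vectors through LOADINGS and VECTORS options (option names guessed from the variable names above):

Code: Select all

* Hypothetical sketch: LOADINGS/VECTORS option names are assumed, not checked
* against the documentation; the columns kept in ALPHA and BETA should match
* the chosen cointegrating rank.
@johmle(lags=6,det=rc,loadings=alpha,vectors=beta)
# ftbs3 ftb12 fcm7
* with DET=RC, BETA carries an extra row for the restricted constant, so this
* produces the "adjusted" PI matrix with the constant column appended
compute pi = alpha*tr(beta)
disp 'PI (adjusted, with restricted-constant column):' pi
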

From https://www.estima.com/ratshelp/index.h ... edure.html:
DET=NONE/[CONSTANT]/TREND/RC; should it be RTREND to get the additional column of estimates, although @JOHMLE accepts TREND as well?


(vii) If I run my acmtsICRS.src procedure for plotting the inverse roots, I get an error

Dimension Companion matrix
## MAT1. Matrix COMPANION Has Not Been Dimensioned

Here

Code: Select all

* AR roots
if (mts == 0)
   compute companion=%modelcompanion(mtsmodel)
   eigen(cvalues=mtsroots) companion
if (mts == 1)
   compute companion=%modelcompanion(%modelsubstect(mtsmodel))
   eigen(cvalues=mtsroots) companion
acmtsICRS.src is to be used for any VAR and VECM.

How to declare the dimension of the Matrix beforehand?

Re: Recursive VECM - Johansen ML technique

Posted: Wed Mar 20, 2024 9:37 am
by TomDoan
ac_1 wrote: Wed Mar 20, 2024 3:40 am (vii) If I run my acmtsICRS.src procedure for plotting the inverse roots, I get an error

Dimension Companion matrix
## MAT1. Matrix COMPANION Has Not Been Dimensioned

Here

Code: Select all

* AR roots
if (mts == 0)
   compute companion=%modelcompanion(mtsmodel)
   eigen(cvalues=mtsroots) companion
if (mts == 1)
   compute companion=%modelcompanion(%modelsubstect(mtsmodel))
   eigen(cvalues=mtsroots) companion
acmtsICRS.src is to be used for any VAR and VECM.

How to declare the dimension of the Matrix beforehand?
That's not how IF's work in RATS. You need braces around the controlled instructions.

Re: Recursive VECM - Johansen ML technique

Posted: Wed Mar 20, 2024 4:21 pm
by ac_1
TomDoan wrote: Wed Mar 20, 2024 9:37 am
ac_1 wrote: Wed Mar 20, 2024 3:40 am (vii) If I run my acmtsICRS.src procedure for plotting the inverse roots, I get an error

Dimension Companion matrix
## MAT1. Matrix COMPANION Has Not Been Dimensioned

Here

Code: Select all

* AR roots
if (mts == 0)
   compute companion=%modelcompanion(mtsmodel)
   eigen(cvalues=mtsroots) companion
if (mts == 1)
   compute companion=%modelcompanion(%modelsubstect(mtsmodel))
   eigen(cvalues=mtsroots) companion
acmtsICRS.src is to be used for any VAR and VECM.

How to declare the dimension of the Matrix beforehand?
That's not how IF's work in RATS. You need braces around the controlled instructions.

Code: Select all

* AR roots
if (mts == 0) {
   compute companion=%modelcompanion(mtsmodel)
   eigen(cvalues=mtsroots) companion }
if (mts == 1) {
   compute companion=%modelcompanion(%modelsubstect(mtsmodel))
   eigen(cvalues=mtsroots) companion }
I get the following error

## SX22. Expected Type VECTOR[REAL], Got INTEGER Instead
>>>>sroots) companion }<<<<

Re: Recursive VECM - Johansen ML technique

Posted: Wed Mar 20, 2024 7:21 pm
by TomDoan
Put the close } on a new line.

Re: Recursive VECM - Johansen ML technique

Posted: Thu Mar 21, 2024 2:57 am
by ac_1
TomDoan wrote: Wed Mar 20, 2024 7:21 pm Put the close } on a new line.
:)
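
For reference, the working version, with braces around the controlled instructions and each closing } on its own line:

Code: Select all

* AR roots
if (mts == 0) {
   compute companion=%modelcompanion(mtsmodel)
   eigen(cvalues=mtsroots) companion
}
if (mts == 1) {
   compute companion=%modelcompanion(%modelsubstect(mtsmodel))
   eigen(cvalues=mtsroots) companion
}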

Re: Recursive VECM - Johansen ML technique

Posted: Thu Mar 21, 2024 12:47 pm
by TomDoan
ac_1 wrote: Wed Mar 20, 2024 3:40 am From https://www.estima.com/ratshelp/index.h ... edure.html:
DET=NONE/[CONSTANT]/TREND/RC; should it be RTREND to get the additional column of estimates, although @JOHMLE accepts TREND as well?
We've updated that, but note well that DET=TREND means that 1 and t are included in the regressions which, in the presence of unit roots, will create a quadratic trend. Needless to say, there isn't much call for that. Trending data are usually handled reasonably with DET=CONSTANT. DET=RTREND is really only useful when there are trending data, but it's not clear if the series actually have unit roots. (Unit roots plus DET=CONSTANT creates trends, but stationary data will not).
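
A minimal sketch of why, for a univariate unit-root process (illustration only):

Code: Select all

% An unrestricted constant cumulates into a linear trend under a unit root:
\Delta y_t = \mu + \varepsilon_t
\;\Longrightarrow\;
y_t = y_0 + \mu t + \sum_{s=1}^{t} \varepsilon_s .

% An unrestricted trend term then cumulates into a quadratic trend:
\Delta y_t = \mu + \delta t + \varepsilon_t
\;\Longrightarrow\;
y_t = y_0 + \mu t + \delta\,\tfrac{t(t+1)}{2} + \sum_{s=1}^{t} \varepsilon_s .
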

Re: Recursive VECM - Johansen ML technique

Posted: Fri Mar 22, 2024 3:22 am
by ac_1
TomDoan wrote: Thu Mar 21, 2024 12:47 pm
ac_1 wrote: Wed Mar 20, 2024 3:40 am From https://www.estima.com/ratshelp/index.h ... edure.html:
DET=NONE/[CONSTANT]/TREND/RC; should it be RTREND to get the additional column of estimates, although @JOHMLE accepts TREND as well?
We've updated that, but note well that DET=TREND means that 1 and t are included in the regressions which, in the presence of unit roots, will create a quadratic trend. Needless to say, there isn't much call for that. Trending data are usually handled reasonably with DET=CONSTANT. DET=RTREND is really only useful when there are trending data, but it's not clear if the series actually have unit roots. (Unit roots plus DET=CONSTANT creates trends, but stationary data will not).

Thanks! You may want to update the Output at https://www.estima.com/ratshelp/index.h ... edure.html as well, as these are the results with RATS 10.00 version c.

The Rank 0 row is for the differenced model

Code: Select all

Likelihood Based Analysis of Cointegration
Variables:  FTBS3 FTB12 FCM7
Estimated from 1975:07 to 2001:06
Data Points 312 Lags 6 with Constant restricted to Cointegrating Vector

Unrestricted eigenvalues and -T log(1-lambda)
  Rank     EigVal  Lambda-max  Trace   Trace-95%  LogL
        0                                        20.2392
        1   0.0818    26.6264  41.2013   35.0700 33.5524
        2   0.0333    10.5567  14.5750   20.1600 38.8307
        3   0.0128     4.0183   4.0183    9.1400 40.8399

Cointegrating Vector for Largest Eigenvalue
FTBS3     FTB12    FCM7       Constant
-3.154123 3.132882  -0.321838 0.619010
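
For reference, the Lambda-max and Trace columns above are the usual Johansen eigenvalue statistics, with T the number of data points, n the number of variables, and lambda-hat_i the ordered eigenvalues:

Code: Select all

\lambda_{\max}(r) = -T \ln\!\bigl(1-\hat\lambda_{r+1}\bigr),
\qquad
\mathrm{Trace}(r) = -T \sum_{i=r+1}^{n} \ln\!\bigl(1-\hat\lambda_i\bigr).
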

Harris, R. (1995), Using Cointegration Analysis in Econometric Modelling, Pearson Education Limited, Prentice Hall,

is a good book which uses CATS, amongst other packages (I would like CATS 2.0 sometime in the future).

Re: Recursive VECM - Johansen ML technique

Posted: Fri Mar 22, 2024 7:17 am
by TomDoan
We changed the format of the output to match the general CATS rank test statistics. We can't change help files on older software.

Re: Recursive VECM - Johansen ML technique

Posted: Tue Mar 26, 2024 6:04 am
by ac_1
To complement the individual-variable residual plots used for ARIMA models ((i) time plot, (ii) ACF, (iii) histogram with normal density, (iv) normal q-plot): as a 'generic' battery of multivariate residual tests for a VAR or VECM (e.g. 3 variables), are these exhaustive, and are the options/parameters correct?

Code: Select all

*===============================
CROSS(qstats,org=columns,from=-8,to=8) resids(1) resids(2)
@CROSSCORR(number=20) resids(1) resids(2)

CROSS(qstats,org=columns,from=-8,to=8) resids(1) resids(3)
@CROSSCORR(number=20) resids(1) resids(3)

CROSS(qstats,org=columns,from=-8,to=8) resids(2) resids(3)
@CROSSCORR(number=20) resids(2) resids(3)


* multivariate residual heteroskedasticity test
* https://estima.com/ratshelp/index.html?archtestprocedure.html
* H0: the series are mean zero, not serially correlated, and have a fixed covariance matrix
* small p-value (e.g. < 0.05): reject the null; p >= 0.05: cannot reject the null
@mvarchtest(lags=5)
# resids

* multivariate residual serial correlation: Hosking (1981) variant of the multivariate Q statistic
* https://estima.com/ratshelp/index.html?mvjbprocedure.html
* H0: No serial correlation
* small p-value (e.g. < 0.05): reject the null; p >= 0.05: cannot reject the null
comp k=4; comp plag=4
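* (presumably k = number of endogenous variables and plag = VAR lag length,
*  i.e. k=3 and plag=6 for the three-variable, six-lag system in this thread)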
@mvqstat(dfc=(plag*k^2),lags=12)
# resids

* multivariate residual stability test in Nyblom(1989) for a complete covariance matrix
* https://estima.com/ratshelp/index.html?mvjbprocedure.html
* H0: the covariance matrix of the series is constant, against the alternative of a structural break at some point in the sample
* small p-value (e.g. < 0.05): reject the null; p >= 0.05: cannot reject the null
@cvstabtest
# resids

* multivariate residual normality test
* https://estima.com/ratshelp/index.html?mvjbprocedure.html
* H0: joint hypothesis that the skewness and the excess kurtosis are both zero
* small p-value (e.g. < 0.05): reject the null; p >= 0.05: cannot reject the null
@mvjb(sigma=%sigma)
# resids

Re: Recursive VECM - Johansen ML technique

Posted: Tue Mar 26, 2024 12:46 pm
by TomDoan
Exhaustive is a bit strong. Those look fine.

Re: Recursive VECM - Johansen ML technique

Posted: Fri Apr 05, 2024 3:11 am
by ac_1
(a) In a cointegrated VECM for I(1) variables:
- Is it worth plotting the IRFs, given that they are not mean-reverting (there are roots on the unit circle)?
- What about variance decompositions?

(b) I'd like to delve deeper into the 5 deterministic components in @JOHMLE:
CASE 1: None
CASE 2: Constant
CASE 3: Trend
CASE 4: Restricted constant
CASE 5: Restricted trend

To aid my understanding, I can manually calculate the PI matrix from the levels estimation for CASE 1 and CASE 2 (the CASE 2, Constant, calculation is below), but not for the others. How is that done? (I realise the PI matrix is formed from the residual product-moment matrices; Box 5.1, Harris (1995).)

Code: Select all

*===============================
* VAR (lag=1 to 6, DET=CONSTANT)
* ---
system(model=ratemodel)
variables ftbs3 ftb12 fcm7
lags 1 to 6
det constant
end(system)
estimate(residuals=resids,noftests) 1975:07 2001:06-nsteps

disp %BETASYS
disp %VARLAGSUMS; * -PI matrix

dec rectangular ID A1 A2 A3 A4 A5 A6
dim ID(3,3) A1(3,3) A2(3,3) A3(3,3) A4(3,3) A5(3,3) A6(3,3)


disp %identity(3)

comp ID = || -1.0, 0.0, 0.0 | $
             0.0, -1.0, 0.0 | $
             0.0, 0.0, -1.0 ||
disp ID

comp A1 = || %BETASYS(1), %BETASYS(7), %BETASYS(13) | $
             %BETASYS(20), %BETASYS(26), %BETASYS(32) | $
             %BETASYS(39), %BETASYS(45), %BETASYS(51) ||
disp A1

comp A2 = || %BETASYS(2), %BETASYS(8), %BETASYS(14) | $
             %BETASYS(21), %BETASYS(27), %BETASYS(33) | $
             %BETASYS(40), %BETASYS(46), %BETASYS(52) ||
disp A2

comp A3 = || %BETASYS(3), %BETASYS(9), %BETASYS(15) | $
             %BETASYS(22), %BETASYS(28), %BETASYS(34) | $
             %BETASYS(41), %BETASYS(47), %BETASYS(53) ||
disp A3

comp A4 = || %BETASYS(4), %BETASYS(10), %BETASYS(16) | $
             %BETASYS(23), %BETASYS(29), %BETASYS(35) | $
             %BETASYS(42), %BETASYS(48), %BETASYS(54) ||
disp A4

comp A5 = || %BETASYS(5), %BETASYS(11), %BETASYS(17) | $
             %BETASYS(24), %BETASYS(30), %BETASYS(36) | $
             %BETASYS(43), %BETASYS(49), %BETASYS(55) ||
disp A5

comp A6 = || %BETASYS(6), %BETASYS(12), %BETASYS(18) | $
             %BETASYS(25), %BETASYS(31), %BETASYS(37) | $
             %BETASYS(44), %BETASYS(50), %BETASYS(56) ||
disp A6

comp PIVAR = ID + A1 + A2 + A3 + A4 + A5 + A6; disp 'PIVAR:' PIVAR
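
As a cross-check (a sketch; it only uses %VARLAGSUMS, which is displayed above as -PI):

Code: Select all

* PIVAR should reproduce the negative of the VAR lag-sum matrix
comp PICHECK = -1.0*%VARLAGSUMS
disp 'PICHECK:' PICHECK
disp 'PIVAR-PICHECK:' PIVAR-PICHECK
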

Re: Recursive VECM - Johansen ML technique

Posted: Fri Apr 05, 2024 7:55 am
by TomDoan
ac_1 wrote: Fri Apr 05, 2024 3:11 am (a) In a cointegrated VECM for I(1) variables:
- Is it worth plotting the IRFs, given that they are not mean-reverting (there are roots on the unit circle)?
- What about variance decompositions?
Yes and yes. While the variances diverge, the decomposition converges. See, for instance, the Hasbrouck paper, which does a long-run variance decomposition.
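
In symbols (a sketch only, using the orthogonalized MA representation of the system):

Code: Select all

% h-step FEVD: share of (orthogonalized) shock j in the forecast-error
% variance of variable i, with \Psi_s the orthogonalized MA coefficients:
\mathrm{FEVD}_{ij}(h)
  = \frac{\displaystyle\sum_{s=0}^{h-1} \bigl(e_i' \Psi_s e_j\bigr)^2}
         {\displaystyle\sum_{k=1}^{n} \sum_{s=0}^{h-1} \bigl(e_i' \Psi_s e_k\bigr)^2} .
% With unit roots both numerator and denominator diverge as h grows,
% but their ratio settles down to the long-run decomposition.
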
ac_1 wrote: Fri Apr 05, 2024 3:11 am (b) I'd like to delve deeper into the 5 deterministic components in @JOHMLE:
CASE 1: None
CASE 2: Constant
CASE 3: Trend
CASE 4: Restricted constant
CASE 5: Restricted trend

To aid my understanding, I can manually calculate the PI matrix from the levels estimation for CASE 1 and CASE 2 (the CASE 2, Constant, calculation is below), but not for the others. How is that done? (I realise the PI matrix is formed from the residual product-moment matrices; Box 5.1, Harris (1995).)

Code: Select all

*===============================
* VAR (lag=1 to 6, DET=CONSTANT)
* ---
system(model=ratemodel)
variables ftbs3 ftb12 fcm7
lags 1 to 6
det constant
end(system)
estimate(residuals=resids,noftests) 1975:07 2001:06-nsteps

disp %BETASYS
disp %VARLAGSUMS; * -PI matrix

dec rectangular ID A1 A2 A3 A4 A5 A6
dim ID(3,3) A1(3,3) A2(3,3) A3(3,3) A4(3,3) A5(3,3) A6(3,3)


disp %identity(3)

comp ID = || -1.0, 0.0, 0.0 | $
             0.0, -1.0, 0.0 | $
             0.0, 0.0, -1.0 ||
disp ID

comp A1 = || %BETASYS(1), %BETASYS(7), %BETASYS(13) | $
             %BETASYS(20), %BETASYS(26), %BETASYS(32) | $
             %BETASYS(39), %BETASYS(45), %BETASYS(51) ||
disp A1

comp A2 = || %BETASYS(2), %BETASYS(8), %BETASYS(14) | $
             %BETASYS(21), %BETASYS(27), %BETASYS(33) | $
             %BETASYS(40), %BETASYS(46), %BETASYS(52) ||
disp A2

comp A3 = || %BETASYS(3), %BETASYS(9), %BETASYS(15) | $
             %BETASYS(22), %BETASYS(28), %BETASYS(34) | $
             %BETASYS(41), %BETASYS(47), %BETASYS(53) ||
disp A3

comp A4 = || %BETASYS(4), %BETASYS(10), %BETASYS(16) | $
             %BETASYS(23), %BETASYS(29), %BETASYS(35) | $
             %BETASYS(42), %BETASYS(48), %BETASYS(54) ||
disp A4

comp A5 = || %BETASYS(5), %BETASYS(11), %BETASYS(17) | $
             %BETASYS(24), %BETASYS(30), %BETASYS(36) | $
             %BETASYS(43), %BETASYS(49), %BETASYS(55) ||
disp A5

comp A6 = || %BETASYS(6), %BETASYS(12), %BETASYS(18) | $
             %BETASYS(25), %BETASYS(31), %BETASYS(37) | $
             %BETASYS(44), %BETASYS(50), %BETASYS(56) ||
disp A6

comp PIVAR = ID + A1 + A2 + A3 + A4 + A5 + A6; disp 'PIVAR:' PIVAR
Those have non-standard distributions, so you can't use information criteria to choose among them. I don't even know how much work anyone has done on comparisons among those; they are inextricably linked to the cointegrating rank (for instance, DET=RC has only r "constants" rather than a full set). Because of that, you can only back out the estimates in certain cases. Note that CATS assumes the DET choice is known before going about the analysis.
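
A sketch of the "only r constants" point under DET=RC, in standard Johansen notation:

Code: Select all

% DET=RC: the constant enters only through the cointegrating relations,
\Delta y_t = \alpha\bigl(\beta' y_{t-1} + \rho\bigr)
           + \sum_{i=1}^{p-1} \Gamma_i \Delta y_{t-i} + \varepsilon_t ,
% so the implied intercept is  \mu = \alpha\rho , which has only r free
% parameters (one per cointegrating vector) rather than n.
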