Hasbrouck(1995) Information Shares

Questions and discussions on Vector Autoregressions
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Hasbrouck(1995) Information Shares

Unread post by TomDoan »

This was based upon some posts from the RATS Discussion List.
Is there a RATS program for Hasbrouck (1995), "One Security, Many Markets: Determining the Contributions to Price Discovery," Journal of Finance?
If CSTAR=C(1) is the matrix of long-run multipliers for Y(t)=C(L)e(t) and F is a factor of the covariance matrix for the e's, then the long-run covariance matrix is

cstar*f*tr(f)*tr(cstar)

You can decompose the long-run variance into the fractions due to each of the orthogonal components with the following:

Code: Select all

compute split=cstar*f
compute contrib=split.^2
compute [vector] variance=contrib*%fill(%cols(contrib),1,1.0)
ewise contrib(i,j)=contrib(i,j)/variance(i)
Row i, column j will be the fraction of the long-run variance in i due to orthogonal shock j.
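For readers working outside RATS, the same decomposition can be sketched in Python with NumPy. This is an illustrative translation of the steps above with made-up cstar and sigma values, not the RATS code itself:

```python
import numpy as np

def long_run_shares(cstar, sigma):
    """Fraction of each variable's long-run variance due to each
    orthogonalized shock (row i, column j), following the steps above."""
    f = np.linalg.cholesky(sigma)       # sigma = f @ f.T
    split = cstar @ f                   # long-run impact of each orthogonal shock
    contrib = split ** 2                # squared contributions
    variance = contrib.sum(axis=1)      # = diagonal of cstar @ sigma @ cstar.T
    return contrib / variance[:, None]  # normalize so each row sums to one

# made-up two-market example: identical rows of cstar mimic a single common trend
cstar = np.array([[1.0, 0.8],
                  [1.0, 0.8]])
sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
shares = long_run_shares(cstar, sigma)
```

With identical rows in cstar, the two rows of shares come out identical as well, which is the single-common-trend case discussed later in this thread.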

A full example is provided by hasbrouck.rpf. It does the calculation with both an empirically estimated cointegrating vector and the theoretical cointegrating space (which has a single common trend).
w-tb3n6ms.txt
Data file for example
(62.86 KiB) Downloaded 1920 times
jeanne
Posts: 11
Joined: Tue Jul 23, 2013 6:42 am

Re: Hasbrouck(1995) Information Shares

Unread post by jeanne »

Dear all,

Is there any chance I could use this code (the Hasbrouck method) for more than two variables? I want to examine one spot market and two futures markets and find out where price discovery takes place.

Thank you very much!

Best,
Jeanne
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: Hasbrouck(1995) Information Shares

Unread post by TomDoan »

jeanne wrote:Dear all,

Is there any chance I could use this code (the Hasbrouck method) for more than two variables? I want to examine one spot market and two futures markets and find out where price discovery takes place.

Thank you very much!

Best,
Jeanne
This actually does five, though the change needed to make it three is fairly clear. While not the ideal data set, this computes five annual yields from five different bond prices. The analysis starts once y1,...,y5 are constructed; in general, those would be log asset prices.

Code: Select all

open data bondprice.prn
calendar(m) 1952:6
data(format=prn,org=columns) 1952:06 2003:12 qdate price1 price2 price3 price4 price5
*
* Normalize and convert to percentages
*
dofor s = price1 to price5
   set s = 100.0*log(s{0}/100.0)
end dofor s
*
* Compute percentage log yields per annum
*
set y1 = -price1
set y2 = -price2/2.0
set y3 = -price3/3.0
set y4 = -price4/4.0
set y5 = -price5/5.0
*
* The cointegrating vectors are any combination of four differences. The constant
* is restricted to the CV. Since the model has known cointegrating vectors, the
* intercepts have to be computed by taking the means of the corresponding
* differences.
*
sstats(mean) / (y1-y2)>>mean12 (y1-y3)>>mean13 (y1-y4)>>mean14 (y1-y5)>>mean15
equation(coeffs=||1.0,-1.0,-mean12||) eq12
# y1 y2 constant
equation(coeffs=||1.0,-1.0,-mean13||) eq13
# y1 y3 constant 
equation(coeffs=||1.0,-1.0,-mean14||) eq14
# y1 y4 constant
equation(coeffs=||1.0,-1.0,-mean15||) eq15
# y1 y5 constant
*
system(model=vecm) 
variables y1 y2 y3 y4 y5
lags 1 to 2
ect eq12 eq13 eq14 eq15
end(system)
*
estimate
*
compute alphaperp=%perp(%vecmalpha)
compute psi=tr(alphaperp)
*
* Compute Choleski factorization of sigma
*
compute f=%decomp(%sigma)
*
* Compute decomposition of long-run variance
*
compute split=psi*f
compute contrib=split.^2/%dot(split,split)
*
disp "Decomposition of Long-run variance" contrib
bondprice.prn
Data file
(39.36 KiB) Downloaded 1769 times
jeanne
Posts: 11
Joined: Tue Jul 23, 2013 6:42 am

Re: Hasbrouck(1995) Information Shares

Unread post by jeanne »

I want to compute information shares for a data set of spot prices x and futures prices y.
First, I estimate this:

Code: Select all

    @johmle(lags=10,det=constant,vectors=direct,dual=dual)
    # x y  

    compute alphaperp=%xcol(dual,1)
    compute beta=%xcol(direct,1)
    compute betaperp=%perp(beta)
    compute cstar=betaperp*tr(alphaperp)
    compute f=%decomp(%sigma)
    compute split=cstar*f
    compute contrib=split.^2
    compute [vector] variance=contrib*%fill(%cols(contrib),1,1.0)
    ewise contrib(i,j)=contrib(i,j)/variance(i)
    disp "Decomposition of Long-run variance" contrib
... and get this result:

Likelihood Based Analysis of Cointegration
Variables: X Y
Estimated from 11 to 907
Data Points 897 Lags 10 with Constant

Unrestricted eigenvalues and -T log(1-lambda)
Rank EigVal Lambda-max Trace Trace-95% LogL
0 7855.9113
1 0.0190 17.2488 19.7802 15.4100 7864.5358
2 0.0028 2.5313 2.5313 3.8400 7865.8014

Cointegrating Vector for Largest Eigenvalue
X Y
-382.867863 375.388255

Decomposition of Long-run variance
0.00808 0.99192
0.00808 0.99192

However, when I use/adjust your code from the five-market example

Code: Select all

    sstats(mean) / (x-y)>>meanxy 
    equation(coeffs=||1.0,-1.0,-meanxy||) eqxy
    # x y constant


    system(model=vecm)
    variables x y 
    lags 1 to 10
    ect eqxy
    end(system)
    *
    estimate
    *
    compute alphaperp=%perp(%vecmalpha)
    compute psi=tr(alphaperp)
    *
    * Compute Choleski factorization of sigma
    *
    compute f=%decomp(%sigma)
    *
    * Compute decomposition of long-run variance
    *
    compute split=psi*f
    compute contrib=split.^2/%dot(split,split)
    *
    disp "Decomposition of Long-run variance" contrib

... I get this:

Decomposition of Long-run variance
0.94680 0.05320

Isn't that the exact opposite of the above?
Thank you very much!
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: Hasbrouck(1995) Information Shares

Unread post by TomDoan »

Thanks for pointing that out. I corrected the code for the first example (the one that used @JOHMLE).
jeanne
Posts: 11
Joined: Tue Jul 23, 2013 6:42 am

Re: Hasbrouck(1995) Information Shares

Unread post by jeanne »

Thank you very much!

Has anyone here tried to compute the modified information share (MIS) proposed by Lien and Shrestha (Journal of Futures Markets, 2009) in RATS?

Any help will be greatly appreciated!
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: Hasbrouck(1995) Information Shares

Unread post by TomDoan »

Isn't that the same thing but with:

*
* Compute Cholesky factorization of sigma
*
compute f=%decomp(%sigma)

replaced with an eigenvalue-based factorization

eigen(dmatrix=identity) %sigma * f
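In Python/NumPy terms, the difference between the two factorizations can be sketched as below. This only illustrates the factorization step; the full Lien and Shrestha MIS involves more than swapping the factor, so treat this as a sketch under that caveat:

```python
import numpy as np

def eigen_factor(sigma):
    """Eigenvalue-based factor f with f @ f.T equal to sigma.
    Unlike a Choleski factor, permuting the variables only permutes
    the rows of f, so the resulting shares do not depend on ordering."""
    eigval, eigvec = np.linalg.eigh(sigma)     # sigma symmetric positive definite
    return eigvec @ np.diag(np.sqrt(eigval))

sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
f_chol = np.linalg.cholesky(sigma)   # ordering-dependent factor
f_eig = eigen_factor(sigma)          # ordering-invariant factor
# both reproduce sigma
assert np.allclose(f_chol @ f_chol.T, sigma)
assert np.allclose(f_eig @ f_eig.T, sigma)
```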
leewoo
Posts: 2
Joined: Sun Sep 28, 2014 11:23 am

Hasbrouck(1995) Information Shares

Unread post by leewoo »

Hello, everyone.

I have a question about Hasbrouck(1995) Information Shares code.

Code: Select all

open data w-tb3n6ms.txt
calendar(w) 1958:12:12
data(format=free,org=columns) 1958:12:12 2004:8:6 tb3mo tb6mo
*
* With empirically determined cointegrating vector
*
@johmle(lags=5,det=constant,vectors=direct,dual=dual,loadings=ll)
#  tb6mo tb3mo
*
* Compute long-run multiplier for a single cointegrating vector
* Note: cstar omits a constant multiplier, but that has no effect
* on the calculations of proportions
*
compute alpha=%xcol(ll,1)
compute alphaperp=%perp(alpha)
compute beta=%xcol(direct,1)
compute betaperp=%perp(beta)
compute cstar=betaperp*tr(alphaperp)
*
* Compute Choleski factorization of sigma

compute f=%decomp(%sigma)
*
* Compute decomposition of long-run variance
compute split=cstar*f

compute contrib=split.^2
disp "contrib" contrib

compute var= cstar*f*tr(f)*tr(cstar)
disp "variance1" var

compute [vector] variance=contrib*%fill(%cols(contrib),1,1.0)
disp "variance2" variance

ewise contrib(i,j)=contrib(i,j)/variance(i)
disp "Decomposition of Long-run variance" contrib
and I display the variance

Code: Select all

compute var= cstar*f*tr(f)*tr(cstar)
disp "variance1" var

compute [vector] variance=contrib*%fill(%cols(contrib),1,1.0)
disp "variance2" variance
variance1
0.00759 0.00767
0.00767 0.00776

variance2 0.00759 0.00776

Then I get

Code: Select all

ewise contrib(i,j)=contrib(i,j)/variance(i)
disp "Decomposition of Long-run variance" contrib
Decomposition of Long-run variance
0.96255 0.03745
0.96255 0.03745

but it's different from my calculation:

0.00731/0.00759=0.96310935 0.000284/0.00759=0.03744203
0.00747/0.00776=0.96262887 0.000291/0.00776=0.03744330

I attach the Hasbrouck equation.

Am I wrong?
Attachments
w-tb3n6ms.txt
(62.86 KiB) Downloaded 1712 times
hasbrouck.jpg
hasbrouck.jpg (5.35 KiB) Viewed 83929 times
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: Hasbrouck(1995) Information Shares

Unread post by TomDoan »

What's wrong is comparing full-precision calculations (the ones done with RATS) with calculations done with intermediate values rounded to three significant digits (the ones you did by hand). Redo the DISPLAY instructions at greater precision (for instance, disp *.######## ...) and recompute, and you'll see. Note that the percentages due to each shock have to be identical across the rows.

The two ways of calculating the variance, summing the rows of contrib (which is what the RATS code does) and taking the diagonals of the full covariance matrix (which is what you're doing), are identical.
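That identity (since sigma = f*f', the diagonal of cstar*sigma*cstar' equals the row sums of the squared entries of cstar*f) is easy to check numerically. A small NumPy sketch with arbitrary matrices, purely for illustration:

```python
import numpy as np

# check that summing the rows of contrib and taking the diagonal of
# the full long-run covariance matrix give the same variances
rng = np.random.default_rng(0)
a = rng.standard_normal((2, 2))
sigma = a @ a.T + np.eye(2)          # an arbitrary positive-definite covariance
cstar = rng.standard_normal((2, 2))  # an arbitrary long-run multiplier matrix

f = np.linalg.cholesky(sigma)        # sigma = f @ f.T
contrib = (cstar @ f) ** 2
row_sums = contrib.sum(axis=1)
full_cov = cstar @ sigma @ cstar.T
assert np.allclose(row_sums, np.diag(full_cov))
```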
sheriared
Posts: 6
Joined: Mon Jul 09, 2018 7:55 am

Re: Hasbrouck(1995) Information Shares

Unread post by sheriared »

Dear Tom,

Thank you very much for sharing your RATS code for Hasbrouck (1995) information shares. I am studying your code and have a question. I wonder if you could kindly enlighten me on this.

When you set y2 y3 y4 and y5, why do you divide them by 2, 3, 4, and 5, respectively?

Thank you very much!

Best
Liya
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: Hasbrouck(1995) Information Shares

Unread post by TomDoan »

The bond prices in the data set are the effective price of a zero-coupon bond of a particular maturity, so that's how you would compute the per annum yield of each.
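Putting the normalization and the division together: for a zero-coupon bond quoted per 100 of face value, the per annum continuously compounded yield (in percent) at maturity n is -100*log(P/100)/n. A one-line Python sketch with a made-up price:

```python
import math

def zero_coupon_yield(price, maturity_years):
    """Per annum continuously compounded yield, in percent, implied by a
    zero-coupon bond price quoted per 100 of face value."""
    return -100.0 * math.log(price / 100.0) / maturity_years

# a hypothetical 5-year zero priced at 78 implies a yield near 4.97% per annum
y5 = zero_coupon_yield(78.0, 5)
```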
sheriared
Posts: 6
Joined: Mon Jul 09, 2018 7:55 am

Re: Hasbrouck(1995) Information Shares

Unread post by sheriared »

Hi Tom,

Could you please tell me how to specify the optimal lag using the @varlagselect procedure?

Code: Select all

OPEN DATA "C:\Users\lshenb\Documents\01-Work files\Research\A-study\Bitcoin\Price discovery\15min\S_BTC_XBT15.xlsx"
DATA(FORMAT=xlsx,ORG=COLUMNS) 2 6882 Spot BTC XBT

*note that the first entries of Spot, BTC and XBT are empty.


LOG  Spot / y1
LOG  BTC / y2
LOG  XBT / y3


@varlagselect(crit=AIC,lags=15) 502 1001
# y1 y2


dofor i = 1 to 13

*sstats(mean) (i-1)*500+2 (i-1)*500+501 (y1-y2)>>mean12 (y1-y3)>>mean13
sstats(mean) (i-1)*500+2 (i-1)*500+501  (y1-y3)>>mean13
*equation(coeffs=||1.0,-1.0,-mean12||) eq12
*# y1 y2 constant
equation(coeffs=||1.0,-1.0,-mean13||) eq13
# y1 y3 constant

system(model=vecm)
variables y1  y3
lags 1 to 20
ect eq13
end(system)
*
estimate(noprint) (i-1)*500+2 (i-1)*500+501
*
compute alphaperp=%perp(%vecmalpha)
compute psi=tr(alphaperp)
*
* Compute Choleski factorization of sigma
*
compute f=%decomp(%sigma)
*
* Compute decomposition of long-run variance
*
compute split=psi*f
compute contrib=split.^2/%dot(split,split)
display contrib
end dofor i
Many thanks indeed!
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: Hasbrouck(1995) Information Shares

Unread post by TomDoan »

@VARLagSelect sets %%AUTOP to the lag length selected using the chosen criterion.
sheriared
Posts: 6
Joined: Mon Jul 09, 2018 7:55 am

Re: Hasbrouck(1995) Information Shares

Unread post by sheriared »

Hi Tom,

Could you please explain why betaperp is not needed in the bond price example with 5 variables?

Many thanks!
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: Hasbrouck(1995) Information Shares

Unread post by TomDoan »

Beta-perp in that case is just a vector of ones.
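That can be checked numerically. Stacking the four difference-type cointegrating vectors from the bond example as columns of beta (ignoring the constants), the orthogonal complement computed via SVD is proportional to a vector of ones. A NumPy sketch, not RATS code:

```python
import numpy as np

# beta for the five-yield example: column j is the CV (1 on y1, -1 on y_{j+1})
beta = np.array([[ 1,  1,  1,  1],
                 [-1,  0,  0,  0],
                 [ 0, -1,  0,  0],
                 [ 0,  0, -1,  0],
                 [ 0,  0,  0, -1]], dtype=float)

# orthogonal complement of the column space: the u columns beyond rank(beta)
u, s, vt = np.linalg.svd(beta)
betaperp = u[:, beta.shape[1]:]          # 5 x 1 here

assert np.allclose(betaperp.T @ beta, 0)         # orthogonal to every CV
assert np.allclose(betaperp[:, 0], betaperp[0, 0])  # all entries equal: a ones vector (up to scale)
```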
Post Reply