GARCH Model with Day of Week Dummies

Discussions of ARCH, GARCH, and related models
jack

GARCH Model with Day of Week Dummies

Unread post by jack »

I attached the data. It contains daily returns of a stock market index. I don't know how to interpret the results or which estimation is correct.
Attachments
Tom.xlsx
TomDoan

Re: The diagnostics tests for multivariate GARCH

Unread post by TomDoan »

First of all, is it possible that your price data is an average over the day (rather than the closing price)? The roughly 0.3 AR coefficient is consistent with returns computed from time-averaged data (Working, 1960, showed that first differences of intra-period averages of a random walk have a first-order autocorrelation of about 0.25).

Second, your dummy is on for two days a week and off for three, so the "long-run" behavior is irrelevant, since you will always be switching between the two states on a regular basis. Why are you expecting different behavior between those dates? Note that the way the dummy works for the variance is incremental, so the expected behavior would be to cycle through the days of the week, with the lowest variance on the second of the days with the dummy and the highest on the last of the days without the dummy.
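
As a rough sketch of that point (using generic GARCH notation rather than the parameter names in the programs below), an additive day dummy $D_t$ in the variance equation of a GARCH(1,1) would be

$$h_t = \omega + \delta D_t + \alpha u_{t-1}^2 + \beta h_{t-1}.$$

The dummy shifts the intercept only on the days where $D_t = 1$, but the shift feeds into later days through $\beta h_{t-1}$, so with the dummy on for two consecutive days each week the fitted variance cycles through the week rather than settling at any long-run level.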
jack

Re: The diagnostics tests for multivariate GARCH

Unread post by jack »

You understood the nature of the data and the task I want to perform with just a glance at it; very impressive.

The data are closing prices, but the market I'm studying has a daily price-fluctuation limit on individual stock prices, hence the AR coefficient of about 0.3.

What I want to do is examine whether volatility is lower on the first two days of the week. Those happen to be the days when foreign markets are closed, so the question is whether the domestic market is less volatile than usual on those days.
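
A minimal sketch of that test in RATS, assuming DX holds the returns, a hypothetical dummy FCLOSED for the two days when foreign markets are closed (the DOW coding of 1 and 2 for those days is made up for illustration), and the GARCH instruction's REGRESSORS/XREGRESSORS options with the mean regressors on the first supplementary card and the variance shifts on the second:

Code: Select all

* FCLOSED = 1 on the two trading days when foreign markets are closed
* (illustrative coding: DOW = 1 or 2 on those days)
set fclosed = dow==1.or.dow==2
*
* GARCH(1,1) with t errors; constant + AR(1) in the mean, FCLOSED as an
* additive shift in the variance equation
garch(p=1,q=1,regressors,xregressors,distrib=t) / dx
# constant dx{1}
# fclosed

A significantly negative coefficient on FCLOSED would indicate lower than usual volatility on the days when foreign markets are closed.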
jack

Re: The diagnostics tests for multivariate GARCH

Unread post by jack »

So, based on the data and the task I want to perform, which distribution do you think I should use, and should I employ robust standard errors (even with the t distribution)?
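
A sketch of that comparison, assuming the GARCH instruction's DISTRIB=T and ROBUSTERRORS options: estimate the same model with and without the t distribution, requesting QMLE (robust) standard errors each time.

Code: Select all

* Normal errors with robust (QMLE) standard errors
garch(p=1,q=1,regressors,robusterrors) / dx
# constant dx{1}
* t errors, again with robust standard errors
garch(p=1,q=1,regressors,distrib=t,robusterrors) / dx
# constant dx{1}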
TomDoan

Re: The diagnostics tests for multivariate GARCH

Unread post by TomDoan »

Why is such a high percentage of the data points (about 8%) true zeros?
jack

Re: The diagnostics tests for multivariate GARCH

Unread post by jack »

The zero values correspond to official holidays, when the previous day's price is carried forward, resulting in a zero return.
TomDoan

Re: The diagnostics tests for multivariate GARCH

Unread post by TomDoan »

You should take those data points out of the data set.
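
A minimal sketch of that, assuming DX already holds the returns, using the same SAMPLE(SMPL=...) device that the replication program below uses to pull out subsamples:

Code: Select all

* Keep only the observations with non-zero returns (drops the official
* holidays where the previous day's price was carried forward)
set nothol = dx<>0.0
sample(smpl=nothol) dx / dxx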

You should look at the Baillie and Bollerslev model. It's somewhat more complicated than you need, but it's designed to allow for the types of behavior that you seem to expect.

Do you have an explanation for the several enormous spikes in the data? A standard GARCH model can't really produce those.
jack

Re: The diagnostics tests for multivariate GARCH

Unread post by jack »

I appreciate your valuable assistance and guidance.

I read the Baillie and Bollerslev paper. Since I have a stock price index, I want to run the code for just one country, namely the UK, and then move on to my own data. Therefore, I made some changes to it: I want to include the AR(1) component in the mean equation while excluding the dummy variable w81. I made the modifications and tested them on the UK data; unfortunately, the software reports an error. The code is below. I would appreciate your guidance on identifying the error and how to resolve it. As always, thank you very much for your valuable assistance.

Code: Select all

*
* Replication file for Baillie and Bollerslev, "The Message in Daily
* Exchange Rates: A Conditional Variance Tale", JBES 1989, vol 7, pp
* 297-305
*
open data bbjbes1989.xls
data(format=xls,org=columns) 1 1245 uk dow
*
labels uk
# "U.K."
log uk

*
* Unit root tests
* Table 1
*
source ppunit.src
report(action=define)
dofor s = uk
   report(col=new,atrow=1) %l(s)
   @ppunit(lags=22,det=trend) s
   report(col=current,atrow=2) %cdstat
   @ppunit(lags=22) s
   report(col=current,atrow=3) %cdstat
end dofor
report(action=format,picture="*.###")
report(action=show)
*
* Table 2 estimates
*
report(action=define)
dofor s = uk
   report(col=new,atrow=1) %l(s)
   set dx = 100.0*(s{0}-s{1})
   linreg(noprint) dx
   # constant dx{1}
   report(col=curr,atrow=2) %beta(1)
   report(col=curr,atrow=3,special=parens) %stderrs(1)
   report(col=curr,atrow=4) %sigmasq
   report(col=curr,atrow=5) %logl
   set ustd = %resids/sqrt(%seesq)
   corr(print,qstats,number=15,method=yule) ustd
   report(col=curr,atrow=6) %qstat
   set usqr = ustd^2
   corr(print,qstats,number=15,method=yule) usqr
   report(col=curr,atrow=7) %qstat
   stats(noprint) %resids
   report(col=curr,atrow=8) %skewness
   report(col=curr,atrow=9) %kurtosis
end dofor s
report(action=format,picture="*.###",align=decimal)
report(action=show)
*
* Table 3
*
* The DOW series is 2 for Monday,...,6 for Friday. Create dummies for
* each day of the week.
*
dec vect[series] dd(5)
do i=1,5
   set dd(i) = dow==(i+1)
end do i
*
*
* SKIP is a dummy variable which will be 1 if and only if there is a
* skipped weekday.
*
set(first=1.0) skip = .not.(dow{1}+1==dow.or.(dow{1}==6.and.dow==2))
*
nonlin(parmset=meanparms) b0 b1 b2 b3 b4 b5 
nonlin(parmset=garchshifts) w1 w2 w3 w4 w5 w6 w7
nonlin(parmset=garchparms) alpha1 beta1 rnu
declare series uu h u
frml varf = alpha1*uu{1}+beta1*h{1}+$
   w1*dd(1)+w2*dd(2)+w3*dd(3)+w4*dd(4)+w5*dd(5)+w6*skip+w7*skip{1}-$
   (alpha1+beta1)*(w1*dd(1){1}+w2*dd(2){1}+w3*dd(3){1}+w4*dd(4){1}+w5*dd(5){1}+w6*skip{1}+w7*skip{2})
frml meanf = b0*dx{1}+b1*dd(1)+b2*dd(2)+b3*dd(3)+b4*dd(4)+b5*dd(5)
frml logl = (u=dx-meanf),(uu(t)=u^2),(h(t)=varf(t)),%logtdensity(h,u,1./rnu)
*
report(action=define,hlabels=||"","U.K."||)
dofor s = uk
   set dx = 100.0*(s{0}-s{1})
   *
   * Get preliminary guess values for the mean parameters off an OLS
   * regression on the mean dummies.
   *
   linreg(noprint) dx
   # dd dx{1}
   compute hinit=%seesq
   *
   set uu = %resids^2
   set h  = hinit
   set u  = %resids
   *
   * Poke those into the meanparms.
   *
   compute %parmspoke(meanparms,%beta)
   *
   * Get preliminary guess values for the dummy terms in the variance
   * equation by an OLS regression of the squared residuals on the
   * dummies.
   *
   linreg uu
   # dd skip skip{1}
   *
   * Poke those into the shift parameters
   *
   compute %parmspoke(garchshifts,%beta)
   *
   * Set initial guess values for the reciprocal degrees of freedom, and
   * the GARCH parameters.
   *
   compute rnu=.10
   compute alpha1=.1,beta1=.8
   *
   set test 3 * = logl(t)
   sstats 3 * .not.%valid(test)>>nmiss
   if nmiss>0.0
      compute w1=w2=w3=w4=w5=hinit,w6=w7=0.0
   maximize(parmset=meanparms+garchshifts+garchparms,pmethod=simplex,piter=10,method=bhhh) logl 3 *
   report(regressors,extra=stderrs)
   report(col=current,atrow=2*%nreg+2) %funcval
   stats u 3 *
   set ustd 3 * = u/sqrt(h)
   corr(print,qstats,number=15,method=yule) ustd 3 *
   report(col=current,atrow=2*%nreg+3) %qstat
   set usqr 3 * = ustd^2
   corr(print,qstats,number=15,method=yule) usqr 3 *
   report(col=current,atrow=2*%nreg+4) %qstat
   stats(noprint) ustd
   report(col=current,atrow=2*%nreg+5) %skewness
   report(col=current,atrow=2*%nreg+6) %kurtosis
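   * Implied kurtosis of a t distribution with nu = 1/RNU degrees of freedom: 3(nu-2)/(nu-4)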
   report(col=current,atrow=2*%nreg+7) 3.0*(1.0/rnu-2.0)/(1.0/rnu-4.0)
end dofor
report(action=show)
*
* Table 4
*
*
* Table 5
*
set pullweds = dow==4.or.dow==5.and.dow{1}==3
dofor s = uk
   sample(smpl=pullweds) s / wx
   clear dx
   set dx = 100.0*(wx-wx{1})
   garch(p=1,q=1) / dx
end dofor s
*
* Table 6
*
dofor s = uk
   sample(smpl=pullweds) s / wx
   sample(smpl=%clock(t,2)==1) wx / w2x
   clear dx
   set dx = 100.0*(w2x-w2x{1})
   garch(q=1) / dx
end dofor s
*
* Table 7
*
dofor s = uk
   sample(smpl=pullweds) s / wx
   sample(smpl=%clock(t,4)==1) wx / w4x
   clear dx
   set dx = 100.0*(w4x-w4x{1})
   garch(q=0,p=0) / dx
end dofor s

TomDoan

Re: The diagnostics tests for multivariate GARCH

Unread post by TomDoan »

You're dropping an extra data point because of the AR(1) mean model. All those 3's inside the estimation loop need to be 4 instead.
jack

Re: The diagnostics tests for multivariate GARCH

Unread post by jack »

I did that (replaced the 3's with 4's), but I got strange results for w1 to w5: they all come out as 15.262568 (164.976054)!

Code: Select all

*
* Replication file for Baillie and Bollerslev, "The Message in Daily
* Exchange Rates: A Conditional Variance Tale", JBES 1989, vol 7, pp
* 297-305
*
open data bbjbes1989.xls
data(format=xls,org=columns) 1 1245 uk dow
*
labels uk
# "U.K."
log uk

*
* Unit root tests
* Table 1
*
source ppunit.src
report(action=define)
dofor s = uk
   report(col=new,atrow=1) %l(s)
   @ppunit(lags=22,det=trend) s
   report(col=current,atrow=2) %cdstat
   @ppunit(lags=22) s
   report(col=current,atrow=3) %cdstat
end dofor
report(action=format,picture="*.###")
report(action=show)
*
* Table 2 estimates
*
report(action=define)
dofor s = uk
   report(col=new,atrow=1) %l(s)
   set dx = 100.0*(s{0}-s{1})
   linreg(noprint) dx
   # constant dx{1}
   report(col=curr,atrow=2) %beta(1)
   report(col=curr,atrow=3,special=parens) %stderrs(1)
   report(col=curr,atrow=4) %sigmasq
   report(col=curr,atrow=5) %logl
   set ustd = %resids/sqrt(%seesq)
   corr(print,qstats,number=15,method=yule) ustd
   report(col=curr,atrow=6) %qstat
   set usqr = ustd^2
   corr(print,qstats,number=15,method=yule) usqr
   report(col=curr,atrow=7) %qstat
   stats(noprint) %resids
   report(col=curr,atrow=8) %skewness
   report(col=curr,atrow=9) %kurtosis
end dofor s
report(action=format,picture="*.###",align=decimal)
report(action=show)
*
* Table 3
*
* The DOW series is 2 for Monday,...,6 for Friday. Create dummies for
* each day of the week.
*
dec vect[series] dd(5)
do i=1,5
   set dd(i) = dow==(i+1)
end do i
*
*
* SKIP is a dummy variable which will be 1 if and only if there is a
* skipped weekday.
*
set(first=1.0) skip = .not.(dow{1}+1==dow.or.(dow{1}==6.and.dow==2))
*
nonlin(parmset=meanparms) b0 b1 b2 b3 b4 b5
nonlin(parmset=garchshifts) w1 w2 w3 w4 w5 w6 w7
nonlin(parmset=garchparms) alpha1 beta1 rnu
declare series uu h u
frml varf = alpha1*uu{1}+beta1*h{1}+$
   w1*dd(1)+w2*dd(2)+w3*dd(3)+w4*dd(4)+w5*dd(5)+w6*skip+w7*skip{1}-$
   (alpha1+beta1)*(w1*dd(1){1}+w2*dd(2){1}+w3*dd(3){1}+w4*dd(4){1}+w5*dd(5){1}+w6*skip{1}+w7*skip{2})
frml meanf = b0*dx{1}+b1*dd(1)+b2*dd(2)+b3*dd(3)+b4*dd(4)+b5*dd(5)
frml logl = (u=dx-meanf),(uu(t)=u^2),(h(t)=varf(t)),%logtdensity(h,u,1./rnu)
*
report(action=define,hlabels=||"","U.K."||)
dofor s = uk
   set dx = 100.0*(s{0}-s{1})
   *
   * Get preliminary guess values for the mean parameters off an OLS
   * regression on the mean dummies.
   *
   linreg(noprint) dx
   # dd dx{1}
   compute hinit=%seesq
   *
   set uu = %resids^2
   set h  = hinit
   set u  = %resids
   *
   * Poke those into the meanparms.
   *
   compute %parmspoke(meanparms,%beta)
   *
   * Get preliminary guess values for the dummy terms in the variance
   * equation by an OLS regression of the squared residuals on the
   * dummies.
   *
   linreg uu
   # dd skip skip{1}
   *
   * Poke those into the shift parameters
   *
   compute %parmspoke(garchshifts,%beta)
   *
   * Set initial guess values for the reciprocal degrees of freedom, and
   * the GARCH parameters.
   *
   compute rnu=.10
   compute alpha1=.1,beta1=.8
   *
   set test 4 * = logl(t)
   sstats 4 * .not.%valid(test)>>nmiss
   if nmiss>0.0
      compute w1=w2=w3=w4=w5=hinit,w6=w7=0.0
   maximize(parmset=meanparms+garchshifts+garchparms,pmethod=simplex,piter=10,method=bhhh) logl 4 *
   report(regressors,extra=stderrs)
   report(col=current,atrow=2*%nreg+2) %funcval
   stats u 4 *
   set ustd 4  * = u/sqrt(h)
   corr(print,qstats,number=15,method=yule) ustd 4 *
   report(col=current,atrow=2*%nreg+3) %qstat
   set usqr 4 * = ustd^2
   corr(print,qstats,number=15,method=yule) usqr 4 *
   report(col=current,atrow=2*%nreg+4) %qstat
   stats(noprint) ustd
   report(col=current,atrow=2*%nreg+5) %skewness
   report(col=current,atrow=2*%nreg+6) %kurtosis
   report(col=current,atrow=2*%nreg+7) 3.0*(1.0/rnu-2.0)/(1.0/rnu-4.0)
end dofor
report(action=show)
*
* Table 4
*
*
* Table 5
*
set pullweds = dow==4.or.dow==5.and.dow{1}==3
dofor s = uk
   sample(smpl=pullweds) s / wx
   clear dx
   set dx = 100.0*(wx-wx{1})
   garch(p=1,q=1) / dx
end dofor s
*
* Table 6
*
dofor s = uk
   sample(smpl=pullweds) s / wx
   sample(smpl=%clock(t,2)==1) wx / w2x
   clear dx
   set dx = 100.0*(w2x-w2x{1})
   garch(q=1) / dx
end dofor s
*
* Table 7
*
dofor s = uk
   sample(smpl=pullweds) s / wx
   sample(smpl=%clock(t,4)==1) wx / w4x
   clear dx
   set dx = 100.0*(w4x-w4x{1})
   garch(q=0,p=0) / dx
end dofor s

TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: The diagnostics tests for multivariate GARCH

Unread post by TomDoan »

Looks fine:

Code: Select all

MAXIMIZE - Estimation by BHHH
Convergence in    10 Iterations. Final criterion was  0.0000095 <=  0.0000100

Usable Observations                      1242
Function Value                     -1165.2391

    Variable                        Coeff      Std Error      T-Stat      Signif
************************************************************************************
1.  B0                            0.013012346  0.028466351      0.45711  0.64758966
2.  B1                           -0.064075841  0.042711632     -1.50020  0.13356351
3.  B2                           -0.057499564  0.039170838     -1.46792  0.14212660
4.  B3                            0.026512784  0.034124198      0.77695  0.43718847
5.  B4                           -0.064163143  0.033148627     -1.93562  0.05291427
6.  B5                           -0.058204904  0.035690080     -1.63084  0.10292353
7.  W1                            0.480607704  0.088004104      5.46120  0.00000005
8.  W2                            0.456222675  0.083076870      5.49157  0.00000004
9.  W3                            0.374530021  0.077960194      4.80412  0.00000155
10. W4                            0.354630814  0.074017381      4.79118  0.00000166
11. W5                            0.400586574  0.076987014      5.20330  0.00000020
12. W6                            0.349708358  0.214236268      1.63235  0.10260599
13. W7                            0.217122833  0.165195760      1.31434  0.18873295
14. ALPHA1                        0.057109064  0.016432176      3.47544  0.00051001
15. BETA1                         0.917314515  0.024434478     37.54181  0.00000000
16. RNU                           0.111191440  0.021623818      5.14208  0.00000027
jack

Re: The diagnostics tests for multivariate GARCH

Unread post by jack »

Thanks.

Now I'm using my own data, but unfortunately it doesn't converge. :(

I also removed some outliers (21 return observations), but it still doesn't converge.
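
When MAXIMIZE stalls like this, one thing that is sometimes worth trying is giving it more preliminary simplex iterations and a higher iteration limit (a sketch using the same MAXIMIZE call as in the program; the counts are arbitrary):

Code: Select all

maximize(parmset=meanparms+garchshifts+garchparms,$
   pmethod=simplex,piter=50,method=bhhh,iterations=500) logl 4 *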
Attachments
Tom.xlsx
jack

Re: The diagnostics tests for multivariate GARCH

Unread post by jack »

Dear Tom,
As always, I think I need your help. I've tried almost everything I could think of, but I haven't gotten anywhere.
TomDoan

Re: The diagnostics tests for multivariate GARCH

Unread post by TomDoan »

You never provided the program that you used for your data.
jack

The diagnostics tests for multivariate GARCH

Unread post by jack »

Here is the program, and the data is attached:

Code: Select all

*
* Replication file for Baillie and Bollerslev, "The Message in Daily
* Exchange Rates: A Conditional Variance Tale", JBES 1989, vol 7, pp
* 297-305
*
open data tom1.xlsx
data(format=xlsx,org=columns) 1 3255 sto dow
*
labels sto
# "Stock"
log sto

*
* Unit root tests
* Table 1
*
source ppunit.src
report(action=define)
dofor s = sto
   report(col=new,atrow=1) %l(s)
   @ppunit(lags=22,det=trend) s
   report(col=current,atrow=2) %cdstat
   @ppunit(lags=22) s
   report(col=current,atrow=3) %cdstat
end dofor
report(action=format,picture="*.###")
report(action=show)
*
* Table 2 estimates
*
report(action=define)
dofor s = sto
   report(col=new,atrow=1) %l(s)
   set dx = 100.0*(s{0}-s{1})
   linreg(noprint) dx
   # constant dx{1}
   report(col=curr,atrow=2) %beta(1)
   report(col=curr,atrow=3,special=parens) %stderrs(1)
   report(col=curr,atrow=4) %sigmasq
   report(col=curr,atrow=5) %logl
   set ustd = %resids/sqrt(%seesq)
   corr(print,qstats,number=15,method=yule) ustd
   report(col=curr,atrow=6) %qstat
   set usqr = ustd^2
   corr(print,qstats,number=15,method=yule) usqr
   report(col=curr,atrow=7) %qstat
   stats(noprint) %resids
   report(col=curr,atrow=8) %skewness
   report(col=curr,atrow=9) %kurtosis
end dofor s
report(action=format,picture="*.###",align=decimal)
report(action=show)
*
* Table 3
*
* The DOW series is 1 for Saturday,...,5 for Wednesday. Create dummies for
* each day of the week.
*
dec vect[series] dd(5)
do i=1,5
   set dd(i) = dow==(i)
end do i
*
*
* SKIP is a dummy variable which will be 1 if and only if there is a
* skipped weekday.
*
set(first=1.0) skip = .not.(dow{1}+1==dow.or.(dow{1}==5.and.dow==1))
*
nonlin(parmset=meanparms) b0 b1 b2 b3 b4 b5
nonlin(parmset=garchshifts) w1 w2 w3 w4 w5 w6 w7
nonlin(parmset=garchparms) alpha1 beta1 rnu
declare series uu h u
frml varf = alpha1*uu{1}+beta1*h{1}+$
   w1*dd(1)+w2*dd(2)+w3*dd(3)+w4*dd(4)+w5*dd(5)+w6*skip+w7*skip{1}-$
   (alpha1+beta1)*(w1*dd(1){1}+w2*dd(2){1}+w3*dd(3){1}+w4*dd(4){1}+w5*dd(5){1}+w6*skip{1}+w7*skip{2})
frml meanf = b0*dx{1}+b1*dd(1)+b2*dd(2)+b3*dd(3)+b4*dd(4)+b5*dd(5)
frml logl = (u=dx-meanf),(uu(t)=u^2),(h(t)=varf(t)),%logtdensity(h,u,1./rnu)
*
report(action=define,hlabels=||"","Stock"||)
dofor s = sto
   set dx = 100.0*(s{0}-s{1})
   *
   * Get preliminary guess values for the mean parameters off an OLS
   * regression on the mean dummies.
   *
   linreg(noprint) dx
   # dd dx{1}
   compute hinit=%seesq
   *
   set uu = %resids^2
   set h  = hinit
   set u  = %resids
   *
   * Poke those into the meanparms.
   *
   compute %parmspoke(meanparms,%beta)
   *
   * Get preliminary guess values for the dummy terms in the variance
   * equation by an OLS regression of the squared residuals on the
   * dummies.
   *
   linreg uu
   # dd skip skip{1}
   *
   * Poke those into the shift parameters
   *
   compute %parmspoke(garchshifts,%beta)
   *
   * Set initial guess values for the reciprocal degrees of freedom, and
   * the GARCH parameters.
   *
   compute rnu=.10
   compute alpha1=.1,beta1=.8
   *
   set test 4 * = logl(t)
   sstats 4 * .not.%valid(test)>>nmiss
   if nmiss>0.0
      compute w1=w2=w3=w4=w5=hinit,w6=w7=0.0
   maximize(parmset=meanparms+garchshifts+garchparms,pmethod=simplex,piter=10,method=bhhh) logl 4 *
   report(regressors,extra=stderrs)
   report(col=current,atrow=2*%nreg+2) %funcval
   stats u 4 *
   set ustd 4  * = u/sqrt(h)
   corr(print,qstats,number=15,method=yule) ustd 4 *
   report(col=current,atrow=2*%nreg+3) %qstat
   set usqr 4 * = ustd^2
   corr(print,qstats,number=15,method=yule) usqr 4 *
   report(col=current,atrow=2*%nreg+4) %qstat
   stats(noprint) ustd
   report(col=current,atrow=2*%nreg+5) %skewness
   report(col=current,atrow=2*%nreg+6) %kurtosis
   report(col=current,atrow=2*%nreg+7) 3.0*(1.0/rnu-2.0)/(1.0/rnu-4.0)
end dofor
report(action=show)
*
Attachments
Tom1.xlsx