Blanchard-Quah AER 1989
This is a replication of Blanchard and Quah (1989), which introduced the Blanchard-Quah structural model for identifying shocks by long-run restrictions.
This has two programs:
BQEXAMPLE.RPF
This covers everything from the paper except the figures and tables that require simulations: impulse response functions, historical decomposition, and calculation of structural residuals.
BQMONTE.RPF
This covers the figures and tables that require simulations. Note that BQ used bootstrapping, which did not work well in this case, a point made in Sims and Zha (1999). Instead, these are done by Monte Carlo integration.
Data Preparation
The paper uses U.S. real GNP and the unemployment rate. To apply the long-run zero restriction, GNP enters the VAR in log-differenced form. Real GNP is constructed from nominal GNP (the GNP series on the data file) and the deflator (GD87). The log(100) term corrects for the deflator being indexed to 100. That correction washes out of the growth rate, but the log level is used later in the program.
cal(q) 1948
open data bqdata.xls
data(format=xls,org=cols) 1948:1 1987:4 gnp gd87 lhmur
*
set loggnp = log(gnp)
set loggd = log(gd87)
set logrgnp = loggnp-loggd+log(100)
set dlogrgnp = 100.0*(logrgnp-logrgnp{1})
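A minimal numeric check of the point above, sketched in Python (the program itself is RATS) with made-up numbers: the log(100) base correction shifts the log level of real GNP, but it washes out of the growth rate, since differencing removes any additive constant.

```python
import numpy as np

gnp = np.array([1000.0, 1020.0, 1045.0])   # hypothetical nominal GNP
gd87 = np.array([98.0, 99.0, 100.5])       # hypothetical deflator, 1987=100

logrgnp = np.log(gnp) - np.log(gd87) + np.log(100.0)   # with base correction
logrgnp_nocorr = np.log(gnp) - np.log(gd87)            # without it

growth = 100.0 * np.diff(logrgnp)
growth_nocorr = 100.0 * np.diff(logrgnp_nocorr)

# The levels differ by the constant log(100), the growth rates are identical
print(np.allclose(growth, growth_nocorr))
```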
Separate means are then removed from the GNP growth series, with a break at 1973:4, and the unemployment rate is linearly detrended. The GNP adjustment is based on the notion that there was a change in the GNP growth rate due to factors outside the model. The unemployment adjustment is almost certainly unreasonable over a long time period. The VAR is run on the adjusted data, but the adjustments themselves are saved (into MEANS_FROM_GNP and TREND_FROM_UR) to be used later.
set dummy1 = t<=1973:4
set dummy2 = t>1973:4
linreg dlogrgnp
# dummy1 dummy2
set gnpadjust = %resids
prj means_from_gnp
*
* Remove a linear trend from unemployment
*
filter(remove=trend) lhmur / uradjust
set trend_from_ur = lhmur-uradjust
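The two adjustments can be sketched in Python (the program itself is RATS) on stand-in data. The regression on the two exhaustive dummies with no constant is equivalent to removing separate sub-sample means, and the FILTER(REMOVE=TREND) step is equivalent to keeping the residuals from a regression on a constant and a time trend. Sample length, break index, and series values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 160                                   # 1948:1-1987:4 is 160 quarters
break_at = 104                            # observations 0..103 run through 1973:4
dlogrgnp = rng.normal(0.8, 1.0, T)        # stand-in GNP growth series
lhmur = 5 + 0.01 * np.arange(T) + rng.normal(0, 0.5, T)   # stand-in unemployment

# Separate means before/after the break (= regression on the two dummies)
pre = np.arange(T) < break_at
means_from_gnp = np.where(pre, dlogrgnp[pre].mean(), dlogrgnp[~pre].mean())
gnpadjust = dlogrgnp - means_from_gnp

# Linear detrend of unemployment: fit on [1, t], keep residuals and trend
X = np.column_stack([np.ones(T), np.arange(T)])
beta, *_ = np.linalg.lstsq(X, lhmur, rcond=None)
trend_from_ur = X @ beta
uradjust = lhmur - trend_from_ur
```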
BQEXAMPLE.RPF
The VAR on the adjusted data is run with 8 lags. The residuals are saved for later transformation into structural shocks.
system(model=bqmodel)
var gnpadjust uradjust
lags 1 to 8
det constant
end(system)
*
estimate(noprint,resids=resids)
This uses the %VARLAGSUMS from the VAR to construct the Blanchard-Quah factor of the covariance matrix %SIGMA.
By construction, this will have a long run loading of zero from the second shock ("demand") to the first dependent variable (GNP).
compute factor=%bqfactor(%sigma,%varlagsums)
The zero restriction only determines the shape, not the sign, of the demand shock. If we want it defined so that the contemporaneous response of output (series #1) is positive for both shocks, we need to flip signs if the computed impact has the wrong sign. (%BQFACTOR does a Cholesky factorization of a transformed matrix, which is then back-transformed. The Cholesky factor itself has a positive (2,2) diagonal element, but that could change after the back-transformation.)
@ImpactSignFlip factor ||+1,+1||
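The factorization can be sketched in Python (the RATS source of %BQFACTOR may differ in detail). Writing A1 for the sum of the VAR lag coefficient matrices, the long-run multiplier is C = (I - A1)^(-1); we want an impact matrix F with F F' = Sigma and C F lower triangular, so that shock 2 ("demand") has zero long-run effect on variable 1 (GNP). Sigma and A1 below are hypothetical.

```python
import numpy as np

sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])            # hypothetical residual covariance
A1 = np.array([[0.6, 0.1],
               [0.2, 0.4]])               # hypothetical sum of lag matrices

C = np.linalg.inv(np.eye(2) - A1)         # long-run multiplier
L = np.linalg.cholesky(C @ sigma @ C.T)   # lower-triangular long-run factor
factor = np.linalg.solve(C, L)            # impact factor: C @ factor = L

# Sign normalization (the @ImpactSignFlip step): flip any column whose
# contemporaneous impact on output (row 1) is negative.
factor = factor * np.where(factor[0] >= 0, 1.0, -1.0)
```

Flipping the sign of a column leaves F F' and the zero in the long-run matrix unchanged, so the identification is preserved.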
This uses the @VARIRF procedure to produce Figures 1 and 2 from the paper. Responses of output (the first variable) are accumulated so that they are responses of 100 x log(GNP) rather than GNP growth rates.
@varirf(model=bqmodel,factor=factor,steps=40,page=byshocks,$
variables=||"Output","Unemployment"||,shocks=||"Supply","Demand"||,accum=||1||)
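The response calculation can be sketched in Python: iterate the VAR forward from each column of the impact factor, then cumulate the output row so it reads in (scaled) log levels rather than growth rates. For brevity this uses a hypothetical VAR(1) lag matrix and factor; the paper's model has 8 lags.

```python
import numpy as np

A1 = np.array([[0.6, 0.1],
               [0.2, 0.4]])               # hypothetical single lag matrix
factor = np.array([[0.9, 0.2],
                   [-0.3, 0.5]])          # hypothetical impact matrix

steps = 40
irf = np.zeros((steps, 2, 2))             # irf[h, variable, shock]
irf[0] = factor
for h in range(1, steps):
    irf[h] = A1 @ irf[h - 1]              # reduced-form recursion times factor

irf_out = irf.copy()
irf_out[:, 0, :] = np.cumsum(irf[:, 0, :], axis=0)   # accumulate output responses
```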
The next (rather lengthy) section of the program does the historical decomposition. The HISTORY instruction can decompose the data from the VAR, but those data are the de-meaned GNP growth rates and the de-trended unemployment rate; the goal is to present a decomposition of the observed (log) GNP and unemployment rate.
history(model=bqmodel,factor=factor,base=base,effects=effects)
First, put the removed means back into the GNP growth, and the removed trend back into unemployment.
set base(1) = base(1)+means_from_gnp
set base(2) = base(2)+trend_from_ur
Next, accumulate the GNP components (BASE(1) and the EFFECTS that hit variable 1) and scale them back by .01 to reverse the 100 x scaling applied to the growth rates (which was done to make the impulse responses easier to read).
dofor s = base(1) effects(1,1) effects(1,2)
acc s
set s = s{0}*.01
end dofor s
Finally, since converting to growth rates lost the level of the data, add the final value of log GNP before the forecast period back into the forecast component. (The VAR estimation starts in 1950:2 because of 8 lags plus one data point lost to differencing.)
set base(1) = base(1)+logrgnp(1950:1)
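The accumulate/rescale/re-level sequence can be sketched in Python on a made-up growth path: cumulate the (100 x scaled) growth contributions, multiply by .01 to undo the scaling, then add the last pre-sample value of log real GNP to pin down the level.

```python
import numpy as np

rng = np.random.default_rng(1)
growth_pct = rng.normal(0.8, 1.0, 20)     # hypothetical growth path, 100 x log diffs
logrgnp_start = 7.3                       # hypothetical log real GNP before the sample

base = 0.01 * np.cumsum(growth_pct)       # accumulate, then undo the 100 x scaling
base = base + logrgnp_start               # restore the level
```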
This computes the counterfactual of GNP without the demand shocks, that is, the base plus the supply shocks:
set lessdemand = base(1)+effects(1,1)
graph(footer="Figure 7. Output Fluctuations Absent Demand")
# lessdemand
The effects of the demand shocks (cumulated on GNP) are shown in their Figure 8. This uses "spikes" up at the NBER peaks and down at the NBER troughs. (A more common, and simpler, way to highlight these is to shade the recessions.) To create the effect in their Figure 8, it is necessary to scale the "peaks" and "troughs" dummies by a value slightly in excess of the scale of the data.
@NBERCycles(peaks=peaks,trough=troughs)
set peaks = .10*peaks
set troughs = -.10*troughs
*
graph(footer="Figure 8. Output Fluctuations Due to Demand",$
ovcount=2,overlay=spike,ovsame) 3
# effects(1,2)
# peaks 1950:2 * 2
# troughs 1950:2 * 2
The same pair of figures is done for unemployment.
set lessdemand = base(2)+effects(2,1)
graph(footer="Figure 9. Unemployment Fluctuations Absent Demand",min=0.0)
# lessdemand
*
@NBERCycles(peaks=peaks,trough=troughs)
set peaks = 4.0*peaks
set troughs = -4.0*troughs
*
graph(footer="Figure 10. Unemployment Fluctuations Due to Demand",$
ovcount=2,overlay=spike,ovsame) 3
# effects(2,2)
# peaks 1950:2 * 2
# troughs 1950:2 * 2
Their Table 1 displays structural residuals, which are the model's interpretation of the supply and demand shocks. This is done using the @STRUCTRESIDS procedure, with the B-Q factor computed above making the transformation:
@StructResids(factor=factor) resids / sresids
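The transformation itself is simple to sketch in Python: with u_t = F e_t linking the reduced-form residuals u to the structural shocks e, the structural residuals are e_t = F^(-1) u_t. The factor and residuals below are hypothetical, constructed so the recovery can be verified.

```python
import numpy as np

factor = np.array([[0.9, 0.2],
                   [-0.3, 0.5]])              # hypothetical B-Q factor
rng = np.random.default_rng(2)
e_true = rng.normal(size=(2, 150))            # unit-variance structural shocks
resids = factor @ e_true                      # implied reduced-form residuals

sresids = np.linalg.solve(factor, resids)     # structural residuals: F^(-1) u
print(np.allclose(sresids, e_true))
```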
There was a slight difference between the data set used in producing the paper and the one used here, which shows up at this point, where actual numbers (rather than just graphs) are displayed. This lists the shocks during periods around several recessions:
report(action=define)
report(atrow=1,atcol=1,align=center) "Quarter" "Demand(Percent)" "Supply(Percent)"
do time=1973:3,1975:1
report(row=new,atcol=1) %datelabel(time) sresids(2)(time) sresids(1)(time)
end do time
report(row=new)
do time=1979:1,1980:2
report(row=new,atcol=1) %datelabel(time) sresids(2)(time) sresids(1)(time)
end do time
report(action=format,picture="*.#")
report(action=show)
BQMONTE.RPF
This produces the figures and tables that require simulations, done using standard Monte Carlo simulations rather than bootstrapping. The data are prepared and the VAR estimated as in the BQEXAMPLE.RPF program.
The simulations are done using the @BQDODRAWS procedure. This is a Monte Carlo draw procedure which is specific to the two variable Blanchard-Quah structural VAR. It (by default) accumulates the responses of the first variable, since that is (in the BQ model) included in differenced form. The post-processing of the draws is done using @MCGRAPHIRF (for impulse responses) and @MCFEVDTABLE (for variance decompositions).
@BQDoDraws(model=varmodel)
@MCGraphIRF(model=varmodel,shocklabels=shocklabels,varlabels=varlabels)
*
disp "Table 2-Variance Decomposition. Output growth break; Unemployment Detrended"
source mcfevdtable.src
@MCFEVDTable(model=varmodel,horizons=||1,2,3,4,8,12,40||,$
varlabels=varlabels,shocklabels=shocklabels,table=byshocks)
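The Monte Carlo step can be sketched in Python under a deliberately simplified draw scheme (a normal perturbation of the lag coefficients with Sigma held fixed); @BQDODRAWS uses its own posterior simulation, so treat everything below, including the coefficient values and standard error, as hypothetical. For each draw, recompute the long-run factor and the accumulated output response, then take percentile bands across draws.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = np.array([[1.0, 0.3], [0.3, 0.5]])    # hypothetical residual covariance
A1_hat = np.array([[0.6, 0.1], [0.2, 0.4]])   # hypothetical estimated lag matrix
se_A1 = 0.05                                  # common coefficient std. error, for illustration
steps, ndraws = 40, 500

draws = np.zeros((ndraws, steps))
for d in range(ndraws):
    A1 = A1_hat + se_A1 * rng.normal(size=(2, 2))      # simplified coefficient draw
    C = np.linalg.inv(np.eye(2) - A1)                  # long-run multiplier
    L = np.linalg.cholesky(C @ sigma @ C.T)
    F = np.linalg.solve(C, L)                          # B-Q factor for this draw
    irf = F.copy()
    resp = [F[0, 0]]                                   # output response to supply shock
    for h in range(1, steps):
        irf = A1 @ irf
        resp.append(irf[0, 0])
    draws[d] = np.cumsum(resp)                         # accumulate: output enters differenced

lower, upper = np.percentile(draws, [16, 84], axis=0)  # one-std.-error-style bands
```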
The remainder of the program carries out the paper's sensitivity analysis for different combinations of de-meaning and de-trending of the data.
Copyright © 2025 Thomas A. Doan