Cecchetti and Rich (JBES 2001)
This replicates the three structural VAR models used in Cecchetti and Rich (2001). Each is an SVAR with short- and long-run restrictions; the three differ in size. They are used to compute the "sacrifice ratio" of GDP lost to disinflationary policy. Using bootstrapping, the authors show that the reasonable-looking point estimates of the sacrifice ratio don't survive once the uncertainty in the dynamics of the model is taken into account. Note well that the whole point of the paper is to show that the sacrifice ratio isn't well estimated and thus shouldn't really be a statistic of interest.
This is a good example of the use of bootstrapping for an SVAR, since the bootstrapping maintains the contemporaneous relationships among the residuals. As a result, you can do a standard bootstrap and re-estimate the SVAR with each draw. The @VARBootSetup and @VARBootDraw procedures are used to generate the bootstrapped data.
This has three separate programs:
SVAR2.RPF
is a two-variable model with GDP growth and the change in the inflation rate. A Blanchard-Quah type factor is used, with the monetary policy (disinflationary) shock constrained to have a zero long-run effect on GDP. (A sketch of this type of factor calculation is shown after these descriptions.)
SVAR3.RPF
is a three-variable model with GDP growth, the change in the inflation rate and the real interest rate. The SVAR identifies aggregate supply, monetary (disinflationary) and aggregate demand shocks, where the last two have zero long-run effect on GDP (with a short-run constraint used to differentiate them). This uses @ShortAndLong combined with @ImpactSignFlip to produce a factorization with the desired sign conventions.
SVAR4.RPF
is a four-variable model with GDP growth, the change in the T-bill rate, the real interest rate and real money growth, and identifies aggregate supply, money supply, money demand and IS shocks. This is a more complicated model which uses a combination of short- and long-run restrictions plus an additional "A" model type restriction for constructing the money supply shock. It requires using @ShortAndLong with a model that doesn't fully identify the factorization (it only provides a mapping from the free parameters to the partially identified factor), with CVMODEL used to do the final estimation. This takes quite a bit longer than the other two because it requires a non-linear estimation for each bootstrap draw.
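For the two-variable model, the long-run restriction can be imposed with a Blanchard-Quah type factorization of the residual covariance matrix. The following is a minimal sketch of that calculation, using the %SIGMA and %VARLAGSUMS matrices that ESTIMATE defines; it illustrates the idea and is not a verbatim extract from SVAR2.RPF (which may use an equivalent built-in computation):
*
* Sketch (not from SVAR2.RPF): Blanchard-Quah type factor for the
* two-variable VAR. ESTIMATE defines %SIGMA (residual covariance) and
* %VARLAGSUMS (I minus the sum of the lag coefficient matrices).
*
compute masums=inv(%varlagsums)
* Covariance matrix of the long-run (accumulated) responses
compute lrsigma=masums*%sigma*tr(masums)
* Choose FACTOR so that the long-run response matrix MASUMS*FACTOR is lower
* triangular: shock 2 (the disinflationary shock) then has no long-run
* effect on variable 1 (GDP).
compute factor=inv(masums)*%decomp(lrsigma)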
Common Structure
All three programs have a common structure (a condensed code sketch follows the list):
1. Estimate the VAR, saving the residuals for use in bootstrapping.
2. Estimate or compute the structural VAR (SVAR) being used.
3. Compute the point estimate of the "sacrifice ratio" implied by the estimates.
4. Set up for the bootstrap draws.
5. Loop over the bootstrap draws, generating bootstrapped data.
6. Inside the bootstrap loop, re-estimate the VAR and SVAR; compute and save the bootstrapped sacrifice ratios.
7. Outside the bootstrap loop, do a density graph and compute statistics on the bootstrapped sacrifice ratios.
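Condensed into code, the skeleton looks roughly like the following (based on the SVAR2.RPF fragments shown later in this section; the factor and sacrifice ratio calculations in steps 2, 3 and 6 are model-specific, and the actual programs include additional options and output):
* (1) Estimate the VAR, saving the residuals for bootstrapping
estimate(model=var2,resids=resids)
compute bstart=%regstart(),bend=%regend()
*
* (2)-(3) Compute the SVAR factor and the point estimate of the
*         sacrifice ratio (model-specific, see below)
*
* (4) Set up for the bootstrap draws
@VARBootSetup(model=var2) bootvar
dec vect[rect] %%responses(nboot)
set sacratios 1 nboot = %na
*
do boot=1,nboot
   * (5) Generate bootstrapped data
   @VARBootDraw(model=var2,resids=resids) bstart bend
   * (6) Re-estimate the VAR, then recompute the SVAR factor and the
   *     IRF's and save the bootstrapped sacrifice ratio into
   *     SACRATIOS(BOOT) (model-specific)
   estimate(model=bootvar,noprint) bstart bend
end do boot
*
* (7) Density graph and statistics on the bootstrapped sacrifice ratios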
They also include graphs of the Impulse Response Functions (IRF's) with bootstrapped confidence intervals, though those aren't really necessary for the calculation of the sacrifice ratios.
Sacrifice Ratio
The "sacrifice ratio" is defined to be the aggregate cost in lost GDP growth to achieve a given level of reduction in inflation. (Note: a reduction in the inflation rate, not in the price level). This is computed over a chosen time span (20 quarters in this paper). How it is computed will depend upon the model. First, it's necessary to determine which shock in the structural VAR represents aggregate demand or fiscal policy. Given the IRF's to that shock, the total change to log GDP over the desired horizon needs to be computed, as well as the response at that horizon to the inflation rate. For instance, the following is the calculation from SVAR2.RPF. The aggregate demand shock is #2, variable #1 is the log growth of GDP, and variable #2 is the first difference of the inflation rate. Both inflation and GDP growth are computed as 100 x the log difference, thus, quarterly percentage changes, not the more commonly quoted annualized growth rates. The IRF's are accumulated to give the response of GDP, and the response of the inflation rate. The response of GDP's are then summed (using the SSTATS instruction) to get the total change in GDP. (Given the impact signs in the model, this will be negative, as will the response of inflation). The ratio of the GDP sum to the response of inflation provides the estimate of the sacrifice ratio.
impulse(factor=factor,steps=nsteps,model=var2,results=irf,noprint)
acc irf(1,2)
acc irf(2,2)
*
sstats 1 nsteps irf(1,2)>>cumgdp
disp "Sacrifice ratio(Point Estimate)" cumgdp/irf(2,2)(nsteps)
Bootstrap Loop
This does NBOOT draws of the model (10000 in these programs). Outside the loop, @VARBootSetup is used to create the parallel bootstrap VAR (called BOOTVAR here). The original estimation range is saved into BSTART and BEND, which will be the range used for the bootstrap draws:
@VARBootSetup(model=var2) bootvar
*
compute bstart=%regstart(),bend=%regend()
The following is the standard setup for saving the generated impulse responses:
dec vect[rect] %%responses(nboot)
Inside the loop, this uses @VARBootDraw to draw bootstrapped data (using the observed pre-sample values).
*
* Draw the new data
*
@VARBootDraw(model=var2,resids=resids) bstart bend
*
* Estimate the VAR on the bootstrapped data
*
estimate(model=bootvar,noprint) bstart bend
The structural VAR is then estimated (or computed) using the covariance matrix from the VAR estimated on the bootstrapped data, and the sacrifice ratio is computed with the calculation appropriate to the model. On the IMPULSE instruction, this uses the FLATTEN option to save the IRF's for later analysis. The sacrifice ratios are saved into elements of a SERIES (called SACRATIOS) with length NBOOT. For instance, for SVAR2.RPF:
compute sacratios(boot)=cumgdp/irf(2,2)(nsteps)
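In context, the remainder of the loop body for SVAR2.RPF looks roughly like the following. This is a sketch assembled from the fragments in this section rather than a verbatim extract; in particular, the FLATTEN target shown here (%%RESPONSES(BOOT)) is the standard setup assumed for @MCProcessIRF.
*
* Recompute the long-run (Blanchard-Quah type) factor from the
* covariance matrix of the VAR estimated on the bootstrapped data
*
compute masums=inv(%varlagsums)
compute factor=inv(masums)*%decomp(masums*%sigma*tr(masums))
*
* Responses to the structural shocks; FLATTEN saves them for
* processing by @MCProcessIRF after the loop
*
impulse(factor=factor,steps=nsteps,model=bootvar,results=irf,$
   flatten=%%responses(boot),noprint)
acc irf(1,2)
acc irf(2,2)
*
* Sum the GDP responses and form the bootstrapped sacrifice ratio
*
sstats 1 nsteps irf(1,2)>>cumgdp
compute sacratios(boot)=cumgdp/irf(2,2)(nsteps)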
Post-Loop Processing
The bootstrapped estimates of the sacrifice ratios are in a series of length NBOOT called SACRATIOS. This first uses @GridSeries to generate a grid of values (a SERIES called XSR) running from -100 to 100 for the density estimation of the sacrifice ratios. DENSITY does the density estimation, using a fairly heavily smoothed estimator, putting the density estimates into the series FSR. This also does a STATISTICS(FRACTILES,..) instruction to get the percentiles of the estimates.
@gridseries(from=-100,to=+100,n=400) xsr
density(smoothing=3.0,grid=input) sacratios 1 nboot xsr fsr
scatter(style=lines,header="Frequency Distribution for the Cecchetti Model")
# xsr fsr
*
stats(fractiles,title="Ceccheti sacrifice ratios") sacratios 1 nboot
Finally, this uses @MCProcessIRF to generate and then graph the responses of GDP and inflation with (here, 90%) confidence bands. Whether the responses need to be accumulated (the ACCUM option) depends upon the form in which the variables enter the VAR.
@mcprocessirf(model=var2,percentiles=||.05,.95||,center=median,$
lower=lower,upper=upper,irf=irf,accum=||1,2||)
spgraph(hfields=2)
graph(header="Real GDP-Ceccheti",number=0) 3
# irf(1,2)
# lower(1,2) / 2
# upper(1,2) / 2
graph(header="Inflation-Ceccheti",number=0) 3
# irf(2,2)
# lower(2,2) / 2
# upper(2,2) / 2
spgraph(done)
Copyright © 2025 Thomas A. Doan