RATS 10.1

BOOTCOINTEGRATION.RPF


BOOTCOINTEGRATION.RPF is an example of bootstrapping an FMOLS estimate of cointegrating vectors. It is based upon the technique described in Li and Maddala (1997).

 

The presumed cointegrating relationship is a money demand function: real money on real GNP and an interest rate. Fully modified least squares first estimates the relationship by least squares, then applies a non-parametric correction for serial correlation and simultaneity to produce an estimate of the cointegrating vector. A block bootstrap is used because there is no parametric model for the serial correlation correction. The first differences and the residuals are bootstrapped together so that the data can be rebuilt.
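In terms of the coefficient vector b estimated below (b(1) is the constant, since DET=CONSTANT is used), the relationship being fitted is

mp(t) = b(1) + b(2)*y(t) + b(3)*r(t) + u(t)

where u(t) is the cointegrating residual that the FM correction adjusts for serial correlation and simultaneity.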

 

Note that, in practice, you would want to first test the three variables for unit roots. That is not included in this example.
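As a sketch of how that might be done (this assumes the standard @DFUNIT procedure is available on your procedure path; the lag length here is only illustrative):

@dfunit(lags=5) mp 1954:1 *
@dfunit(lags=5) y 1954:1 *
@dfunit(lags=5) r 1954:1 *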

 

This does the FM estimates using a 5-lag non-parametric correction for serial correlation. (The @FM procedure treats the first variable listed as the dependent variable.)

 

@fm(lags=5,det=constant) 1954:1 *
# mp y r


This gets the FM residuals and the estimated coefficient vector. Depending upon how you're planning to use the bootstrap, you might want to make B equal to something else (like the value the cointegrating vector would take under a null hypothesis). B is used in building the bootstrapped data.

 

set u = %resids
compute b=%beta
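For instance (purely illustrative, and not part of this example), bootstrapping under a null hypothesis of a unit income elasticity would replace the estimated coefficient on Y before the data are rebuilt:

compute b(2)=1.0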
 

We need the differenced right hand side variables.

 

diff y / dy
diff r / dr
 

This initializes the accumulators used to compute the bootstrapped mean and covariance matrix of the estimates.
 

compute [vector] betamean=%zeros(3,1)
compute [symm]   betacmom=%zeros(3,3)
 

Inside the loop, this draws random entries using a stationary block bootstrap with an expected block length of 10. (In the stationary block bootstrap, the blocks themselves have random lengths; only the expected length is fixed.)

 

boot(block=10,method=stationary) entries 1954:1 *
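For reference, assuming the usual geometric-length implementation of the stationary bootstrap, a block of length l occurs with probability

P(L=l) = (1/10)*(9/10)^(l-1),   l = 1, 2, ...

so the expected block length is 10, which is what BLOCK=10 specifies here.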
  

This builds the bootstrapped data. The right-hand-side variables (Y and R) are rebuilt as random walks with mildly serially correlated shocks due to the block sampling. The dependent variable is rebuilt using the estimated cointegrating vector, with the resampled residuals added (again, mildly serially correlated).

 

set(first=0.0) resampy 1953:4 1988:4 = resampy{1}+dy(entries(t))
set(first=0.0) resampr 1953:4 1988:4 = resampr{1}+dr(entries(t))
set resampmp 1954:1 1988:4 = b(1)+b(2)*resampy+b(3)*resampr+u(entries(t))
 

Re-estimate the model on the resampled data and update the statistics. (BETAMEAN accumulates the difference between the bootstrap estimates and the original value; it is not used in the rest of the program.)

 

@fm(lags=5,det=constant,noprint) 1954:1 *
# resampmp resampy resampr
compute betamean=betamean+%beta-b
compute betacmom=betacmom+%outerxx(%beta-b)
 

Outside the loop, this converts the accumulated information into a covariance matrix and displays the results:

 

compute betacmom=betacmom/ndraws
linreg(create,coeffs=b,covmat=betacmom,noscale,$
   title="Fully Modified LS with Bootstrapped Standard Errors") mp
# constant y r
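In equation form, the covariance matrix handed to LINREG(CREATE) is

betacmom = (1/ndraws) * sum over draws of (%beta - b)*(%beta - b)'

that is, the second moments of the bootstrap estimates around the original FM estimate b, so the reported standard errors are the square roots of its diagonal. (Centering at the bootstrap mean instead would require BETAMEAN, which this program does not use.)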

Full Program

open data kpswdata.rat
calendar(q) 1947
all 1988:04
data(format=rats) / c in y mp dp r
*
@fm(lags=5,det=constant) 1954:1 *
# mp y r
*
* Get the FM residuals and the estimated coefficient vector. Depending
* upon how you're planning to use the bootstrap, you might want to make
* b equal to something else (like the value the cointegrating vector
* would take under a null hypothesis).
*
set u = %resids
compute b=%beta
*
* Difference the RHS endogenous variables
*
diff y / dy
diff r / dr
*
* Initialize a VECTOR and a SYMM to accumulate the estimates and
* covariance matrix.
*
compute [vector] betamean=%zeros(3,1)
compute [symm]   betacmom=%zeros(3,3)
*
compute ndraws=1000
do draws=1,ndraws
   *
   * Do a block bootstrap for the stationary elements.
   *
   boot(block=10,method=stationary) entries 1954:1 *
   *
   * The first differences and the residuals are resampled together.
   * Rebuild the RHS endogenous variables, and then the dependent
   * variable
   *
   set(first=0.0) resampy 1953:4 1988:4 = resampy{1}+dy(entries(t))
   set(first=0.0) resampr 1953:4 1988:4 = resampr{1}+dr(entries(t))
   set resampmp 1954:1 1988:4 = b(1)+b(2)*resampy+b(3)*resampr+u(entries(t))
   *
   * Re-estimate and update the betamean and betacmom matrices
   *
   @fm(lags=5,det=constant,noprint) 1954:1 *
   # resampmp resampy resampr
   compute betamean=betamean+%beta-b
   compute betacmom=betacmom+%outerxx(%beta-b)
end do draws
*
* Convert the information to estimates and display.
*
compute betacmom=betacmom/ndraws
linreg(create,coeffs=b,covmat=betacmom,noscale,$
   title="Fully Modified LS with Bootstrapped Standard Errors") mp
# constant y r
 

Output

Because the second set of output is based upon bootstrapping, the standard errors won't match exactly if the program is run again.
 

Linear Regression - Estimation by Fully Modified LS
Dependent Variable MP
Quarterly Data From 1954:02 To 1988:04
Usable Observations                       139
Degrees of Freedom                        136
Mean of Dependent Variable       -4.590196475
Std Error of Dependent Variable   0.167409973
Standard Error of Estimate        0.024460270
Sum of Squared Residuals         0.0813694560
Durbin-Watson Statistic                0.3465

    Variable                        Coeff      Std Error      T-Stat      Signif
************************************************************************************
1.  Constant                      0.564622231  0.151298455      3.73184  0.00019008
2.  Y                             1.177185493  0.033227206     35.42836  0.00000000
3.  R                            -0.012749079  0.001801586     -7.07659  0.00000000

 

 

Linear Regression - Estimation by Fully Modified LS with Bootstrapped Standard Errors
Dependent Variable MP
Quarterly Data From 1947:01 To 1988:04
Usable Observations                       168
Degrees of Freedom                        165
Mean of Dependent Variable       -4.635467482
Std Error of Dependent Variable   0.182266548
Standard Error of Estimate        0.066633160
Sum of Squared Residuals         0.7325963764
Durbin-Watson Statistic                0.0627

    Variable                        Coeff      Std Error      T-Stat      Signif
************************************************************************************
1.  Constant                      0.564622231  0.010403335     54.27319  0.00000000
2.  Y                             1.177185493  0.049219965     23.91683  0.00000000
3.  R                            -0.012749079  0.001984871     -6.42313  0.00000000


 


Copyright © 2025 Thomas A. Doan