RegWhiteNNTest—White Neural Network linearity test

Posted: Wed Jul 02, 2008 1:22 pm
by TomDoan
This is a regression post-processor to do the White "neural network" test. This tests for a correlation between the regression residuals and randomly chosen non-linear functions of the explanatory variables. To get good results, you will generally use a small number of principal components (2 or 3) from a larger number of hidden nodes (10 to 20), since some of the randomly chosen hidden nodes will have very little variance.

This does not exclude the first principal component as (seems to be) described in Lee, White and Granger. Since the hidden nodes are all chosen randomly, there is no particular reason to assume that the largest PC will have any relationship to the fitted values of the regression.
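For readers who want the logic outside of RATS, the steps the procedure below implements can be sketched in Python. This is an illustration only, assuming numpy/scipy; the function name `white_nn_test` and all variable names are my own, not part of RATS or any package:

```python
import numpy as np
from scipy.stats import chi2

def white_nn_test(y, X, hidden=10, pc=2, seed=0):
    """LM form of the White NN test: n*R^2 from regressing the OLS
    residuals on `pc` principal components of `hidden` randomly
    weighted logistic hidden nodes. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    # Residuals from the original regression of y on X
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    # Rescale each regressor to [0,1]; for a constant column, divide by
    # its maximum instead (mirrors the minx(i)=0 fix in the procedure)
    lo, hi = X.min(axis=0), X.max(axis=0)
    lo = np.where(hi == lo, 0.0, lo)
    Xs = (X - lo) / (hi - lo)
    # Hidden-node outputs: logistic squash of random linear combinations,
    # with weights drawn uniformly on [-2,2]
    gamma = rng.uniform(-2.0, 2.0, size=(k, hidden))
    Z = 1.0 / (1.0 + np.exp(-(Xs @ gamma)))
    # Principal components of the hidden nodes (largest eigenvalues first;
    # the first PC is NOT excluded, as discussed above)
    eigval, eigvec = np.linalg.eigh(Z.T @ Z)
    P = Z @ eigvec[:, ::-1][:, :pc]
    # Auxiliary regression of u on [X, PCs]; since u is already orthogonal
    # to X, the restricted RSS is just u'u
    rssr = u @ u
    W = np.hstack([X, P])
    coef, *_ = np.linalg.lstsq(W, u, rcond=None)
    e = u - W @ coef
    rssu = e @ e
    lm = n * (1.0 - rssu / rssr)
    return lm, chi2.sf(lm, pc)
```

Under the null of a correctly specified linear model, the LM statistic is asymptotically chi-squared with `pc` degrees of freedom, matching the %CDSTAT/%SIGNIF pair the procedure defines.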

Code:

*
* @RegWhiteNNTest(options)
* Regression post-processor to do the White Neural Network test.
*
* Options:
*   [PRINT]/NOPRINT
*   HIDDEN=number of (randomized) hidden nodes [2]
*   PC=number of principal components of the hidden nodes [all]
*   NREPS=number of repetitions [1]
*
* References:
*   Lee, White and Granger(1992), "Testing for Neglected Non-linearities in Time Series
*    Models," J. of Econometrics, vol 56, 269-290.
*
* Revision Schedule:
*   06/2008 Written by Tom Doan, Estima
*
* Variables Defined:
*   %CDSTAT and %SIGNIF are the LM version of the test statistic and its significance level.
*
procedure RegWhiteNNTest y start end
type series y
type integer start end
*
option switch  print  1
option integer hidden 2
option integer pc
option integer nreps  1
*
local integer  i j startl endl h pch xc reps
local vect     maxx minx gamma eigval
local vect[series] z
local series   u
local symm     zxx
local rect     vhat tt eigvect
local real     rssr rssu fstat chisqr
*
dim maxx(%nreg) minx(%nreg) gamma(%nreg)
compute startl=%regstart(),endl=%regend()
set u startl endl = %resids
*
* Compute the max and min for each regressor
*
do i=1,%nreg
   sstats(max) %regstart() %regend() %eqnxvector(0,t)(i)>>maxx(i)
   sstats(min) %regstart() %regend() %eqnxvector(0,t)(i)>>minx(i)
   *
   * For a constant regressor (such as the intercept), set the minimum
   * to zero so the rescaling below divides by the maximum rather than zero.
   *
   if maxx(i)==minx(i)
      comp minx(i)=0.0
end do i
*
compute h=hidden
if %defined(pc)
   compute pch=pc
else
   compute pch=h
*
dim z(h)
do reps=1,nreps
   *
   * Generate the squashed randomized output from the hidden nodes
   *
   do j=1,h
      compute gamma=%uniform(-2.0,2.0)
      set z(j) startl endl = 0.0
      do i=1,%nreg
         set z(j) startl endl = z(j)+gamma(i)*(%eqnxvector(0,t)(i)-minx(i))/(maxx(i)-minx(i))
      end do i
      set z(j) startl endl = %logistic(z(j),1.0)
   end do j
   *
   * Compute the cross product of the outputs, the original regressors and the
   * regression residuals
   *
   cmom startl endl
   # z %rlfromtable(%eqntable(0)) u
   *
*  compute vhat=%sweeplist(%cmom,%seq(h+1,h+%nreg))
   compute vhat=%cmom
   if pch<h {
      compute zxx=%xsubmat(vhat,1,h,1,h)
      eigen zxx eigval eigvect
      compute tt=%xsubmat(eigvect,1,h,1,pch)~\%identity(%nreg+1)
      compute vhat=%mqform(%cmom,tt)
      compute xc=pch
   }
   else {
      compute vhat=%cmom
      compute xc=h
   }
   *
   compute rssr=vhat(%nreg+xc+1,%nreg+xc+1)
   compute vhat=%sweeptop(vhat,%nreg+xc)
   compute rssu=vhat(%nreg+xc+1,%nreg+xc+1)
   compute fstat=(rssr-rssu)*(%ndf-xc)/(xc*rssu)
   compute chisqr=%nobs*(1.0-rssu/rssr)
   *
   if print {
      disp "White Neural Network Test"
      disp "Using "+pch+" principal components from "+h+" hidden nodes"
      disp "F("+xc+","+(%ndf-xc)+")=" @13 #####.### fstat @25 "Significance Level" #.####### %ftest(fstat,xc,%ndf-xc)
      disp "LM("+xc+")=" @13 #####.### chisqr @25 "Significance Level" #.####### %chisqr(chisqr,xc)
   }
   *
   compute %cdstat=chisqr,%signif=%chisqr(%cdstat,xc)
end do reps
end
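The %SWEEPTOP step near the end of the procedure pivots the cross-product matrix so that its bottom-right corner goes from the restricted RSS (u'u) to the unrestricted RSS of the auxiliary regression. A minimal way to see the idea, assuming numpy (the name `sweep_top` is my own), is the Schur complement, which is what sweeping out the first k rows/columns leaves behind:

```python
import numpy as np

def sweep_top(C, k):
    """Sweep out the first k rows/columns of a cross-product matrix C.
    The bottom-right block becomes the Schur complement: the residual
    cross-product after projecting on the first k variables."""
    A, B, D = C[:k, :k], C[:k, k:], C[k:, k:]
    return D - B.T @ np.linalg.solve(A, B)
```

With C built as [W u]'[W u] (the way CMOM stacks the regressors and the residual series), the (last,last) entry of `sweep_top(C, k)` equals the RSS from regressing u on W, which is exactly how the procedure obtains RSSU from the swept matrix.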