Examples / NLLS.RPF
NLLS.RPF is an example of non-linear least squares. It is adapted from Examples 6.7 and 6.8 of Martin, Hurn and Harris (2013).
The model being estimated is
\(C = \beta _0 + \beta _1 YD^{\beta _2 } \)
This has obvious guess values: with \(\beta _2 = 1\), the other coefficients can be estimated by least squares. The NONLIN and FRML instructions describe the parameter set and the model (given the chosen names for the parameters). LINREG estimates the restricted model, and COMPUTE copies the estimated coefficients into the parameters. NLLS then estimates the non-linear model:
nonlin beta0 beta1 beta2
frml nlconst cons = beta0+beta1*inc^beta2
linreg cons
# constant inc
compute beta0=%beta(1),beta1=%beta(2),beta2=1.0
nlls(frml=nlconst)
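The same two-step strategy (a linear regression for starting values, then non-linear least squares) can be sketched in Python. This is only an illustration of the technique, not the RATS implementation; the income and consumption series are simulated here, since the example's usdata.txt data are not included:

```python
import numpy as np
from scipy.optimize import curve_fit

# Simulated stand-ins for the example's income and consumption series
rng = np.random.default_rng(0)
inc = rng.uniform(1000.0, 10000.0, 200)
cons = 300.0 + 0.29 * inc**1.12 + rng.normal(0.0, 80.0, 200)

# Step 1: OLS of cons on [1, inc] supplies guess values for beta0 and beta1
X = np.column_stack([np.ones_like(inc), inc])
b0_ols, b1_ols = np.linalg.lstsq(X, cons, rcond=None)[0]

# Step 2: non-linear least squares on cons = b0 + b1*inc^b2,
# starting from the OLS estimates with beta2 = 1
def model(yd, b0, b1, b2):
    return b0 + b1 * yd**b2

beta, cov = curve_fit(model, inc, cons, p0=[b0_ols, b1_ols, 1.0])
```

Because the restricted model is linear in the remaining parameters, the starting values put the optimizer close to the solution, which is why the guess-value step matters.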
The example program then tests for linearity (that is, \(\beta_2=1\)) in two ways. First, with a Wald test, which operates on the results of the NLLS:
test(title="Test of linearity")
# 3
# 1.0
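For a single restriction such as \(\beta_2=1\), the Wald statistic reduces to the squared t-ratio \(\left((\hat\beta_2 - 1)/\mathrm{se}(\hat\beta_2)\right)^2\), compared against a \(\chi^2(1)\). A hedged Python sketch, plugging in the (rounded) estimate and standard error reported in the output of this example:

```python
from scipy.stats import chi2

# NLLS estimate and standard error of beta2 from the example's output
b2, se = 1.12408729, 0.00015266

wald = ((b2 - 1.0) / se) ** 2   # squared t-ratio for the single restriction
pval = chi2.sf(wald, df=1)
```

The statistic matches the reported value (about 660702) up to rounding of the printed coefficients.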
A likelihood ratio test can be constructed easily since NLLS and LINREG both compute the log likelihood and put it into %LOGL. Save the %LOGL from the unrestricted model, re-estimate the restricted one, and use CDF (with degrees of freedom=1) to do the test:
compute loglunr=%logl
*
linreg cons
# constant inc
*
cdf(title="LR Test of linearity") chisqr 2.0*(loglunr-%logl) 1
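The arithmetic of the LR test is simple enough to check by hand: twice the difference between the unrestricted and restricted log likelihoods, referred to a \(\chi^2(1)\). Using the two log likelihood values reported in this example's output:

```python
from scipy.stats import chi2

logl_unr = -1162.3071   # NLLS (unrestricted) log likelihood from the output
logl_res = -1204.6450   # LINREG (restricted) log likelihood from the output

lr = 2.0 * (logl_unr - logl_res)   # LR statistic, about 84.68
pval = chi2.sf(lr, df=1)
```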
Full Program
open data usdata.txt
calendar(q) 1960:1
data(format=prn,nolabels,org=columns) 1960:01 2009:04 cons inc
*
nonlin beta0 beta1 beta2
frml nlconst cons = beta0+beta1*inc^beta2
*
linreg cons
# constant inc
compute beta0=%beta(1),beta1=%beta(2),beta2=1.0
nlls(frml=nlconst)
*
* Wald test for beta2=1
*
test(title="Test of linearity")
# 3
# 1.0
*
* Likelihood ratio test
*
compute loglunr=%logl
*
linreg cons
# constant inc
*
cdf(title="LR Test of linearity") chisqr 2.0*(loglunr-%logl) 1
*
* LM test
* Generate the derivative with respect to beta2 at beta2=1. The other derivatives
* are just the two explanatory variables.
*
set dbeta2 = beta1*inc*log(inc)
*
* Generate an LM (outer-product-of-gradient) statistic
*
mcov(opgstat=lmstat) / %resids
# constant inc dbeta2
cdf(title="LM test of linearity") chisqr lmstat 1
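The outer-product-of-gradient (OPG) form of the LM statistic can also be reproduced outside RATS. Stack the scores \(s_t = u_t g_t\) (restricted residuals times the gradient columns, the third being \(\beta_1 \, inc_t \log inc_t\) evaluated at \(\beta_2=1\)), then compute \(LM = \iota' S (S'S)^{-1} S' \iota\). A sketch with simulated data standing in for the example's series:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
inc = rng.uniform(1000.0, 10000.0, n)
cons = 300.0 + 0.29 * inc**1.12 + rng.normal(0.0, 80.0, n)

# Restricted (linear) OLS fit, with beta2 held at 1
X = np.column_stack([np.ones(n), inc])
b = np.linalg.lstsq(X, cons, rcond=None)[0]
u = cons - X @ b                      # restricted residuals

# Gradient columns at the restricted estimates: [1, inc, b1*inc*log(inc)]
G = np.column_stack([np.ones(n), inc, b[1] * inc * np.log(inc)])

# OPG statistic: iota' S (S'S)^{-1} S' iota, with scores S_t = u_t * g_t
S = u[:, None] * G
iota = np.ones(n)
lm = iota @ S @ np.linalg.solve(S.T @ S, S.T @ iota)
```

The statistic is compared with a \(\chi^2(1)\); it is algebraically n times the uncentered R² from regressing a vector of ones on the scores, so it cannot exceed n.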
Output
This is the preliminary least squares regression used to get the guess values:
Linear Regression - Estimation by Least Squares
Dependent Variable CONS
Quarterly Data From 1960:01 To 2009:04
Usable Observations 200
Degrees of Freedom 198
Centered R^2 0.9981108
R-Bar^2 0.9981013
Uncentered R^2 0.9996600
Mean of Dependent Variable 4906.7400000
Std Error of Dependent Variable 2304.4552354
Standard Error of Estimate 100.4149874
Sum of Squared Residuals 1996467.5986
Regression F(1,198) 104609.5461
Significance Level of F 0.0000000
Log Likelihood -1204.6450
Durbin-Watson Statistic 0.2516
Variable Coeff Std Error T-Stat Signif
************************************************************************************
1. Constant -228.5400745 17.3927175 -13.13999 0.00000000
2. INC 0.9498154 0.0029367 323.43399 0.00000000
And these are the estimates of the non-linear model:
Nonlinear Least Squares - Estimation by Gauss-Newton
Convergence in 39 Iterations. Final criterion was 0.0000000 <= 0.0000100
Dependent Variable CONS
Quarterly Data From 1960:01 To 2009:04
Usable Observations 200
Degrees of Freedom 197
Centered R^2 0.9987629
R-Bar^2 0.9987503
Uncentered R^2 0.9997774
Mean of Dependent Variable 4906.7400000
Std Error of Dependent Variable 2304.4552354
Standard Error of Estimate 81.4634087
Sum of Squared Residuals 1307348.5306
Log Likelihood -1162.3071
Durbin-Watson Statistic 0.4081
Variable Coeff Std Error T-Stat Signif
************************************************************************************
1. BETA0 299.01976924 0.59968767 498.62584 0.00000000
2. BETA1 0.28931296 0.00042143 686.49755 0.00000000
3. BETA2 1.12408729 0.00015266 7363.35287 0.00000000
This is the output from the Wald test:
Test of linearity
Chi-Squared(1)= 660702.158537 with Significance Level 0.00000000
This is the re-estimation of the linear model, needed for the likelihood ratio test:
Linear Regression - Estimation by Least Squares
Dependent Variable CONS
Quarterly Data From 1960:01 To 2009:04
Usable Observations 200
Degrees of Freedom 198
Centered R^2 0.9981108
R-Bar^2 0.9981013
Uncentered R^2 0.9996600
Mean of Dependent Variable 4906.7400000
Std Error of Dependent Variable 2304.4552354
Standard Error of Estimate 100.4149874
Sum of Squared Residuals 1996467.5986
Regression F(1,198) 104609.5461
Significance Level of F 0.0000000
Log Likelihood -1204.6450
Durbin-Watson Statistic 0.2516
Variable Coeff Std Error T-Stat Signif
************************************************************************************
1. Constant -228.5400745 17.3927175 -13.13999 0.00000000
2. INC 0.9498154 0.0029367 323.43399 0.00000000
and these are the outputs from the likelihood ratio and LM tests for linearity. While quite different numerically from the Wald test, all three are overwhelming rejections.
LR Test of linearity
Chi-Squared(1)= 84.675671 with Significance Level 0.00000000
LM test of linearity
Chi-Squared(1)= 53.859614 with Significance Level 0.00000000
Copyright © 2026 Thomas A. Doan