Statistics and Algorithms / Non-Linear Optimization / Gauss–Newton algorithm
Gauss–Newton is the primary optimization method for the instructions NLLS and BOXJENK and is an option on DLM; NLSYSTEM uses an extension of it to multiple equations. The method is specific to non-linear least squares estimation: it solves problems of the form
\(\mathop {\min }\limits_\theta \sum\limits_{t = 1}^T {u_t^2} \) where \({u_t} = f\left( {{y_t},\theta } \right)\)
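For example, a non-linear regression model \({y_t} = g({x_t},\theta ) + {\varepsilon _t}\) fits this framework with \({u_t} = {y_t} - g({x_t},\theta )\); the Python sketch below uses the special case \(g({x_t},\theta ) = {\theta _1}{e^{{\theta _2}{x_t}}}\).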
Gauss–Newton is based upon a linearization of the residuals around \({\theta _k}\), with the partial derivatives evaluated at \({\theta _k}\):
\({u_t}({\theta _k}) = - \left[ {\frac{{\partial {u_t}}}{{\partial \theta }}} \right]\left( {\theta - {\theta _k}} \right) + {u_t}(\theta )\)
This takes the form of a least squares regression of \({u_t}({\theta _k})\) on the partial derivatives and is the same as a hill-climbing step (after converting to a maximization) with
\({\bf{G}} = - {\left( {\sum\limits_{t = 1}^T {{{\left( {\frac{{\partial {u_t}}}{{\partial \theta }}} \right)}^\prime }\left( {\frac{{\partial {u_t}}}{{\partial \theta }}} \right)} } \right)^{ - 1}}\)
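As an illustration, the following is a minimal sketch in Python/NumPy (not RATS code, and not taken from this documentation) of the resulting iteration for the exponential model mentioned above: each step regresses \({u_t}({\theta _k})\) on the partial derivatives and updates the parameters with the resulting coefficients. The model, function names, and simulated data are all hypothetical.

```python
import numpy as np

# Hypothetical example model: y_t = b1*exp(b2*x_t) + e_t, so the residual is
# u_t(theta) = y_t - b1*exp(b2*x_t).
def residuals(theta, x, y):
    b1, b2 = theta
    return y - b1 * np.exp(b2 * x)

def jacobian(theta, x):
    # Rows are du_t/dtheta = [-exp(b2*x_t), -b1*x_t*exp(b2*x_t)],
    # evaluated at the current theta.
    b1, b2 = theta
    e = np.exp(b2 * x)
    return np.column_stack((-e, -b1 * x * e))

def gauss_newton(theta0, x, y, max_iter=100, tol=1e-10):
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        u = residuals(theta, x, y)          # u_t(theta_k)
        J = jacobian(theta, x)              # partial derivatives at theta_k
        # Regression of u_t(theta_k) on the partials: the least-squares
        # solution of J*d = u is d = (J'J)^{-1} J'u, and the Gauss-Newton
        # update is theta_{k+1} = theta_k - d.
        d = np.linalg.lstsq(J, u, rcond=None)[0]
        theta = theta - d
        if np.max(np.abs(d)) < tol:
            break
    return theta

# Simulated data; the estimates should end up close to (2.0, 1.5).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = 2.0 * np.exp(1.5 * x) + 0.05 * rng.standard_normal(x.size)
print(gauss_newton([1.0, 1.0], x, y))
```

The least-squares solve plays the role of \({\bf{G}}\) applied to the gradient, so no Hessian of the original objective is ever computed.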
The objective function for instrumental variables (with Z as the instruments) is
\(\left( {\sum\limits_{t = 1}^T {{u_t}{Z_t}} } \right){\bf{W}}\left( {\sum\limits_{t = 1}^T {{{Z'}_t}{u_t}} } \right)\)
where \({\bf{W}}\) is a weighting matrix, typically
\({\bf{W}} = {\left( {\sum {{{Z'}_t}{Z_t}} } \right)^{ - 1}}\)
The same type of linearization gives
\({\bf{G}} = - {\left( {\left( {\sum\limits_{t = 1}^T {{{\left( {\frac{{\partial {u_t}}}{{\partial \theta }}} \right)}^\prime }{Z_t}} } \right){\bf{W}}\left( {\sum\limits_{t = 1}^T {{{Z'}_t}\left( {\frac{{\partial {u_t}}}{{\partial \theta }}} \right)} } \right)} \right)^{ - 1}}\)
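A corresponding sketch for the instrumental variables case (again Python/NumPy with hypothetical names, not RATS code): given the residual vector, the matrix of partial derivatives, and the instruments, it forms \({\bf{W}}\) and \({\bf{G}}\) as above and returns the parameter update.

```python
import numpy as np

def iv_gn_step(u, J, Z):
    """One Gauss-Newton type step for the IV objective.

    u : (T,)   residuals u_t(theta_k)
    J : (T, p) rows are the partial derivatives du_t/dtheta at theta_k
    Z : (T, m) rows are the instruments Z_t
    """
    W = np.linalg.inv(Z.T @ Z)      # W = (sum Z_t' Z_t)^{-1}
    A = J.T @ Z                     # sum_t (du_t/dtheta)' Z_t
    g = A @ W @ (Z.T @ u)           # (sum J_t' Z_t) W (sum Z_t' u_t)
    # G from the text is -(A W A')^{-1}; the update added to theta_k is G @ g.
    return -np.linalg.solve(A @ W @ A.T, g)

# Usage: theta_next = theta_k + iv_gn_step(u, J, Z)
```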