RATS 10.1

BHHH is a hill-climbing algorithm that implements the choice of \({\bf{G}}\) recommended in Berndt, Hall, Hall and Hausman (1974). Because it can be applied only to a specific type of optimization problem, the only RATS instructions that can use it are MAXIMIZE and GARCH. The function being maximized has the form:

 

\(F\left( \theta  \right) = \sum\limits_{t = 1}^T {f\left( y_t ,\theta  \right)} \)

 

where \(F\) must be the actual log likelihood function (possibly omitting additive constants). It chooses \({\bf{G}}\) as \({{\bf{J}}^{ - 1}}\), where


\({\bf{J}} = \sum\limits_{t = 1}^T {\left[ \frac{\partial f}{\partial \theta }\left( y_t ,\theta  \right)^\prime \,\frac{\partial f}{\partial \theta }\left( y_t ,\theta  \right) \right]} \)


Under fairly general conditions, \(-{\bf{J}}\) will have the same asymptotic limit (when divided by \(T\)) as the Hessian \({\bf{H}}\). The derivation uses the information equality, which is why \(F\) is required to be the actual log likelihood.
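For reference, this produces the hill-climbing update sketched below, a Newton-type step with \({{\bf{J}}^{ - 1}}\) standing in for \(-{{\bf{H}}^{ - 1}}\); the gradient \({\bf{g}}\) of \(F\) and the step size \(\lambda\) are notation introduced here for illustration only.

\({\theta ^{(k + 1)}} = {\theta ^{(k)}} + \lambda \,{\bf{G}}\,{\bf{g}}\left( {\theta ^{(k)}} \right),\quad {\bf{G}} = {{\bf{J}}^{ - 1}}\)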

 

If the function is not the log likelihood, the algorithm will still usually end up at the correct maximum, as a good choice of \(\bf{G}\) generally affects only how quickly the hill is climbed. Convergence, however, is likely to be slow, and the standard errors and covariance matrix will be incorrect.
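As a hedged illustration only (the series name X, the parameter names MU and SIGSQ, and the starting values are placeholders, not taken from this description), a MAXIMIZE call that selects this algorithm through the METHOD option might look like:

* Gaussian log likelihood for a series X, written observation by
* observation; the additive constant -0.5*log(2*pi) is omitted.
nonlin mu sigsq
compute mu=0.0, sigsq=1.0
frml logl = -0.5*log(sigsq) - 0.5*(x-mu)*(x-mu)/sigsq
maximize(method=bhhh) logl

Because only an additive constant is dropped, the reported function value differs from the full log likelihood by a constant, which changes neither the estimates nor the BHHH covariance matrix.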
 


Copyright © 2025 Thomas A. Doan