*
* ForcedFactor computes a factorization of a covariance matrix that includes
* a (scale of a) specified column in the factorization, or, optionally, a
* scale of a specified row in the inverse of the factorization. This lets you
* either force an orthogonalized component to hit the variables in a specific
* pattern (done by setting a column of the factorization), or force an
* orthogonalized component to be formed from a particular linear combination
* of innovations (done by setting a row of the inverse). You set the one
* component, and ForcedFactor builds a factorization around it. This is
* useful if you want to do error decompositions or anything else that
* requires a full orthogonalization, but don't wish to build a full model.
*
* Syntax:
*
* @ForcedFactor( options ) sigma a f
*
* Parameters:
*   sigma   SYMMETRIC - covariance matrix to be factored (input)
*   a       VECTOR - forced component; its dimension should equal the
*           order of sigma (input)
*   f       RECTANGULAR - computed factor of sigma, so sigma=f*tr(f) (output)
*
* Options:
* force=[row]/column indicates whether a scale of "a" is to be a row in the
* inverse or a column in the factorization itself.
*
* Examples:
*
* In a four variable system where the first two variables are two interest rates,
*
* @ForcedFactor sigma ||1.0,-1.0,0.0,0.0|| f1
* @ForcedFactor(force=column) sigma ||1.0,1.0,0.0,0.0|| f2
*
* f1 will be a factorization where the first orthogonal component is the
* (non-orthogonal) innovation in the difference between the rates. f2 will be
* a factorization where the first orthogonal component loads equally onto the
* two interest rates, and hits none of the other variables contemporaneously.
*
* Revision Schedule
* 09/02 Written by Estima
*
procedure ForcedFactor sigma a f
*
type symm sigma
type vector a
type rect *f
*
option choice force 1 row column
*
local integer i j n
local rect p b
local vector h bi bj gi
local real hh dd
local real bb
*
compute n=%rows(sigma)
*
* The idea behind this is to start with a factorization of sigma as PP'.
* Here we choose the one based upon eigenvectors. We then compute a
* Gram-Schmidt orthonormalization of a set of linearly independent vectors,
* with the first column replaced by a properly chosen transformation of the
* input vector. (We use the columns of P as the basis, mainly because, in
* practice, that should avoid any problems with the forced column being
* linearly dependent on the others.) Premultiplying the resulting
* orthonormal matrix by P gives the desired factor: the replacement first
* column is P**-1 x a when forcing a column of the factor, and tr(P) x a
* when forcing a row of the inverse.
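*
The construction described in the comment above can be restated compactly (writing Sigma for sigma, e_1 for the first unit vector, and primes for transposes, as in the RATS code):

```latex
\begin{align*}
\Sigma &= PP',\quad F = PB,\quad B'B = I
  &&\Longrightarrow\quad FF' = PBB'P' = PP' = \Sigma\\
b_1 &= P^{-1}a/\lVert P^{-1}a\rVert
  &&\Longrightarrow\quad Fe_1 = Pb_1 = a/\lVert P^{-1}a\rVert
  \quad\text{(forced column)}\\
b_1 &= P'a/\lVert P'a\rVert
  &&\Longrightarrow\quad e_1'F^{-1} = b_1'P^{-1} = a'PP^{-1}/\lVert P'a\rVert = a'/\lVert P'a\rVert
  \quad\text{(forced row)}
\end{align*}
```

Any orthonormal B completes the factorization; only its first column is pinned down by the forced component.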
*
eigen(scale) sigma * p
*
* Transform the forced vector into the coordinates of P.
*
if force==1
compute h=tr(p)*a
else
compute h=inv(p)*a
*
compute hh=sqrt(%dot(h,h))
*
dim b(n,n)
compute b=%const(0.0)
*
do i=1,n
compute b(i,1)=h(i)/hh
end do i
*
* Gram-Schmidt: fill in the remaining columns of B with the columns of P,
* orthonormalized against the columns already in place.
*
do j=2,n
compute bj=%xcol(p,j)
compute bi=bj
do i=1,j-1
compute gi=%xcol(b,i)
compute dd=%dot(bj,gi)
compute bi=bi-dd*gi
end do i
compute bb=sqrt(%dot(bi,bi))
do i=1,n
compute b(i,j)=bi(i)/bb
end do i
end do j
*
* With B orthonormal, tr(inv(B))=B, so both branches premultiply P by an
* orthonormal matrix, which preserves sigma=f*tr(f).
if force==1
compute f=p*tr(inv(b))
else
compute f=p*b
end
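
For checking the algebra outside RATS, here is a minimal NumPy sketch of the same construction. It is an illustrative port, not part of the procedure: the function name `forced_factor` and the use of `numpy.linalg.eigh` for the eigen factorization are my own choices.

```python
import numpy as np

def forced_factor(sigma, a, force="row"):
    """Factor sigma = F F' with a scale of `a` forced into the result.

    force="row":    a scale of a' becomes the first row of inv(F)
    force="column": a scale of a  becomes the first column of F
    Illustrative NumPy port of the @ForcedFactor logic.
    """
    # Eigen factorization: sigma = P P' with P = V diag(sqrt(lambda))
    lam, v = np.linalg.eigh(sigma)
    p = v * np.sqrt(lam)
    # Transform the forced vector into the coordinates of P
    h = p.T @ a if force == "row" else np.linalg.solve(p, a)
    n = sigma.shape[0]
    b = np.zeros((n, n))
    b[:, 0] = h / np.linalg.norm(h)
    # Gram-Schmidt the columns of P against the columns built so far
    for j in range(1, n):
        bi = p[:, j].copy()
        for i in range(j):
            bi -= (p[:, j] @ b[:, i]) * b[:, i]
        b[:, j] = bi / np.linalg.norm(bi)
    # B is orthonormal, so inv(B)' equals B up to roundoff; the row case
    # uses the explicit inverse, mirroring the RATS code
    return p @ np.linalg.inv(b).T if force == "row" else p @ b
```

Either setting returns an f with f*tr(f) equal to sigma; force="row" pins the first row of inv(f) to a scale of a', while force="column" pins the first column of f to a scale of a.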