Singular Value Decomposition
Ken-Cogger
- Posts: 13
- Joined: Wed Feb 15, 2012 7:58 pm
Singular Value Decomposition
%svdecomp(A) decomposes matrix A into A=U*D(w)*V'.
Is there a simple way in RATS to replace the elements of D by their
reciprocals, but by 0.0 if an element w is close to zero?
Or is the only way a do loop?
Thanks,
Ken.
Re: Singular Value Decomposition
Something like that, since you need to test for machine zero:
ewise eigval(i)=%if(eigval(i)<=1.e-12,0.0,1.0/eigval(i))
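In RATS terms the EWISE line above is the whole answer. For readers more at home in NumPy, here is a sketch of the same rule applied to a full pseudoinverse; the function name `pinv_from_svd` is mine for illustration, not a RATS or NumPy routine:

```python
import numpy as np

def pinv_from_svd(A, tol=1e-12):
    """Moore-Penrose pseudoinverse built from an SVD, zeroing the
    reciprocals of singular values at or below `tol` -- the same rule
    as the EWISE line above."""
    U, w, Vt = np.linalg.svd(A, full_matrices=False)
    w_inv = np.where(w <= tol, 0.0, 1.0 / w)   # reciprocal, or 0 near machine zero
    return Vt.T @ (w_inv[:, None] * U.T)        # V * diag(w_inv) * U'
```

For a well-conditioned matrix this reproduces the ordinary inverse; for a singular one it returns the pseudoinverse instead of blowing up.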
Ken-Cogger
- Posts: 13
- Joined: Wed Feb 15, 2012 7:58 pm
Re: Singular Value Decomposition
Thank you.
However, I may not need to use ewise:
In OLS, the normal equations are X'X*Beta = X'y.
In one of my test cases, X'X (4 x 4) is perfectly singular, but Beta is returned by linreg with no reported error.
(|X'X| is reported as -3.3E-10, which is consistent with singularity)
The SVD of X'X reports a diagonal element of ~7.5E-14 in svd(2), again consistent with singularity.
RATS reports OLS estimates identical to what you get by simply setting the corresponding coefficient equal to zero.
The SVD solution, setting the inverse of the offending element to 0 and the others to the reciprocals of the elements of svd(2), gives another solution.
The RATS and SVD solutions give the same sum of squared errors, but, of course,
the SVD solution gives the minimum possible ||Beta||^2, in accordance with theory.
Is there a reason for the RATS choice over the SVD choice of Beta solutions?
Thanks,
Ken.
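To make Ken's comparison concrete, here is a small NumPy sketch with made-up data (not his test case): X has a redundant column, so X'X is singular; the pseudoinverse (SVD) solution and the drop-a-coefficient solution fit equally well, but the SVD one has the smaller coefficient norm.

```python
import numpy as np

# Column 2 equals 2 * column 1, so X'X (4x4) has rank 3 and is singular.
X = np.array([[1., 1., 2., 0.],
              [1., 2., 4., 1.],
              [1., 3., 6., 1.],
              [1., 4., 8., 0.]])
y = np.array([1., 2., 2., 3.])

# SVD route: minimum-norm least-squares solution via the pseudoinverse.
beta_svd = np.linalg.pinv(X) @ y

# Drop-a-coefficient route: zero out the redundant regressor, mimicking
# what RATS reports when the Cholesky factorization hits the singularity.
beta_drop = np.zeros(4)
keep = [0, 1, 3]
beta_drop[keep] = np.linalg.lstsq(X[:, keep], y, rcond=None)[0]

rss_svd  = np.sum((y - X @ beta_svd)**2)
rss_drop = np.sum((y - X @ beta_drop)**2)
```

Both betas give identical fitted values (they project y onto the same column space), so the residual sums of squares agree; only the coefficient norms differ.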
Re: Singular Value Decomposition
Because most of the time that X'X is singular, it's a mistake. Assigning a zero coefficient and standard error to the coefficient at which the singularity is detected during the Cholesky factorization makes that mistake more obvious, whereas an SVD treatment would give every coefficient a non-zero estimate and a non-zero standard error.
The Cholesky approach is also much faster, and its detection of the singularity is clearer.
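A sketch (again in NumPy, not RATS internals) of the detection difference described above: on a singular, hence not positive definite, X'X the Cholesky factorization fails outright at a specific pivot, while an SVD merely yields a singular value near machine zero that must be compared against a tolerance.

```python
import numpy as np

XtX = np.array([[1., 1.],
                [1., 1.]])   # singular 2x2 stand-in for X'X

try:
    np.linalg.cholesky(XtX)
    cholesky_ok = True
except np.linalg.LinAlgError:
    cholesky_ok = False      # singularity flagged immediately, at the zero pivot

w = np.linalg.svd(XtX, compute_uv=False)
# w[-1] is ~0: with SVD the singularity must be inferred from a tolerance test
```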