Question

I'm trying to come up with an algorithm that will allow me to generate a random N-dimensional real-valued vector that's linearly independent with respect to a set of already-generated vectors. I don't want to force them to be orthogonal, only linearly independent. I know Gram-Schmidt exists for the orthogonalization problem, but is there a weaker form that only gives you linearly independent vectors?


Solution

Step 1. Generate random vector vr.

Step 2. Copy vr to vo and update it as follows: for every already-generated vector vi in v1, v2, ..., vn, subtract from vo its projection onto vi.

The result is a random vector orthogonal to the subspace spanned by v1, v2, ..., vn (strictly speaking, this holds when the vi are mutually orthogonal; otherwise, project onto an orthonormal basis of their span, e.g. one obtained by Gram-Schmidt). If v1, v2, ..., vn already span the whole space, then vo is the zero vector, of course :)

Whether the initial vector vr is linearly independent of the existing set can then be decided by comparing the norm of vo to the norm of vr. If vr is linearly dependent on the existing vectors, the norm of vo will be zero or nearly zero (numerical precision issues may leave a small nonzero number on the order of a few times machine epsilon; the threshold can be tuned in an application-dependent way).

Pseudocode:

vr = random_vector()
vo = vr
for v in (v1, v2, ... vn):
    vo = vo - dot( vo, v ) / dot( v, v ) * v
if norm(vo) < k1 * norm(vr):
    # this vector was mostly contained in the spanned subspace
else:
    # linearly independent, go ahead and use

Here k1 is a very small number, 1e-8 to 1e-10 perhaps?
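
For concreteness, here is a runnable NumPy version of the pseudocode (the function name, the dimension N, and the default k1 are just placeholders of my choosing):

import numpy as np

rng = np.random.default_rng()

def try_new_vector(existing_vectors, N, k1=1e-9):
    vr = rng.standard_normal(N)                  # step 1: random candidate vector
    vo = vr.copy()
    for v in existing_vectors:                   # step 2: v1, v2, ..., vn
        vo -= np.dot(vo, v) / np.dot(v, v) * v   # subtract projection of vo onto v
    if np.linalg.norm(vo) < k1 * np.linalg.norm(vr):
        return None                              # mostly contained in the spanned subspace
    return vr                                    # linearly independent, go ahead and use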

You can also go by the angle between vr and the subspace: in that case, calculate it as theta = arcsin(norm(vo) / norm(vr)). Angles substantially different from zero correspond to linearly independent vectors.
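
A sketch of that angle test, assuming vo and vr have been computed as above (the 1e-6 radian threshold is just an illustrative value):

sin_theta = min(np.linalg.norm(vo) / np.linalg.norm(vr), 1.0)  # clamp against rounding past 1
theta = np.arcsin(sin_theta)                                   # angle between vr and the subspace
accept = theta > 1e-6                                          # tune the threshold per application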

OTHER TIPS

A somewhat OTT scheme is to generate an NxN non-singular matrix and use its columns (or rows) as the N linearly independent vectors.

To generate a non-singular matrix, one could generate its SVD and multiply the factors back together. In more detail:

a/ generate a 'random' NxN orthogonal matrix U

b/ generate a 'random' NxN diagonal matrix S with positive numbers in the diagonal

c/ generate a 'random' NxN orthogonal matrix V

d/ compute

M = U*S*V'
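
A NumPy sketch of steps a/ to d/; here the 'random' orthogonal factors are taken from the QR decomposition of Gaussian matrices, which is just a convenient shortcut (the Householder construction described next is an alternative):

import numpy as np

rng = np.random.default_rng()
N = 5                                              # dimension, example value

U, _ = np.linalg.qr(rng.standard_normal((N, N)))   # a/ 'random' orthogonal U
s = rng.uniform(0.5, 2.0, size=N)                  # b/ positive diagonal of S
V, _ = np.linalg.qr(rng.standard_normal((N, N)))   # c/ 'random' orthogonal V
M = U @ np.diag(s) @ V.T                           # d/ M = U*S*V', non-singular by construction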

To generate a 'random' orthogonal matrix U, one can use the fact that every orthogonal matrix can be written as a product of Householder reflectors, that is, of matrices of the form

H(v) = I - 2*v*v'/(v'*v)

where v is a nonzero random vector. So one could

initialise U to I
for( i = 1..N )
   generate a nonzero random vector v
   update: U := H(v)*U
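
In NumPy that loop could look like this (forming each H(v) explicitly, which is the straightforward but not the cheapest way; see the next paragraph):

import numpy as np

def random_orthogonal(N, rng=np.random.default_rng()):
    U = np.eye(N)                                            # initialise U to I
    for _ in range(N):
        v = rng.standard_normal(N)                           # nonzero with probability 1
        H = np.eye(N) - 2.0 * np.outer(v, v) / np.dot(v, v)  # H(v) = I - 2*v*v'/(v'*v)
        U = H @ U                                            # update U := H(v)*U
    return U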

Note that if all these matrix multiplications become burdensome, one could write a special routine to do the update of U. Applying H(v) to a vector u is O(N):

u -> u - 2*(v'*u)/(v'*v) * v

and so applying H(v) to U can be done in O(N^2) rather than O(N^3).
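
As a sketch, that update without ever forming H explicitly (one rank-1 correction applied to all columns of U at once):

import numpy as np

def apply_householder(U, v):
    # U <- H(v) @ U in O(N^2): compute the row vector w = v'*U, then subtract 2*v*w/(v'*v)
    w = v @ U
    return U - 2.0 * np.outer(v, w) / np.dot(v, v)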

One advantage of this scheme is that one has some control over 'how linearly independent' the vectors are: the product of the diagonal elements of S is (up to sign) the determinant of M, so if this product is 'very small' the vectors are 'almost' linearly dependent.
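
A quick numerical sanity check of that relation, reusing M and s from the sketch above:

print(abs(np.linalg.det(M)), np.prod(s))   # should agree up to rounding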

Licensed under: CC-BY-SA with attribution