What is the best way to compute a sum of matrix powers such as A^i + A^(i+1) + A^(i+2) + ... + A^n for very large n?

I have thought of two possible ways:

1) Use logarithmic matrix exponentiation (LME) to compute A^i, then obtain each subsequent matrix by multiplying by A.

Problem: This doesn't really take advantage of the LME algorithm, since I am using it only for the lowest power.

2) Use LME to find A^n and memoize the intermediate results.

Problem: Too much space required for large n.

Is there a third way?


Solution

Notice that:

A + A^2 = A(I + A)
A + A^2 + A^3 = A(I + A) + A^3
A + A^2 + A^3 + A^4 = (A + A^2)(I + A^2)
                    = A(I + A)(I + A^2)

Let

B(n) = A + ... + A^n

We have:

B(1) = A
B(n) = B(n / 2) * (I + A^(n / 2))         if n is even
B(n) = B(n / 2) * (I + A^(n / 2)) + A^n   if n is odd

(here n / 2 denotes floor division)

So you will be doing a logarithmic number of steps and there is no need to compute inverses.

While a direct implementation leads to a (log n)^2 factor, you can keep it at log n by computing the powers of A as you compute B.
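The recurrence above can be implemented directly. Here is a minimal pure-Python sketch (the helper names `mat_mul`, `mat_add`, `identity`, and `power_sum` are mine, not from the answer): each recursive call returns both B(n) and A^n, so the power A^(n/2) needed at every level is already available and the total work stays at O(log n) matrix multiplications.

```python
def mat_mul(X, Y):
    """Multiply two square matrices given as lists of lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(X, Y):
    """Add two matrices of the same shape."""
    return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def identity(n):
    """n x n identity matrix."""
    return [[int(i == j) for j in range(n)] for i in range(n)]

def power_sum(A, n):
    """Return (B(n), A^n), where B(n) = A + A^2 + ... + A^n."""
    if n == 1:
        return A, A
    # One recursive call yields both B(n//2) and A^(n//2), so only a
    # constant number of multiplications happen per level: O(log n) total.
    B_half, A_half = power_sum(A, n // 2)
    B = mat_mul(B_half, mat_add(identity(len(A)), A_half))
    A_n = mat_mul(A_half, A_half)      # A^(2 * (n//2))
    if n % 2 == 1:
        A_n = mat_mul(A_n, A)          # odd n: one more factor of A gives A^n
        B = mat_add(B, A_n)            # ... and B(n) needs the extra A^n term
    return B, A_n
```

For example, `power_sum([[2]], 5)` returns `([[62]], [[32]])`, i.e. 2 + 4 + 8 + 16 + 32 and 2^5 in the 1x1 case.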

Other tips

You can use the closed form for the partial sum of a geometric series of matrices (valid when I - A is invertible):

S_n = (I - A)^(-1) (I - A^n) = I + A + ... + A^(n-1)

and, since you don't start from 0, you can simply calculate:

result = S_(n+1) - S_i

where i is your starting index.
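A sketch of this closed form with NumPy (the function name is my own; it assumes I - A is invertible): since S_m = (I - A)^(-1)(I - A^m) equals I + A + ... + A^(m-1), the sum A^i + ... + A^n comes out as S_(n+1) - S_i.

```python
import numpy as np

def geometric_partial_sum(A, i, n):
    """Compute A^i + A^(i+1) + ... + A^n via the closed form,
    assuming I - A is invertible.

    S_m = (I - A)^{-1} (I - A^m) = I + A + ... + A^(m-1),
    so the answer is S_(n+1) - S_i.
    """
    I = np.eye(A.shape[0])
    inv = np.linalg.inv(I - A)
    S = lambda m: inv @ (I - np.linalg.matrix_power(A, m))
    return S(n + 1) - S(i)
```

Note that this trades the log-n recurrence for one matrix inversion, which is fine numerically only when I - A is well-conditioned.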

Why not just diagonalize the matrix to make the multiplication cheap?

edit:

As long as the matrix is diagonalizable, you can find a diagonal representation D of matrix A such that A = PDP^-1, where P is made up of the eigenvectors of A and D has the eigenvalues of A along the diagonal. Computing D^m = D * D^(m-1) is cheap, since you're only multiplying along the diagonal (i.e. the same number of multiplications as the dimension of the matrix).

Computing S(m) = S(m-1) + D^m is also cheap, since you're only adding diagonal elements.

Then you have:

A^i + A^(i+1) + A^(i+2) + ... + A^n = P(D^i + D^(i+1) + D^(i+2) + ... + D^n)P^-1 = P( S(n) - S(i-1) )P^-1

where S(m) = D + D^2 + ... + D^m.

The only difficult part is finding P and P^-1.
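A minimal NumPy sketch of this approach (the function name is my own; it assumes A is diagonalizable): `np.linalg.eig` provides the eigenvalues and P, and the running sum S(m) = S(m-1) + D^m is carried out elementwise along the diagonal, exactly as described above.

```python
import numpy as np

def power_sum_diag(A, i, n):
    """Compute A^i + A^(i+1) + ... + A^n via eigendecomposition,
    assuming A is diagonalizable.
    """
    w, P = np.linalg.eig(A)       # eigenvalues w, eigenvector matrix P
    # Accumulate the diagonal of S(n) - S(i-1): each step multiplies and
    # adds only along the diagonal, i.e. one scalar op per eigenvalue.
    diag = np.zeros_like(w)
    term = w**i                   # diagonal of D^i
    for _ in range(i, n + 1):
        diag = diag + term
        term = term * w           # D^(m+1) from D^m, elementwise
    return P @ np.diag(diag) @ np.linalg.inv(P)
```

Each step is cheap, but note there are still n - i + 1 of them; the divide-and-conquer recurrence above is what brings the step count down to O(log n).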
