Properties of roots of recurrence relations in the context of exponential algorithms in order to decrease the upper bound of the running time

cs.stackexchange https://cs.stackexchange.com/questions/123582

  •  29-09-2020

Question

The book "Exact Exponential Algorithms" by Fedor V. Fomin and Dieter Kratsch is an excellent book to start learning how to design exact exponential algorithms. In their second chapter, they introduce the recurrence relations in context of a branching algorithm:

$T(n) \leq T(n-t_1) + T(n-t_2) + \dots + T(n-t_r)$

To solve this recurrence we assume $T(n) = c^n$; then $c$ must be a (complex) root of $x^n - x^{n-t_1} - x^{n-t_2} - \dots - x^{n-t_r} = 0$, so the running time of the branching algorithm is governed by the largest (real) root of this equation. We call $\tau(t_1,t_2,\dots,t_r)$, the largest positive real root of this equation, the branching factor of the numbers $t_1,t_2,\dots,t_r$. The branching vector is defined as $t = (t_1,t_2,\dots,t_r)$.
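(Dividing the characteristic equation by $x^n$ gives $1 - \sum_i x^{-t_i} = 0$, whose left-hand side is strictly increasing for $x > 1$, so the branching factor can be computed by bisection. A minimal sketch; the helper `branching_factor` is my own name, not the book's notation:

```python
def branching_factor(ts, tol=1e-12):
    """Largest positive real root of x^n - sum_i x^(n - t_i) = 0,
    found as the root of f(x) = 1 - sum_i x^(-t_i) on x >= 1."""
    f = lambda x: 1.0 - sum(x ** (-t) for t in ts)
    lo, hi = 1.0, 2.0
    while f(hi) <= 0:          # widen bracket: f(x) -> 1 as x -> infinity
        hi *= 2
    while hi - lo > tol:       # invariant: f(lo) <= 0 < f(hi)
        mid = (lo + hi) / 2
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

print(branching_factor((1, 2)))  # golden ratio, ~1.6180 (root of x^2 - x - 1)
print(branching_factor((1, 1)))  # 2.0, since T(n) <= 2 T(n-1)
```

For example, $\tau(1,2)$ is the golden ratio, the largest root of $x^2 - x - 1 = 0$.)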

I am mainly interested in modifying such branching algorithms in order to reduce the running time. Often I end up in a situation where I can do one of the following:

(1) Remove a branch entirely (i.e., reduce the number of entries in the branching vector).

(2) Choose between two branching vectors, say $x$ and $y$. The branching vectors are equal except for two elements: $x_i < y_i$ and $x_j > y_j$ with $i \neq j$ (where $x_i$ denotes the element at position $i$ of branching vector $x$).

I have two questions:

(1) Intuitively, it seems to me that removing a branch entirely lowers the running time of the algorithm. We are essentially counting the leaves of the branching tree; by definition, removing an entire branch reduces the number of leaves, and hence so should the running time. However, I can find no such theorem/proof in the literature.

Mathematically, you essentially have an equation of the form $x^n - \sum_i a_i x^{n - c_i} = 0$, where we only care about the largest positive real root. If I form the same equation but lower one of the $a_i$ values, does the largest positive real root decrease? I assume this is some fundamental/elementary theorem somewhere, but I cannot seem to find it.

(2) The book has a lemma (Lemma 2.3): $\tau(i,j) \ge \tau(i + \epsilon, j - \epsilon)$ for all $0 \le i \le j$ and all $0 \le \epsilon \le \frac{j-i}{2}$. I cannot apply this lemma in most instances of my problem. Am I simply going to have to calculate the roots for each choice and take the minimum, or is there a more fundamental theorem for this?
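(As a numerical sanity check on Lemma 2.3, not a substitute for it, both branching factors can be computed by bisection; the helper below is my own code, not from the book. Taking $(i, j) = (2, 4)$ and $\epsilon = 1$, the closed forms are $\tau(2,4) = \sqrt{(1+\sqrt 5)/2}$ and $\tau(3,3) = 2^{1/3}$:

```python
def branching_factor(ts, tol=1e-12):
    # root of 1 - sum_i x^(-t_i) = 0, found by bisection on x >= 1
    f = lambda x: 1.0 - sum(x ** (-t) for t in ts)
    lo, hi = 1.0, 2.0
    while f(hi) <= 0:
        hi *= 2
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (lo, mid) if f(mid) > 0 else (mid, hi)
    return (lo + hi) / 2

# Lemma 2.3 with (i, j) = (2, 4), eps = 1: tau(2,4) >= tau(3,3)
print(branching_factor((2, 4)))  # sqrt(golden ratio), ~1.2720
print(branching_factor((3, 3)))  # 2**(1/3), ~1.2599
```

The more balanced vector $(3,3)$ indeed has the smaller branching factor.)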


Solution

Let's start with your first question. You are considering two polynomials, $$ P(x) = x^n - a_r x^{n-t_r} - a_{r-1} x^{n-t_{r-1}} - \cdots - a_1 x^{n-t_1}, \\ Q(x) = x^n - a_r x^{n-t_r} - \cdots - b_i x^{n-t_i} - \cdots - a_1 x^{n-t_1}, $$ where all $a_i$ are positive, and $0 < b_i < a_i$. Thus $Q$ results from replacing the term $-a_i x^{n-t_i}$ with $-b_i x^{n-t_i}$.

Let $\tau$ be the largest real root of $Q(x)$; note that $\tau > 0$. Since $Q(\tau) = 0$ and $b_i < a_i$, $$ P(\tau) = Q(\tau) + (b_i - a_i) \tau^{n-t_i} < 0. $$ On the other hand, $\lim_{x\to\infty} P(x) = \infty$. Therefore $P$ must have a root larger than $\tau$; in particular, the largest real root of $P$ exceeds that of $Q$, so lowering a coefficient lowers the largest positive real root.
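A quick numerical illustration of this (my own, not part of the original answer): take $n = 2$ and a single term with $t_i = 1$, so $P(x) = x^2 - x - 1$ and $Q(x) = x^2 - \tfrac12 x - 1$. The largest roots follow directly from the quadratic formula:

```python
import math

def largest_root(a):
    # largest root of x^2 - a*x - 1, by the quadratic formula
    # (the discriminant a^2 + 4 is always positive)
    return (a + math.sqrt(a * a + 4)) / 2

p = largest_root(1.0)   # P: coefficient a_i = 1,   root = golden ratio ~1.618
q = largest_root(0.5)   # Q: coefficient b_i = 0.5, root ~1.281
assert q < p            # lowering the coefficient lowered the largest root
```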

You can try attacking your second question using similar reasoning.

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange