Question

I am aware that to get the running time of the recurrence T(n) = T(n-1) + T(n/2) + cn by the recursion-tree method, we need to draw a tree and find:

a) the number of levels in the tree.

Since the left side of the tree decreases by 1 in size, it gives the longest path from the root. The subproblem size at level i is n - i; setting n - i = 1 (where it hits size 1), we get the number of levels: i = n - 1.

b) the cost per node in the tree: cn for a subproblem of size n.

c) the number of nodes at level i: this is where I am stuck. I am not able to find the number of nodes at level i, since the left side decreases by 1 while the right side decreases by half. Naturally, the tree is denser towards the left side, and not every node will have two children.

If I can get the answer to c), I can calculate T(n) = cost of level 0 + cost of level 1 + cost of level 2 + ... + cost of level n-1. If y1 is the number of nodes at level 1, y2 at level 2, and so on, then
=> T(n) = cn + y1 * cn + y2 * cn + ... + y(n-1) * cn, which gives the total cost.
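
To see what the y_i actually look like, here is a quick empirical sketch (Python; it assumes the recurrence above with integer halving and treats subproblems of size 1 as leaves) that counts the nodes at each level:

```python
from collections import Counter

def nodes_per_level(n):
    """Count recursion-tree nodes at each depth for
    T(m) = T(m-1) + T(m//2) + c*m, treating size-1 subproblems as leaves."""
    counts = Counter()
    level, depth = [n], 0   # subproblem sizes at the current depth
    while level:
        counts[depth] = len(level)
        # expand every non-leaf node into its two children, m-1 and m//2
        level = [child for m in level if m > 1 for child in (m - 1, m // 2)]
        depth += 1
    return counts

counts = nodes_per_level(50)
for i in range(10):
    print(f"level {i}: {counts[i]} nodes")
```

For i < log2(n) every node still has size greater than 1 and hence two children, so the counts start out as exactly 2^i; once the all-halving path bottoms out, the growth slows down.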

Can anyone comment on the approach I am taking? Is it correct? Can I assume that, for sufficiently large n, we can ignore T(n/2) and then proceed?

Searching online only confused me. The problem is 4.4-5 from CLRS.

Please see here

That solution says T(n) = O(2^n) and T(n) = Ω(n^2), but does not explain how.

Also see here

That solution says T(n) = O(n^2), which contradicts the solution above.


Solution

Let $S(n) = T(n) + 2n + 2$. You can check that $S(n) = S(n-1) + S(n/2)$ (ignoring the fact that $n/2$ need not be an integer). This shows that the additive $n$ term doesn't make a big difference.
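
Indeed, with the CLRS form $T(n) = T(n-1) + T(n/2) + n$ (i.e. $c = 1$), substituting gives $$ S(n-1) + S(n/2) = \bigl(T(n-1) + 2n\bigr) + \bigl(T(n/2) + n + 2\bigr) = \underbrace{T(n-1) + T(n/2) + n}_{=\,T(n)} + 2n + 2 = S(n). $$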

For large $n$, we have roughly $S(n) - S(n-1) \approx S'(n)$, and so we are led to solve the differential equation $$ S'(n) = S(n/2). $$ Consider $f(n) = 2^{\frac{1}{2}\log_2^2 n}$. Since $\ln f(n) = \tfrac{\ln 2}{2} \log_2^2 n$, we get $$ f'(n) = f(n) \cdot \frac{\log_2 n}{n}, $$ whereas $$ f(n/2) = 2^{\frac{1}{2}(\log_2 n - 1)^2} = 2^{\frac{1}{2}\log_2^2 n} \cdot 2^{\frac{1}{2} - \log_2 n} = f(n) \cdot \frac{\sqrt{2}}{n}, $$ so the two sides agree up to a factor of $\Theta(\log n)$. This suggests that, at the very least, $\ln S(n) = \Theta(\log^2 n)$.

Where does this come from? You can think of $S(n)$ (with an appropriate base case!) as the number of ways to go from $n$ to zero by applying two operations: subtract 1 and divide by 2. A "typical" such sequence will contain roughly $\log_2n$ many operations of the second type, out of $\Theta(n)$ operations in total, leading to the very rough estimate $\binom{\Theta(n)}{\log_2 n}$, which is also of the form $\exp \Theta(\log^2 n)$.
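
This counting interpretation is easy to check by brute force for small $n$ (a Python sketch; "divide by 2" is taken as integer division, so both operations send 1 to 0):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def s(n):
    """S(0) = 1, S(n) = S(n-1) + S(n // 2)."""
    return 1 if n == 0 else s(n - 1) + s(n // 2)

def sequences(n):
    """All ways to take n down to 0 using 'subtract 1' and 'halve' steps."""
    if n == 0:
        return [[]]
    return ([['-1'] + rest for rest in sequences(n - 1)] +
            [['/2'] + rest for rest in sequences(n // 2)])

for n in range(8):
    assert len(sequences(n)) == s(n)
print([s(n) for n in range(8)])   # 1, 2, 4, 6, 10, 14, 20, 26
```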


Consider for concreteness the following precise definition of $S(n)$: the base case is $S(0) = 1$, and for $n > 0$, $$ S(n) = S(n-1) + S(\lfloor n/2 \rfloor). $$ This is sequence A000123. Knuth, in "An almost linear recurrence", showed that $$ \log_4 S(n) \sim \log_4^2 n, $$ that is, the ratio of the two terms tends to 1 as $n \to \infty$. The OEIS entry contains even more precise asymptotics.
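
A quick tabulation of this definition (a Python sketch) lets one watch Knuth's ratio:

```python
from math import log

def tabulate_S(n_max):
    """S(0) = 1, S(n) = S(n-1) + S(n // 2): OEIS A000123."""
    s = [1] * (n_max + 1)
    for n in range(1, n_max + 1):
        s[n] = s[n - 1] + s[n // 2]
    return s

s = tabulate_S(10**6)
for n in (10**2, 10**3, 10**4, 10**5, 10**6):
    print(f"n = {n:>7}: log4 S(n) / log4^2 n = {log(s[n], 4) / log(n, 4)**2:.3f}")
```

The ratio tends to 1, with the lower-order terms from the OEIS entry's asymptotics visible in the slow convergence.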

OTHER TIPS

I disagree with the Fibonacci hypothesis, but I'm not a hundred percent sure of my answer.

You see: $ T(n) = T(n-1) + cn = O(n^2) $

$ T(n) = T(n/2) + cn = O(n) $
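
Unrolling each of these separately shows where those bounds come from: $$ T(n) = \sum_{k=1}^{n} ck = O(n^2), \qquad T(n) = \sum_{k=0}^{\log_2 n} \frac{cn}{2^k} \le 2cn = O(n). $$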

However, you don't get your T(n) by straightforwardly adding the two.

If you go one level down in the recursion (you can draw a tree):

$ T(n) = T(n-2) + T\left(\tfrac{n-1}{2}\right) + T\left(\tfrac{n}{2} - 1\right) + T\left(\tfrac{n}{4}\right) + \left[ n + (n-1) + \tfrac{n}{2} \right] $

This is to show you that it is not just $ O(n^2) $ either.

I still have to complete it, but I suspect either $ O(n^2 \log n) $ or $ O(n^3) $.

(There is a term $ n(1 + \tfrac{1}{2} + \tfrac{1}{4} + \cdots + \tfrac{1}{n}) $ that decays in $ \log n $ steps; I have to check again.)
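
As a numerical sanity check on these guesses, here is a small sketch (it assumes the concrete form $T(n) = T(n-1) + T(\lfloor n/2 \rfloor) + n$ with $T(0) = 0$) that tabulates $T$ and compares it against $n^2$ and $n^3$:

```python
def tabulate_T(n_max):
    """Tabulate T(n) = T(n-1) + T(n // 2) + n with T(0) = 0."""
    t = [0] * (n_max + 1)
    for n in range(1, n_max + 1):
        t[n] = t[n - 1] + t[n // 2] + n
    return t

t = tabulate_T(1 << 16)
for k in (8, 10, 12, 14, 16):
    n = 1 << k
    print(f"n = 2^{k:2}: T(n)/n^2 = {t[n] / n**2:.3g}   T(n)/n^3 = {t[n] / n**3:.3g}")
```

The ratio $T(n)/n^3$ keeps growing, which fits the quasi-polynomial $\exp \Theta(\log^2 n)$ behaviour from the solution above rather than any fixed polynomial.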

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange