Question

I've been asked to develop a recursive function and then analyze the asymptotic time complexity.

f(N) = 0, if N < N1

f(N1) = C1

f(N) = A1 + M1*f(M2*N/D1 - S1) Op M3*f(M4*N/D2 - S2), if N > N1

We're to assume that:

s1 = s2 = 0

m2 = m4 = 1

d1 = d2 > 1

// The user enters N1 C1 A1 M1 M2 M3 M4 D1 D2 S1 S2 and then ARG.
// These values are assumed to be stored in the globals below.
int n1, c1, a1, m1, m2, m3, m4, d1, d2, s1, s2;

int Recursion_Plus(int ARG)
{
    if (ARG < n1)              // base case: f(N) = 0 for N < N1
    {
        return 0;
    }
    else if (ARG == n1)        // base case: f(N1) = C1
    {
        return c1;
    }
    else                       // recursive case, ARG > n1
    {
        return a1
             + m1 * Recursion_Plus(m2 * ARG / d1 - s1)
             + m3 * Recursion_Plus(m4 * ARG / d2 - s2);
    }
}
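
For context, here is a minimal sketch of a driver that reads the parameters and ARG in the order the comment above describes and prints the result. The scanf-based, whitespace-separated input format is my assumption, not necessarily the instructor's actual harness:

#include <stdio.h>

int main(void)
{
    int ARG;

    // read N1 C1 A1 M1 M2 M3 M4 D1 D2 S1 S2, then ARG (assumed input order/format)
    if (scanf("%d %d %d %d %d %d %d %d %d %d %d %d",
              &n1, &c1, &a1, &m1, &m2, &m3, &m4,
              &d1, &d2, &s1, &s2, &ARG) != 12)
    {
        return 1;
    }

    printf("f(%d) = %d\n", ARG, Recursion_Plus(ARG));
    return 0;
}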

I've tested my recursive function against the instructor's program and it behaves exactly the same, so I've moved on to my analysis, where I've hit a wall.

I'm struggling to wrap my brain around this, so please bear with me.

My attempt at a partial solution:

The two comparisons (if ARG < N1 and if ARG == N1) each take 1 unit of time.

a1, m1, and m3 are insignificant because the work they add is constant and outside the recursive calls:

a1 + m1*_ = 1 unit of time (addition)

m1*_ = 1 unit of time (multiplication)

adding the results of the 2 recursive calls together is 1 unit of time

m3*_ = 1 unit of time (multiplication)

So each call does a constant amount of work on top of its two recursive calls, and the total time is proportional to the number of calls.

From the assumptions we're given, both recursive calls receive the same argument every time (since m2 = m4 = 1, d1 = d2, and s1 = s2 = 0), and every successive argument is smaller than the last because d1 = d2 > 1.

So the larger ARG is (compared to n1), the longer it takes to reach the base case and the larger the result will be. Does that mean the algorithm takes O(ARG) time?
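
To sanity-check the number of calls empirically, here is a quick instrumented variant I could use for testing; the counter, the function name Recursion_Plus_Counted, and the idea of counting calls are my own additions, not part of the assignment, and it reuses the globals declared above:

long long calls = 0;   /* counts how many times the function is entered */

int Recursion_Plus_Counted(int ARG)
{
    calls++;
    if (ARG < n1)
        return 0;
    else if (ARG == n1)
        return c1;
    else
        return a1
             + m1 * Recursion_Plus_Counted(m2 * ARG / d1 - s1)
             + m3 * Recursion_Plus_Counted(m4 * ARG / d2 - s2);
}

Printing calls for a few values of ARG (and for different values of d1 = d2) would give a quick reality check on the O(ARG) guess.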

I'd appreciate it if anyone could let me know if I'm on the right track. Thanks!


Solution

The recursive case is:

f(N) = A1 + M1*f(M2*N/D1 - S1) Op M3*f(M4*N/D2 - S2), if N > N1

with

s1 = s2 = 0
m2 = m4 = 1
d1 = d2 > 1

we have

f(N) = A1 + M1*f(N/D1) Op M3*f(N/D1), if N > N1

The recursive structure is the key point for the asymptotic complexity; the rest is "just" constants.

So the point is to find T such that:

T(n) = 2*T(n/D)

Once you solve for T(n) you have the number of calls of your Recursion_Plus; since we are talking about asymptotic complexity, it is not relevant to bother about the last calls (i.e. n < N1).

Now it's all about mathematics; I won't describe a formal solution here, but with a little intuition you can get to the result.

Each call of T induces 2 calls of T, each with an argument divided by D; those in turn induce 4 calls with arguments divided by D^2, and so on.

The recursion bottoms out after logD(n) levels, so the complexity is 2^(logD(n)) (with logD(n) = ln(n)/ln(D)).

Particular case: with D = 2, 2^(log2(n)) = n, so the complexity is O(n).
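
A minimal sketch of that unrolling, assuming n is an exact power of D and counting only the recursive calls (the constant A1/M1/M3 work per call is ignored):

\[
T(n) = 2\,T(n/D) = 2^2\,T(n/D^2) = \cdots = 2^k\,T(n/D^k)
\]

The base case is reached when \(n/D^k \approx N_1\), i.e. after \(k \approx \log_D n\) levels, so the total number of calls is

\[
T(n) = 2^{\log_D n} = n^{\log_D 2}.
\]

For \(D = 2\) this gives \(n\); for \(D = 4\) it would give \(n^{1/2} = \sqrt{n}\).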

OTHER TIPS

Notice that at each level of recursion you call the function twice. So, from the first call c1 it calls itself twice (c21 and c22), then each of those calls itself twice again (c211, c212, c221, and c222), etc. At each level of recursion you have twice as many calls. At the N-th level you'd have 2^N calls, so the complexity is exponential.

Edit: Sorry, my bad. I didn't notice that the argument is divided at each step. In that case there won't be N levels, only logD(N), and the rest is as Tony wrote.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow