Question

I am unsure how to formally prove the Big O Rule of Sums, i.e.:

f1(n) + f2(n) is O(max(g1(n), g2(n)))

So far, I have supposed the following in my effort:

Let there be two constants c1 and c2 such that c2 > c1. By the definition of Big O:

f1(n) <= c1g1(n) and f2(n) <= c2g2(n)

How should I proceed? Is it reasonable to introduce numerical substitutions for the variables at this step to prove the relationship? Not knowing g or f, that is the only approach I can think of.


Solution

Let

gmax = max(g1, g2), and gmin = min(g1, g2). 

gmin is O(gmax), since gmin(n) <= gmax(n) for every n. Now, using the definition:

gmin(n) <= c*gmax(n) for n > some k

Adding gmax(n) to each side gives:

gmin(n) + gmax(n) <= c*gmax(n) + gmax(n) for n > some k
gmin(n) + gmax(n) <= (c+1)*gmax(n)       for n > some k
g1(n) + g2(n) <= c'*gmax(n)              for n > some k   (where c' = c+1)

So we have g1+g2 is O(max(g1, g2)).

Since f1 is O(g1) and f2 is O(g2), adding the two defining inequalities gives that f1+f2 is O(g1+g2); the transitive property of big-O then gives us f1+f2 is O(max(g1, g2)). QED.
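As a sanity check (not a proof, since no finite set of values can establish an asymptotic bound), the final inequality can be spot-checked numerically. The sketch below uses hypothetical example functions g1(n) = n^2 and g2(n) = n*log(n); because gmin(n) <= gmax(n) pointwise, c = 1 works here and the derivation predicts g1(n) + g2(n) <= 2*max(g1(n), g2(n)):

import math

# Hypothetical example functions; any nonnegative g1, g2 would do.
def g1(n): return n ** 2
def g2(n): return n * math.log(n)

# Here gmin(n) <= gmax(n) pointwise, i.e. c = 1, so c' = c + 1 = 2.
c_prime = 2

for n in [2, 10, 100, 10_000, 1_000_000]:
    lhs = g1(n) + g2(n)
    rhs = c_prime * max(g1(n), g2(n))
    assert lhs <= rhs, f"bound violated at n={n}"
    print(n, lhs, rhs)

Checks like this only illustrate the bound for sample inputs; the proof above is what establishes it for all sufficiently large n.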

OTHER TIPS

I suppose I might be more of a constructivist; I'd attack the problem like this:

By the definition of Big-O, there exist positive c1, c2, N1, and N2 such that

f1(n) <= c1g1(n) for all n > N1

and

f2(n) <= c2g2(n) for all n > N2

Let:

N' = max(N1,N2)
c' = c1 + c2
g'(n) = max(g1(n),g2(n))

Then for all n > N' we have:

f1(n) <= c1g1(n) <= c1g'(n)
f2(n) <= c2g2(n) <= c2g'(n)
f1(n) + f2(n) <= c1g'(n) + c2g'(n) = c'g'(n)

Therefore, f1(n) + f2(n) is O(g'(n)) = O(max(g1(n),g2(n)))
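The same construction is easy to exercise numerically. In the sketch below, the functions f1, f2, g1, g2 and the witnesses c1, c2, N1, N2 are hypothetical stand-ins; the point is only that c' = c1 + c2 and N' = max(N1, N2) make the combined inequality hold over a range of test values:

# Hypothetical witnesses: f1(n) = 3n^2 <= 3*g1(n) and f2(n) = 5n <= 5*g2(n) for n > 0
def f1(n): return 3 * n ** 2
def g1(n): return n ** 2
def f2(n): return 5 * n
def g2(n): return n

c1, N1 = 3, 0
c2, N2 = 5, 0

c_prime = c1 + c2                  # c' = c1 + c2
N_prime = max(N1, N2)              # N' = max(N1, N2)
def g_prime(n): return max(g1(n), g2(n))

for n in range(N_prime + 1, N_prime + 1001):
    assert f1(n) + f2(n) <= c_prime * g_prime(n)
print("bound holds on all tested n")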

You don't even need the definition: just divide the sum by the faster-growing function, take the limit at infinity, and the slower-growing term will approach zero (i.e. it's insignificant).
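For instance, with the hypothetical choices f1(n) = 3n^2 and f2(n) = 5n*log(n), where n^2 is the faster-growing bounding function, SymPy confirms that the ratio of the sum to the faster-growing function tends to a finite constant (this shortcut assumes the limit exists, which it does for these example functions):

from sympy import symbols, limit, log, oo

n = symbols('n', positive=True)
f1 = 3 * n ** 2          # hypothetical faster-growing term
f2 = 5 * n * log(n)      # hypothetical slower-growing term

# Divide the sum by the faster-growing function and take the limit at infinity:
print(limit((f1 + f2) / n ** 2, n, oo))   # -> 3, a finite constant, so the sum is O(n^2)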

Licensed under: CC-BY-SA with attribution