I'm studying for an Algorithms and Data Structures test. There is a type of question that my professor almost always asks, but I don't know how to answer it.

Question 1: An algorithm with a worst-case execution time of 3n*(log n), where n is the number of elements in the input, is:

  • a) An algorithm with an execution time of type Θ(n log n).
  • b) An algorithm with an execution time of type O(n log n).
  • c) An algorithm with an execution time of type O(n^2).
  • d) None of the above.

Question 2: An algorithm with an execution time of 2^100 + (1/3)*n^2 + 100n, where n is the number of elements in the input, is:

  • a) An algorithm with an execution time of type Θ(n^2).
  • b) An algorithm with an execution time of type O(2^n).
  • c) An algorithm with an execution time of type Θ(2^n).
  • d) None of the above.

I want to know how to think about these problems in order to solve them. Any help is welcome (even just the solutions to these two questions). Thanks.


Solution

Assuming that $f(n)$ and $g(n)$ are asymptotically positive (as is usually the case), you can use the following sufficient condition to determine the asymptotic relation between $f(n)$ and $g(n)$.

If $\lim_{n \to \infty} \frac{f(n)}{g(n)}$ exists and is $c \in \mathbb{R}_0^+ \cup \{+\infty\}$, then (see the code sketch after this list):

  • If $c< +\infty$ then $f(n) = O(g(n))$ (and $g(n) = \Omega(f(n))$). In particular:

    • If $0 < c < +\infty$ then $f(n) = \Theta(g(n))$ (and $g(n) = \Theta(f(n))$).
    • If $c=0$ then $f(n) = o(g(n))$ (and $g(n) = \omega(f(n))$).
  • If $c > 0$ then $f(n) = \Omega(g(n))$ (and $g(n) = O(f(n))$). In particular:

    • If $0 < c < +\infty$ then $f(n) = \Theta(g(n))$ (and $g(n) = \Theta(f(n))$).
    • If $c = +\infty$ then $f(n) = \omega(g(n))$ (and $g(n) = o(f(n))$).
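If it helps to see the rule operationally, here is a minimal sketch in Python using the sympy library (the helper `compare` is mine, purely for illustration, and it assumes the limit exists):

```python
import sympy as sp

n = sp.symbols('n', positive=True)

def compare(f, g):
    """Classify f vs. g via the limit rule above (assumes the limit exists)."""
    c = sp.limit(f / g, n, sp.oo)
    if c == 0:
        return "f = o(g), hence f = O(g)"
    if c == sp.oo:
        return "f = omega(g), hence f = Omega(g)"
    return "f = Theta(g)"  # 0 < c < +infinity

print(compare(3*n*sp.log(n), n*sp.log(n)))  # f = Theta(g)
print(compare(3*n*sp.log(n), n**2))         # f = o(g), hence f = O(g)
```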

Moreover, in computing the limit, you can replace $f(n)$ with a function $h(n)$ such that $h(n) \sim f(n)$ (see, e.g., the Wikipedia page on asymptotic analysis). The same holds for $g(n)$. For polynomials this amounts to keeping the monomial of highest degree. Also, since multiplying $f(n)$ or $g(n)$ by a positive constant only scales $c$ without changing which of the cases above applies, you can drop any multiplicative constant (which will always be positive, since $f(n)$ and $g(n)$ are asymptotically positive).

For example, instead of comparing $f(n) = 3n^2 + 2n + 50$ with $g(n) = 5n^5 + 4n^3 - 2^{10}$, you can simply compare $n^2$ with $n^5$. This immediately shows that $c$ exists and is $0$; therefore $f(n) = O(g(n))$ and, in particular, $f(n) = o(g(n))$.
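As a sanity check, sympy reaches the same conclusion whether you feed it the full polynomials or just their leading monomials (a sketch, reusing the setup above):

```python
import sympy as sp

n = sp.symbols('n', positive=True)

full = sp.limit((3*n**2 + 2*n + 50) / (5*n**5 + 4*n**3 - 2**10), n, sp.oo)
lead = sp.limit(n**2 / n**5, n, sp.oo)
print(full, lead)  # 0 0  -> f = o(g), hence f = O(g)
```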

While the above rules will probably work for the vast majority of the functions you will encounter, they can't always be used. Consider, for example, $f(n) = 2 + \sin(n)$ and $g(n) = 1$. Here $f(n) = \Theta(g(n))$ but $\lim_{n \to \infty} \frac{f(n)}{g(n)}$ does not exist.
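A quick numerical look (plain Python, just to illustrate the caveat) shows the ratio $f(n)/g(n) = 2 + \sin(n)$ staying inside $[1, 3]$ without ever settling on a limit:

```python
import math

# The ratio (2 + sin(n)) / 1 is always in [1, 3], so f = Theta(g),
# yet it keeps oscillating and lim f(n)/g(n) does not exist.
for k in (1, 10, 100, 1000, 10000):
    print(k, 2 + math.sin(k))
```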

Additional hints

Check the definitions. Since the question talks about the worst case of the algorithm, $\Theta(\cdot)$ is probably out: think of Bubblesort, which has worst-case time complexity $\Theta(n^2)$ but best case $\Theta(n)$. In any case, if $T(n) = \Theta(n \log n)$, then certainly $T(n) = O(n \log n)$ (remember that $T(n) = \Theta(g(n))$ holds iff both $T(n) = \Omega(g(n))$ and $T(n) = O(g(n))$). Next, $3 n \log n = \Theta(n \log n)$, but since $3 n \log n$ is only the worst-case time, all you can conclude is $T(n) = O(n \log n)$. And $3 n \log n = O(n^2)$ too.

So for Question 1, the last two options, (b) and (c), are certainly right; the first one, (a), might be, but it holds only if the running time is also $\Omega(n \log n)$ on every input, which the statement does not tell us.
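If you want to check these classifications mechanically, the same limit rule applies to both exam questions. A minimal sketch using sympy (the name `T2` is mine, for illustration):

```python
import sympy as sp

n = sp.symbols('n', positive=True)

# Question 1: worst-case time 3 n log n
print(sp.limit(3*n*sp.log(n) / (n*sp.log(n)), n, sp.oo))  # 3 -> 3 n log n = Theta(n log n)
print(sp.limit(3*n*sp.log(n) / n**2, n, sp.oo))           # 0 -> 3 n log n = o(n^2), hence O(n^2)

# Question 2: T2(n) = 2^100 + (1/3) n^2 + 100 n  (2^100 is just a constant)
T2 = 2**100 + sp.Rational(1, 3)*n**2 + 100*n
print(sp.limit(T2 / n**2, n, sp.oo))  # 1/3 -> T2 = Theta(n^2)
print(sp.limit(T2 / 2**n, n, sp.oo))  # 0   -> T2 = o(2^n): O(2^n) holds, Theta(2^n) does not
```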
