Question

In his Introduction to the Theory of Computation, Sipser uses the notation $2^{\mathcal{O}(f(n))}$ to denote an asymptotic running time.

For example, he states that the running time of a deterministic Turing machine simulating a non-deterministic Turing machine is

$\mathcal{O}(t(n)b^{t(n)})=2^{\mathcal{O}(t(n))}$, where $b$ is the maximum number of options in the transition function.
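
For concreteness, here is my sketch of why that equality should hold (my own reasoning, not a quote from the book), assuming $b \geq 2$ and $t(n) \geq 1$:

$$t(n)\,b^{t(n)} \;\leq\; 2^{t(n)} \cdot 2^{t(n)\log_2 b} \;=\; 2^{(1+\log_2 b)\,t(n)} \;=\; 2^{\mathcal{O}(t(n))},$$

using $t(n) \leq 2^{t(n)}$ and $b^{t(n)} = 2^{t(n)\log_2 b}$, so the hidden constant in the exponent is $c = 1+\log_2 b$.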

Could someone clarify the exact definition of this notation for me?

My assumption is that $g(n)=2^{\mathcal{O}(f(n))}$ means there exist $N \in \mathbb{N}$ and a constant $c > 0$ such that

$g(n) \leq 2^{cf(n)}=(2^{f(n)})^c$ for all $n>N$.
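
As a sanity check on this reading (my own example, not from the book): take $g(n) = n\,4^n$ and $f(n) = n$. Then $g(n) = 2^{\log_2 n + 2n} \leq 2^{3n}$ for all $n \geq 1$, so $g(n) = 2^{\mathcal{O}(n)}$ with $c = 3$ and $N = 0$, which matches the definition above.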

Thank you
