Question

I'm struggling to understand what exactly T(n) and f(n) are in the text below:

When we compute the time complexity T(n) of an algorithm we rarely get an exact result, just an estimate. That’s fine; in computer science we are typically only interested in how fast T(n) grows as a function of the input size n.

For example, if an algorithm increments each number in a list of length n, we might say: “This algorithm runs in O(n) time and performs O(1) work for each element”.
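
For concreteness, here is a minimal sketch (my own illustration, not from the text) of the kind of algorithm described above. In this case T(n) is the number of basic steps the function performs on a list of length n:

    # Hypothetical example: increment each number in a list of length n.
    def increment_all(numbers):
        # The loop body does a constant amount of work, O(1),
        # and it runs once per element, so the total work is O(n).
        for i in range(len(numbers)):
            numbers[i] += 1
        return numbers

    # T(n) here grows roughly in proportion to n, the length of the list.
    print(increment_all([1, 2, 3]))  # [2, 3, 4]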

Here is the formal mathematical definition of Big O:

Let T(n) and f(n) be two positive functions. We write T(n) ∊ O(f(n)), and say that T(n) has order of f(n), if there are positive constants M and n₀ such that T(n) ≤ M·f(n) for all n ≥ n₀.

[Graph: T(n) plotted against M·f(n), with T(n) ≤ M·f(n) for all n ≥ n₀]

This graph shows a situation where all of the conditions in the definition are met.
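
To make the definition concrete, here is one worked example with numbers I've chosen for illustration (they are not from the text). Suppose an algorithm's exact step count is

    T(n) = 3n + 5,   and take f(n) = n.

Then T(n) ∊ O(f(n)) = O(n): choosing M = 4 and n₀ = 5 gives

    T(n) = 3n + 5 ≤ 4n = M·f(n)   for all n ≥ 5.

In other words, T(n) is the (usually messy) exact running-time function of the algorithm, while f(n) is the simple yardstick function (n, n², log n, ...) it is compared against; the constants M and n₀ just absorb constant factors and small inputs.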

I'm taking Tim Roughgarden's course and reading his books. I'm still at the beginning and I can understand the explanations, but sometimes I struggle to understand the mathematical meaning of things... there is something implicit about what T(n) or f(n) is (in this case) that is not explained at all.

