Question

Can an algorithm with a time complexity of O(n) have a space complexity of O(n^2) or more?


Solution

Have a look at the DTIME and DSPACE complexity classes, which describe which problems can be solved within a given time or space bound, and the relationships between those classes.

All algorithms that run in O(n) time are in the class DTIME(n).
All algorithms that use O(n^2) space are in the class DSPACE(n^2).
Since DTIME(n) ⊆ NTIME(n) ⊆ DSPACE(n) ⊊ DSPACE(n^2), every algorithm that runs in O(n) time also uses O(n^2) space.
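To spell out the step behind that chain (a standard argument under the usual Turing-machine accounting, where each step can write at most one new cell; added here for clarity, not part of the original answer):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A machine running in time T(n) can write at most one new cell per step,
% so its space usage S(n) is bounded by its running time.
\[
  \mathrm{DTIME}(n) \subseteq \mathrm{NTIME}(n) \subseteq \mathrm{DSPACE}(n) \subseteq \mathrm{DSPACE}(n^2)
\]
\[
  T(n) = O(n) \;\Longrightarrow\; S(n) \le T(n) + O(1) = O(n) \subseteq O(n^2)
\]
\end{document}
```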

Other tips

The space complexity cannot be more than the time complexity because writing X units of space takes Omega(X) time.
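As a small illustration of that point (a minimal sketch with hypothetical function names, not taken from the original answer): a routine that genuinely needs an n-by-n table must perform n^2 writes, so it cannot run in O(n) time, while an O(n)-time routine can only ever touch O(n) cells.

```python
def build_table(n):
    """Genuinely uses Theta(n^2) space: it performs n*n writes,
    so it also takes Omega(n^2) time."""
    table = []
    for i in range(n):
        row = []
        for j in range(n):
            row.append(i * j)  # one write per cell -> n^2 writes in total
        table.append(row)
    return table


def prefix_sums(n):
    """Runs in O(n) time: it performs O(n) steps, and therefore
    can only ever touch O(n) memory cells."""
    sums = [0] * n  # O(n) cells, each written a constant number of times
    total = 0
    for i in range(n):
        total += i
        sums[i] = total
    return sums
```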

Since all O(n) functions are trivially O(n^2) (see, e.g., Wikipedia on Big O notation), the answer is "yes."
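Spelled out from the definition of big-O (a standard textbook argument, added here for clarity):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% The same witnesses c and max(n_0, 1) show that an O(n) function is O(n^2):
\[
  f(n) \le c\,n \ \text{for all } n \ge n_0
  \quad\Longrightarrow\quad
  f(n) \le c\,n \le c\,n^2 \ \text{for all } n \ge \max(n_0, 1)
\]
so $f(n) = O(n)$ implies $f(n) = O(n^2)$.
\end{document}
```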

Big-O notation deals in upper bounds; technically speaking, an algorithm with running time f(n) is O(g(n)) for any and all g(n) that grow asymptotically at least as fast as f(n), so if an algorithm is O(n) it is also O(n^2) and O(n^99).

Little-o notation deals in strict upper bounds: f(n) is o(g(n)) when f(n) grows strictly slower than g(n). Here the implication only runs one way: if f(n) is o(n) it is also o(n^2), but f(n) being o(n^2) does not make it o(n).
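For reference, the usual definition of little-o and the one-way containment mentioned above (added here for clarity, not part of the original answer):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Little-o is a strict upper bound:
\[
  f(n) = o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
\]
% The containment is one-directional, e.g. n itself is o(n^2) but not o(n):
\[
  f(n) = o(n) \;\Longrightarrow\; f(n) = o(n^2),
  \qquad n = o(n^2) \ \text{but} \ n \neq o(n)
\]
\end{document}
```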

Edit (to answer comment):

If you are given an algorithm A and are told reliably that A is O(n^2), there is still the possibility that A is O(n) (or anything smaller), but you would have to analyse A yourself to find out. Conversely, if reliably told that A is not o(n^2), then A cannot be O(n), since every O(n) algorithm is o(n^2).

To answer the question you probably meant to ask: generally, the accounting is such that allocating a given amount of memory takes a proportional amount of time. Why? Well, practically speaking, something needs to initialize the memory before you use it.

Alternatively, if you assume that all your memory comes pre-initialized, that will no longer be the case after your procedure writes all over it; something would still need to clean up the memory afterwards...

There are actually a variety of processor models used in algorithm analysis; if you wanted, you could specify a model that says "pre-zeroed memory is free, and you don't have to clean up after yourself", which would yield a different metric for algorithms that use memory sparsely. However, in practice memory allocation and garbage collection are not free, so this metric would have limited practical relevance.
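As a rough illustration of that cost model (hypothetical measurement code, not part of the original answers; absolute numbers depend on the machine, interpreter, and allocator): even an algorithm that merely allocates and zero-initializes an n-by-n table pays time proportional to n^2 for it.

```python
import time


def timed_zero_allocation(n):
    """Allocate and zero-initialize an n-by-n table and report the elapsed time.
    Even though nothing else is done with the memory, the initialization
    alone costs Omega(n^2) work under the usual cost model."""
    start = time.perf_counter()
    grid = [[0] * n for _ in range(n)]  # n^2 zero writes
    elapsed = time.perf_counter() - start
    return grid, elapsed


if __name__ == "__main__":
    for n in (200, 400, 800):
        _, elapsed = timed_zero_allocation(n)
        print(f"n = {n:4d}: initialized {n * n} cells in {elapsed:.4f} s")
```

On most machines, doubling n should roughly quadruple the reported time, which is consistent with the proportional-cost accounting described above.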

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow