Question

I read that a reasonable encoding of inputs is one where the length of the encoding is at most polynomial in the length of the 'natural representation' of the input. For instance, binary encodings are reasonable, but unary encodings are not.

But say that the input is a graph, and its natural representation is a vertex and edge list. Suppose that the graph has $k$ vertices. If I use unary to encode, the vertex list has total length $|1^1|+|1^2|+|1^3|+\dots+|1^k| = O(k^2)$. Isn't this unary encoding still polynomial with respect to the number of vertices of the graph (which is $k$)?

What am I missing here?

Solution

Unary encoding of a value $N$ takes $\Theta(N)$ symbols, while binary takes only $\Theta(\log N)$. Equivalently, the unary encoding of an $n$-bit number has length $\Theta(2^n)$: exponentially longer than its natural (binary) representation. See the difference? "Reasonable" is measured against the length of the natural representation, not against the numeric value. Your graph example isn't the problematic case: the vertex labels only go up to $k$, so even in unary the input stays polynomial in the size of the graph. The blow-up matters when the input itself contains large numbers (e.g. weights or capacities), whose natural size is their bit length $n$, not their value $2^n$.
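A minimal sketch of the gap (the helper names `unary_len` and `binary_len` are mine, just for illustration): for an $n$-bit number, the binary encoding stays at $n$ symbols while the unary one grows like $2^n$.

```python
def unary_len(n: int) -> int:
    # The unary encoding of n is the string "1" * n, so its length is n itself.
    return len("1" * n)

def binary_len(n: int) -> int:
    # The binary encoding uses n.bit_length() bits (about log2(n) + 1).
    return n.bit_length()

for bits in (4, 8, 16):
    n = 2**bits - 1  # largest number representable in `bits` bits
    print(f"{bits:>2}-bit value {n:>6}: binary length {binary_len(n):>2}, "
          f"unary length {unary_len(n)}")
```

Doubling the binary length squares the unary length's magnitude, which is exactly why unary is not considered a reasonable encoding for numeric inputs.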

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange