Question

Imagine that you had all the supercomputers in the world at your disposal for the next 10 years. Your task was to compress 10 full-length movies losslessly as much as possible. Another criterion was that a normal computer should be able to decompress them on the fly, without needing to devote much of its hard drive to the decompression software.

My question is: how much more compression could you achieve than the best alternatives today? 1%, 5%, 50%? More specifically: is there a theoretical limit to compression, given a fixed dictionary size (if that is what it is called for video compression as well)?


Solution

The limits of compression are dictated by the randomness of the source, i.e. its entropy. Welcome to the study of information theory! See data compression.
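As a rough illustration (my own sketch, not part of the original answer): under a simple independent-byte model, the Shannon entropy of the data is the lower bound on the average number of bits per byte any lossless coder can achieve. The function name below is made up for this example.

```python
from collections import Counter
import math

def shannon_entropy_bits_per_byte(data: bytes) -> float:
    # Empirical probability of each byte value, then H = -sum(p * log2(p)).
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample = b"aaaaabbbccd"  # highly repetitive, so entropy is well below 8 bits/byte
print(shannon_entropy_bits_per_byte(sample))
```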

OTHER TIPS

There is a theoretical limit: I suggest reading this article on information theory and the pigeonhole principle. It sums up the issue in a very easy-to-understand way.
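To make the pigeonhole argument concrete, here is a small counting sketch (my own illustration, not from the linked article): there are 2^n bit strings of length n but only 2^n - 1 strings strictly shorter than n, so no lossless scheme can shrink every n-bit input.

```python
def pigeonhole_counts(n: int):
    inputs = 2 ** n                                   # bit strings of length exactly n
    shorter_outputs = sum(2 ** k for k in range(n))   # all strings of length 0 .. n-1
    return inputs, shorter_outputs

for n in (1, 4, 8):
    i, s = pigeonhole_counts(n)
    print(f"n={n}: {i} possible inputs, only {s} shorter outputs")  # always one short
```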

If you had a fixed catalogue of all the movies you were ever going to compress, you could just send an ID for the movie and have the "decompression" step look up the data with that index. So compression could be to a fixed size of log2(N) bits, where N is the number of movies.

I suspect the practical lower bound is rather higher than this.
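Here is a hypothetical sketch of that catalogue trick (the names and catalogue contents are made up): with a fixed list of N movies that both sides share, "compressing" a movie amounts to sending its index, which takes only ceil(log2(N)) bits no matter how large the movie is.

```python
import math

catalogue = [f"movie_{i:02d}" for i in range(10)]    # the fixed list both sides share
bits_needed = math.ceil(math.log2(len(catalogue)))   # 4 bits are enough for 10 movies

def compress(title: str) -> int:
    return catalogue.index(title)      # the "compressed" form is just an index

def decompress(index: int) -> str:
    return catalogue[index]            # look the movie up again on the other end

code = compress("movie_07")
assert decompress(code) == "movie_07"
print(f"{len(catalogue)} movies -> {bits_needed} bits per movie")
```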

Do you really mean lossless? Most of today's video compression is lossy, I thought.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow