Question

When I work on an algorithm to solve a computing problem, I often find that speed can be increased by using more memory, and that memory usage can be decreased at the price of increased running time, but I can never push the product of running time and consumed memory below some clearly perceptible limit. This is formally similar to Heisenberg's uncertainty principle: the product of the uncertainty in position and the uncertainty in momentum of a particle cannot be less than a given threshold.
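
For example, here is a toy sketch in Python (my own illustration, the numbers and names are just made up) of the kind of trade-off I mean: answering many "is n prime?" queries either by precomputing a sieve, which costs O(N) memory but makes each query a single table lookup, or by doing trial division, which needs only O(1) memory but costs roughly sqrt(n) work per query.

def make_sieve(limit):
    """Precompute primality for every number below limit: memory-heavy, fast queries."""
    is_prime = [True] * limit
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit, p):
                is_prime[multiple] = False
    return is_prime

def is_prime_trial(n):
    """Check primality by trial division: memory-light, slow queries."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

if __name__ == "__main__":
    N = 10_000
    sieve = make_sieve(N)        # pay O(N) memory once...
    assert sieve[9973]           # ...then each query is a single index lookup
    assert is_prime_trial(9973)  # no table, but ~sqrt(n) divisions per call

Either version answers the same queries; I only seem to be able to move the cost between memory and time, never make both small at once.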

Is there a theorem of computer science that asserts the same thing? I suspect it should be possible to derive something similar from the theory of Turing machines.


Solution

I am not personally familiar with a result that parallels Heisenberg's uncertainty principle per se, but this sounds closely related to Computational Complexity Theory, and in particular to what it calls time-space trade-offs. Problems can be classified according to some inherent, irreducible complexity, and I think that is what you are getting at with your limit on "the product of running time and consumed memory".
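
As a rough illustration of what a time-space trade-off on a single problem can look like (my own sketch, not a formal lower-bound result), consider subset sum: a brute-force search uses only O(n) extra memory but examines all 2^n subsets, while a meet-in-the-middle search stores roughly 2^(n/2) partial sums to cut the running time to roughly 2^(n/2). In both cases the product of time and memory stays in the neighborhood of 2^n, which echoes the intuition in your question.

from itertools import combinations

def subset_sum_brute_force(nums, target):
    """O(2^n) time, O(n) extra memory: try every subset."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return True
    return False

def subset_sum_meet_in_the_middle(nums, target):
    """Roughly O(2^(n/2)) time and O(2^(n/2)) memory: store one half's sums."""
    half = len(nums) // 2

    def all_subset_sums(items):
        sums = {0}
        for x in items:
            sums |= {s + x for s in sums}
        return sums

    left_sums = all_subset_sums(nums[:half])    # the memory-hungry table
    right_sums = all_subset_sums(nums[half:])
    return any(target - s in left_sums for s in right_sums)

if __name__ == "__main__":
    data = [3, 34, 4, 12, 5, 2]
    assert subset_sum_brute_force(data, 9) and subset_sum_meet_in_the_middle(data, 9)
    assert not subset_sum_brute_force(data, 30) and not subset_sum_meet_in_the_middle(data, 30)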

Licensed under: CC-BY-SA with attribution