Question

The related and interesting fields of Information Theory, Turing Computability, Kolmogorov Complexity and Algorithmic Information Theory, give definitions of algorithmically random numbers.

An algorithmically random number is a number (in some encoding, usually binary) for which the shortest program (e.g., on a Turing machine) that generates the number has approximately the same length (number of bits) as the number itself.
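Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a rough practical proxy for the idea: structured data can be described far more briefly than its raw length, while (pseudo)random bytes cannot. A minimal sketch using Python's standard `zlib` (this is an illustration of compressibility, not a measurement of Kolmogorov complexity):

```python
import os
import zlib

# A highly structured 10,000-byte string: a tiny "program" (the pattern
# and a repeat count) suffices to describe it, and zlib exploits that.
structured = b"01" * 5000

# 10,000 bytes from the OS entropy source: no short description exists,
# so a compressor cannot shrink it (the output may even be slightly larger).
random_bytes = os.urandom(10000)

print(len(zlib.compress(structured)))    # tiny compared to 10000
print(len(zlib.compress(random_bytes)))  # roughly 10000
```

The gap between the two compressed sizes mirrors the gap the question is about: both inputs have the same length, but only one has a short description.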

In this sense numbers like $\sqrt{e}$ or $\pi$ are not random, since well-known mathematical relations exist which in effect function as algorithms for computing these numbers.

However, for $e$ and $\pi$ in particular (which are transcendental numbers) it is known that they can be defined by infinite series.

For example $e = \sum_{n=0}^\infty \frac{1}{n!}$

So even though the number given by the binary representation of $\sqrt{e}$ is not algorithmically random, would a program not (still?) need a description of the (infinitely many) bits of the (transcendental) number $e$ itself?
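This is where a concrete program helps: a generator for the digits of $e$ does not need to store any of $e$'s digits, only the series $e = \sum 1/n!$. A minimal sketch (exact integer arithmetic with a few guard digits; the function name is my own):

```python
def digits_of_e(n):
    """Return the first n decimal digits of e as a string '2.718...'."""
    # Evaluate sum_{k>=0} scale/k! exactly with integer arithmetic,
    # using 5 guard digits to absorb the truncation error of the series.
    scale = 10 ** (n + 5)
    total, term, k = 0, scale, 0
    while term:          # terms hit 0 once k! exceeds the scale
        total += term
        k += 1
        term //= k
    s = str(total // 10 ** 5)   # drop the guard digits
    return s[0] + "." + s[1:n]

print(digits_of_e(20))  # 2.7182818284590452353
```

The program is a fixed, constant-length description, yet `n` can be made arbitrarily large; that constant-length description is exactly why the digit string of $e$ is compressible in the Kolmogorov sense.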

Can transcendental numbers (really) be compressed?

Where is this argument wrong?

UPDATE:

Also note that for almost all transcendental numbers, and irrational numbers in general, the frequency of digits is uniform (much like in a random sequence). So their Shannon entropy should be equal to that of a random string; however, the Kolmogorov complexity, which is related to Shannon entropy, would be different (since they are not algorithmically random).

Thank you

No correct solution

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange