How do you convert bits into a different alphabet?
29-09-2020
Question
I have forgotten how to do this. How do I figure out how many symbols from a given alphabet are needed to encode a 128-bit value?
That is to say, I want to represent a UUID (a 128-bit value) using only the 10 decimal digits as the alphabet. How many digits do I need, and what is the general equation, so I can figure this out for any alphabet of any size?
What is the equation for any n-bit value with any x-letter alphabet?
The way I do it is to guess and slowly iterate until I arrive at a close number. For powers of 10 it's easy:
```js
Math.pow(2, 128)
// 3.402823669209385e+38
Math.pow(10, 39)
// 1e+39
```
For other alphabet sizes, it takes more guessing. I would love to know the equation for this.
Solution
To estimate the number of decimal digits needed to represent a $128$-bit number, use logarithms to base $10$:
$128 \times \log_{10}(2) \approx 38.53$
so, rounding up, you need $39$ decimal digits to represent a $128$-bit number.
In general, for an “alphabet” with $n$ symbols, a $b$-bit value needs $b \times \log_n(2) = b / \log_2(n)$ symbols, rounded up to the next whole number.
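As a sketch, the formula can be written as a small JavaScript helper (the name `digitsNeeded` is mine, not from the question), with a `BigInt` cross-check that the largest 128-bit value really does fit in 39 decimal digits:

```javascript
// Symbols needed to represent a `bits`-bit value in an alphabet of
// `alphabetSize` symbols: ceil(bits * log_n(2)) = ceil(bits / log2(n)).
function digitsNeeded(bits, alphabetSize) {
  return Math.ceil(bits / Math.log2(alphabetSize));
}

digitsNeeded(128, 10); // 39 decimal digits for a 128-bit UUID
digitsNeeded(128, 2);  // 128 bits, as expected

// Cross-check with BigInt: the largest 128-bit value,
// 2^128 - 1, has exactly 39 decimal digits.
(2n ** 128n - 1n).toString(10).length; // 39
```

Note that this computes the worst case: smaller 128-bit values may print with fewer digits, so pad with leading zeros if you need fixed-width output.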