Question

The only examples I've seen use bits as a measurement of entropy, but all these examples happen to use binary code alphabets. If we wanted to see how well a coding with a code alphabet of size n works, would we measure entropy in units of n?

Or would it make sense to keep using bits if we're comparing codings with binary and size-n code alphabets?


Solution

The bit as a unit was introduced by Shannon in 1948. Bits are used instead of other units of information largely for convenience: any piece of information can be represented by a single bit or by a sequence of bits.

You have probably already run into the most popular example, coin tossing: you can represent heads as 1 and tails as 0. You can also use a stream of bits to represent more complex information, such as an integer, as in the sketch below.
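A minimal Python sketch of both ideas; the helper names (coin_to_bit, int_to_bits) and the 8-bit width are illustrative assumptions, not anything from the original answer.

```python
def coin_to_bit(outcome: str) -> int:
    """Map a coin toss to a single bit: heads -> 1, tails -> 0."""
    return 1 if outcome == "heads" else 0

def int_to_bits(value: int, width: int = 8) -> str:
    """Represent a non-negative integer as a fixed-width bit stream."""
    return format(value, f"0{width}b")

print(coin_to_bit("heads"))   # 1
print(coin_to_bit("tails"))   # 0
print(int_to_bits(42))        # 00101010
```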

Simple example

Consider W as information about the current day of the week. You might use a 7-bit stream as the representation, e.g. Monday as 1000000 and Wednesday as 0010000, so that each bit in the stream represents one day.
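A small sketch of that 7-bit week encoding, assuming Monday occupies the leftmost position as in the examples above:

```python
DAYS = ["Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"]

def encode_day(day: str) -> str:
    """One bit per day: set the bit at that day's position, clear the rest."""
    bits = ["0"] * len(DAYS)
    bits[DAYS.index(day)] = "1"
    return "".join(bits)

print(encode_day("Monday"))     # 1000000
print(encode_day("Wednesday"))  # 0010000
```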

You should always consider the domain of your problem and choose the encoding that fits it best. A code alphabet of size n works on the same principle; widely used encodings such as ASCII or Unicode show how universal that approach is.
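For illustration, a rough sketch of the same short message written once with a larger code alphabet (ASCII codes, one symbol out of 128 per position) and once with the binary alphabet; the 7-bit width and the sample message are arbitrary assumptions.

```python
msg = "Hi"

# Larger code alphabet: one ASCII code (0-127) per character.
ascii_codes = [ord(c) for c in msg]

# Binary code alphabet: the same codes spelled out as 7-bit groups.
bit_stream = "".join(format(code, "07b") for code in ascii_codes)

print(ascii_codes)   # [72, 105]
print(bit_stream)    # 10010001101001
```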

Another point is that a bit can be represented efficiently in low-powered (5 V) circuits, and interference-detection and correction schemes exist for such signals.
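As a hedged illustration of such interference detection, here is one of the simplest schemes for bit streams, a single even-parity bit; it is a generic textbook example, not a claim about any specific circuit.

```python
def add_parity(bits: str) -> str:
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + str(bits.count("1") % 2)

def parity_ok(bits_with_parity: str) -> bool:
    """A single flipped bit makes the 1-count odd, which we can detect."""
    return bits_with_parity.count("1") % 2 == 0

word = add_parity("0010000")   # "00100001"
print(parity_ok(word))         # True
corrupted = "00100101"         # one bit flipped in transit
print(parity_ok(corrupted))    # False
```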

Now consider a domain other than bits, e.g. the real numbers, and let them represent information through some relation R. Because the real numbers are dense, even minimal interference will change the end result.
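A rough sketch of that contrast, under assumed values for the noise level and the decision threshold: a thresholded bit absorbs small interference, while a real-valued signal is changed by any interference at all.

```python
def read_bit(voltage: float, threshold: float = 2.5) -> int:
    """Decode a noisy voltage back to a bit by thresholding."""
    return 1 if voltage >= threshold else 0

noise = 0.3
print(read_bit(5.0 - noise))   # still 1: the bit survives the noise
print(read_bit(0.0 + noise))   # still 0

real_value = 3.14159
print(real_value + noise)      # 3.44159...: any noise changes the value
```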
