Is there a way to store an arbitrarily big BigInt in a bit sequence, only later to convert it into a standard BigInt structure?

cs.stackexchange https://cs.stackexchange.com/questions/128035

  •  29-09-2020

Question

I am trying to imagine a way of encoding a BigInt into a bit stream, so that it is literally just a sequence of bits. Upon decoding that bit stream, you would reconstruct the standard BigInt data structure (an array of small integers plus a sign). How could you encode the BigInt as a sequence of bits, and how would you decode it? I don't see how to perform the bitwise manipulations, or how to encode a number that is larger than 32 or 64 bits. If a language is relevant, I would be doing this in JavaScript.

For instance, this takes an array of four bytes and packs them into a single 32-bit integer:

function arrayOfBytesTo32Int(map) {
  // Pack four big-endian bytes into one 32-bit integer. << and | operate
  // on signed 32-bit values in JavaScript, so coerce the result back to
  // unsigned with >>> 0 (otherwise a high first byte yields a negative number).
  return (map[0] << 24
    | map[1] << 16
    | map[2] << 8
    | map[3]) >>> 0;
}

How would you do that same sort of thing for arbitrarily long bit sequences?

Solution

Look at Elias delta or gamma coding as an example.
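
As a concrete illustration (not part of the original answer), here is a minimal sketch of Elias gamma coding for JavaScript BigInt values. For readability it uses a string of '0'/'1' characters as the bit stream; a real implementation would pack the bits into a Uint8Array. Gamma coding only covers integers ≥ 1, so zero and the sign need separate handling.

function eliasGammaEncode(n) {
  // n is a BigInt >= 1n. Emit (bitLength - 1) zeros, then the binary
  // digits of n; the leading 1 bit doubles as the terminator.
  const bits = n.toString(2);
  return '0'.repeat(bits.length - 1) + bits;
}

function eliasGammaDecode(stream) {
  // Count leading zeros to learn how many payload bits follow the first 1.
  let zeros = 0;
  while (stream[zeros] === '0') zeros++;
  const payload = stream.slice(zeros, 2 * zeros + 1);
  return { value: BigInt('0b' + payload), rest: stream.slice(2 * zeros + 1) };
}

// Example: 13n -> "0001101". Gamma codes are self-delimiting,
// so several of them can be concatenated into one stream.
console.log(eliasGammaEncode(13n));               // "0001101"
console.log(eliasGammaDecode('0001101').value);   // 13n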

OTHER TIPS

There are many common encodings for arbitrary-length integers. I would say that the most-used representations are:

  •  Length-data representations, where the number of bytes/words is written first, followed by the data.
  •  A "continuation bit" representation: if the word size is b bits, the integer is split into groups of b-1 bits, and the high-order bit of each word denotes whether another word follows. (Both are sketched after this list.)

You can mix and match these. The ASN.1 Basic Encoding Rules are a case in point: large tag numbers are encoded in base 128 using a continuation bit, and lengths of 128 or more are written as a byte count followed by the length itself. ASN.1 BER is used in many network protocols, such as VoIP, SNMP, and LDAP, as well as in cryptography, where representing large integers is commonplace.
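
For example, a minimal sketch of BER's definite-length encoding as specified in X.690 (berEncodeLength is an illustrative name, not a library API):

function berEncodeLength(len) {
  // Short form: lengths up to 127 fit in a single octet.
  if (len < 0x80) return Uint8Array.of(len);
  // Long form: leading octet 0x80 | count, then the length big-endian.
  const bytes = [];
  for (let n = len; n > 0; n = Math.floor(n / 256)) bytes.unshift(n % 256);
  return Uint8Array.from([0x80 | bytes.length, ...bytes]);
}

console.log(berEncodeLength(5));    // Uint8Array [ 5 ]
console.log(berEncodeLength(300));  // Uint8Array [ 130, 1, 44 ]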

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange