Question

I have a std::string that is base32 encoded and I have a function that decodes it. The function takes a char* input, a char* destination and a length. How do I know what length I will need for the destination? I need to know what array to allocate for the destination. How do I determine the size?


Solution

Base32 encodes every 5 bits (since 32 = 2^5) with a single character.

This means the output buffer for encoding needs:

dst_size = src_size * 8 / 5 (1.6 times larger)

But since 8 bits rarely divide evenly into 5-bit groups, round up when dividing:

dst_size = (src_size * 8 + 4) / 5

and, because a padded base32 string is built from 40-bit blocks (8 characters each), the final padded length is ceil(src_size / 5) * 8.

Thus, for decoding (base32 -> binary) the required buffer size is accordingly

dst_size = src_size * 5 / 8

where src_size counts only the data characters, not the '=' padding; the integer division rounds down, discarding only the leftover padding bits.

Other tips

Actually, the encoded base32 string length is computed as follows:

ceil(bytesLength / 5.0) * 8

bytesLength / 5.0 because we want to know how many chunks of 5 bytes we have, and ceil because 0.1 of a chunk is still 1 chunk.

The result is multiplied by 8 because each chunk is encoded as 8 characters.

For the input data 'a' the encoded result is ME====== because we have 1 chunk of 8 characters: two 5-bit encoded characters (ME) and 6 padding characters (======).

In the same fashion, the decoded length is:

bytesLength * 5 / 8

But here bytesLength does not include the padding characters; thus for ME====== bytesLength is 2, giving 2 * 5 / 8 == 1: there is only 1 byte to decode.

For a visual explanation, see RFC 4648, section 9 (page 11).

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow