Question

While studying for a class in computer networks, the prof talked about the Hamming distance between 2 valid code words in a sample code. I have read about Hamming distance, and it makes sense as a measure of how many positions 2 strings differ in. For example:

Code Word 1 = 10110 

The sender sends code word 1, an error is introduced in transit, and the receiver receives 10100. You can see that the 4th bit was corrupted. This results in a Hamming distance of 1 because:

Valid Code Word: 10110
Error Code Word: 10100
                 -----
XOR              00010
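
Just to check my understanding, this is how I'd compute that in code (a quick Python sketch, hamming_distance is just my own name for the helper):

    def hamming_distance(a: int, b: int) -> int:
        # XOR leaves a 1 wherever the two words differ; count those 1s.
        return bin(a ^ b).count("1")

    print(hamming_distance(0b10110, 0b10100))  # prints 1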

The XOR of the 2 strings contains a single 1, so the Hamming distance is 1. I understand it up to that point. But then the prof asks:

  • What is the Hamming distance of the standard CRC-16 protocol?
  • What is the Hamming distance of the standard CRC-32 protocol?

I'm a bit confused, and was wondering if someone could help. Thanks.


Solution

You probably figured it out by now, but what he asked for was most likely the minimum number of bit errors that a CRC code can fail to detect. The answer depends on the width of the CRC, the polynomial, and the length of the message. For instance, the best known CRC-32 polynomial (0x1EDC6F41) has a Hamming distance of 6 or better for messages of up to 5,275 bits (Castagnoli, Bräuer, Herrmann: Optimization of Cyclic Redundancy-Check Codes with 24 and 32 Parity Bits, IEEE Transactions on Communications, vol. 41, no. 6, June 1993), which means it is guaranteed to detect any combination of up to 5 flipped bits in a single message of 5,275 bits or less.
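
If you want to see where such a number comes from, here is a small brute-force sketch (my own illustration, not the method used in the paper). Because a CRC is a linear code, an error pattern goes undetected exactly when the generator polynomial divides it, so the Hamming distance at a given code word length is the smallest weight of any such undetected pattern. Doing this for CRC-32 at thousands of bits is computationally out of reach, so the sketch uses a toy CRC-8 polynomial and a 64-bit code word instead:

    from itertools import combinations

    def poly_mod(value: int, poly: int, deg: int) -> int:
        # Remainder of the GF(2) polynomial 'value' divided by 'poly' (degree 'deg').
        for i in range(value.bit_length() - 1, deg - 1, -1):
            if value & (1 << i):
                value ^= poly << (i - deg)
        return value

    POLY, DEG, CODEWORD_BITS = 0x107, 8, 64   # toy CRC-8: x^8 + x^2 + x + 1

    # Try error patterns of increasing weight until one is divisible by the
    # generator, i.e. the smallest number of bit flips the CRC cannot detect.
    for weight in range(1, CODEWORD_BITS + 1):
        patterns = combinations(range(CODEWORD_BITS), weight)
        if any(poly_mod(sum(1 << p for p in pos), POLY, DEG) == 0 for pos in patterns):
            print("Hamming distance of this toy code:", weight)
            break

The search is guaranteed to stop, since the generator pattern itself is a valid undetected error; for real 16- and 32-bit CRCs the same definition applies, but exhaustive search stops being practical and papers like the one cited above use smarter techniques.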

BTW, the code word includes the checksum, so your example is incomplete: the Hamming distance is measured between full code words, i.e. the message bits plus the check bits.
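
To make that concrete, here is a sketch of what actually travels on the wire (names are mine; it uses Python's built-in zlib.crc32, which is the standard CRC-32 rather than the Castagnoli polynomial above):

    import zlib

    message = b"some payload bits"
    codeword = message + zlib.crc32(message).to_bytes(4, "big")   # message + checksum

    # Receiver side: a single flipped bit anywhere in the code word is caught,
    # because recomputing the CRC over the message no longer matches the trailer.
    damaged = bytearray(codeword)
    damaged[2] ^= 0x04                                            # one-bit error in transit
    body, trailer = bytes(damaged[:-4]), damaged[-4:]
    print(zlib.crc32(body) == int.from_bytes(trailer, "big"))     # prints False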

Licensed under: CC-BY-SA with attribution