Question

I'm getting really frustrated here. I'm trying to implement the CRC-CCITT algorithm, and I found a very nice example on a website.

There is one line whose output I completely don't understand:

unsigned short update_crc_ccitt( unsigned short crc, char c ) {
    [...]
    short_c = 0x00ff & (unsigned short) c;
    [...]
}

I want to calculate the CRC of the "test" string "123456789". So in the first iteration the char c is 1. From my understanding, short_c in the first iteration should be equal to 1 as well, but when I print it to the console I get short_c = 49 for c = 1. How?

0x00ff in binary is: 1 1 1 1 1 1 1 1 
char 1 in binary is: 0 0 0 0 0 0 0 1
bitand should be   : 0 0 0 0 0 0 0 1 

Where is my mistake?

Solution

The character '1' has ASCII code 0x31 = 49. This is different from the character with ASCII code 1 (which is the control character ^A, i.e. SOH).
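
Here is a minimal sketch that makes the difference visible; only the masking line comes from the question, the surrounding main and printf are added purely for illustration:

#include <stdio.h>

int main(void)
{
    char c = '1';                                   /* first character of "123456789" */
    unsigned short short_c = 0x00ff & (unsigned short) c;

    printf("short_c = %hu\n", short_c);             /* prints 49, i.e. 0x31, the code of '1' */
    return 0;
}

Incidentally, this is what the CRC routine is supposed to see: the conventional check value for "123456789" is computed over the character codes 0x31 ... 0x39, not over the numeric values 1 ... 9.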

OTHER TIPS

You are basically confusing characters and numbers. The first character in the string "123456789" is the character '1', whose decimal value on most typical computers is 49.

This value is determined by the character encoding, which defines the numerical value assigned to each character; that number is what your computer actually stores.

C guarantees that the ten decimal digit characters are encoded as a contiguous, increasing sequence starting at '0'. So you can always convert a digit character to the corresponding number by doing:

const int digitValue = digit - '0';

This will convert the digit '0' to the integer 0, and so on for all the digits up to (and including) '9'.
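
As a small illustrative sketch (separate from the CRC code, purely to show the conversion), applying this to the test string from the question:

#include <stdio.h>

int main(void)
{
    const char *s = "123456789";

    for (const char *p = s; *p != '\0'; ++p)
    {
        const int digitValue = *p - '0';   /* '1' (code 49) becomes 1, '2' (50) becomes 2, ... */
        printf("'%c' -> %d\n", *p, digitValue);
    }
    return 0;
}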

Licensed under: CC-BY-SA with attribution