Does ANSI C actually specify which bytes are used when typecasting to a smaller integer?

StackOverflow https://stackoverflow.com/questions/18092448

  •  23-06-2022
Question

I'm reading through an email argument regarding the following line of code:

p = (unsigned char)random();

The random function returns a long, and somebody says that this is unsafe because the typecast might take the MSB instead of the LSB. I know that on x86 the typecast returns the LSB, but I can't find any information on whether this is actually mandated by ANSI C or whether it's one of those implementation-specific "undefined behaviors".


Solution

This is specified in the C Standard.

C99 in 6.3.1.3p2 says:

"Otherwise, if the new type is unsigned, the value is converted by repeatedly adding or subtracting one more than the maximum value that can be represented in the new type until the value is in the range of the new type."

For unsigned char, that is a reduction modulo UCHAR_MAX + 1 (typically 256); on a two's complement system it amounts to taking the least significant bits, i.e. the low-order byte, not the MSB.
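
For illustration, a minimal sketch (the value 0x12345678L is just a stand-in for whatever random() might return) showing that the conversion keeps the value modulo UCHAR_MAX + 1, i.e. the low-order byte on a typical platform with 8-bit chars:

#include <stdio.h>

int main(void)
{
    long r = 0x12345678L;                 /* stand-in for a random() result */
    unsigned char p = (unsigned char)r;   /* reduced modulo UCHAR_MAX + 1, here 256 */

    printf("%#lx -> %#x\n", r, (unsigned)p);  /* prints 0x12345678 -> 0x78 */
    return 0;
}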

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow