Question

This is a very simple question. I noticed that the following, when compiled in MSVS2012, produces the intended result of 0x3412 for val:

unsigned char test[] = { 0x12, 0x34, 0x56, 0x78 };
unsigned char* ch = test;
unsigned int val = *ch | (*(ch+1) << 8);

I would have actually expected the dereferenced char pointer *(ch+1) on the right to produce a char value of 0x34 which, shifted left 8 bits while still only a single byte wide, would lose all of its bits and become 0x00. It seems that by the time the value is used in the expression, it has already been converted to a type large enough to hold at least two bytes.

Is this specified somewhere in the C++ standard? How exactly does this implicit conversion happen?


Solution

This is covered in the draft C++ standard, section 5.8 Shift operators, paragraph 1, which says:

[...]The operands shall be of integral or unscoped enumeration type and integral promotions are performed.[...]
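You can see this promotion directly in code. The following is a minimal sketch of my own (the variable names and the static_assert are not part of the quoted standard text): it checks with decltype that the shift expression has type int, not unsigned char.

#include <type_traits>

int main() {
    unsigned char test[] = { 0x12, 0x34, 0x56, 0x78 };
    unsigned char* ch = test;

    // The operand *(ch + 1) is promoted to int before the shift,
    // so the whole shift expression has type int, not unsigned char.
    static_assert(std::is_same<decltype(*(ch + 1) << 8), int>::value,
                  "shift operand is promoted to int");
}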

and integral promotions are covered in section 4.5 Integral promotions, which says in paragraph 1:

A prvalue of an integer type other than bool, char16_t, char32_t, or wchar_t whose integer conversion rank (4.13) is less than the rank of int can be converted to a prvalue of type int if int can represent all the values of the source type; otherwise, the source prvalue can be converted to a prvalue of type unsigned int.
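Since unsigned char has a lower conversion rank than int, and int can represent all of its values on typical platforms, *(ch+1) is promoted to int before the shift, so no bits are lost. The sketch below (my own illustration, not part of the original answer) contrasts the promoted behavior with what the questioner expected, by forcing the shifted value back into an unsigned char:

#include <cstdio>

int main() {
    unsigned char test[] = { 0x12, 0x34, 0x56, 0x78 };
    unsigned char* ch = test;

    // With the integral promotion: 0x34 becomes an int, so the
    // shifted value 0x3400 keeps its high byte.
    unsigned int promoted = *ch | (*(ch + 1) << 8);

    // Forcing the shifted value back into an unsigned char discards
    // the high byte, which is what the questioner originally expected.
    unsigned int truncated = *ch | static_cast<unsigned char>(*(ch + 1) << 8);

    std::printf("promoted  = 0x%X\n", promoted);   // 0x3412
    std::printf("truncated = 0x%X\n", truncated);  // 0x12
}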

Licensed under: CC-BY-SA with attribution