Question

Possible Duplicate:
Addition of two chars produces int

Given the following C++ code:

unsigned char a = 200;
unsigned char b = 100;

unsigned char c = (a + b) / 2;

The output is 150, as logically expected, but shouldn't there be an integer overflow in the expression (a + b)?

Obviously there must be an integer promotion happening here that avoids the overflow, or something else is going on that I cannot see. I was wondering if someone could enlighten me, so that I know what I can and cannot rely on in terms of integer promotion and overflow.

Solution

Neither C++ nor C performs arithmetic computations in "smaller" integer types like char and short. These types almost always get promoted to int before any further computation begins. So, your expression is really evaluated as

unsigned char c = ((int) a + (int) b) / 2;

P.S. On some exotic platform where the range of int does not cover the range of unsigned char, the type unsigned int will be used as the target type for promotion.
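
As an illustration, here is a small self-contained check, a sketch assuming a typical platform where int can represent every unsigned char value (so the promoted type is int rather than unsigned int):

#include <cstdio>
#include <type_traits>

int main() {
    unsigned char a = 200;
    unsigned char b = 100;

    // Both operands are promoted to int, so the sum is computed as 300
    // and never wraps around.
    static_assert(std::is_same<decltype(a + b), int>::value,
                  "a + b is evaluated in int after integral promotion");

    unsigned char c = (a + b) / 2;  // 300 / 2 = 150, then narrowed on assignment
    std::printf("%d\n", c);         // prints 150
}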

OTHER TIPS

No, this is not an error.

The compiler always calculates with at least int precision; the result is converted back to unsigned char only on assignment.

This is in the standard.
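
To see where the narrowing actually happens, here is a short sketch, assuming an 8-bit unsigned char so that out-of-range values wrap modulo 256:

#include <cstdio>

int main() {
    unsigned char a = 200;
    unsigned char b = 100;

    int sum = a + b;                // computed in int: 300
    unsigned char wrapped = a + b;  // converted back on assignment: 300 % 256 = 44

    std::printf("%d %d\n", sum, wrapped);  // prints "300 44"
}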

Per the other answers, it's not an error on x86 and other (sane) 32-bit and 16-bit architectures.

However, on smaller or less sane architectures (typically very small microcontrollers), things like this may start to cause trouble, especially if whoever implemented your compiler doesn't have the testing/validation budget of the larger vendors (again, microcontrollers).
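
As an aside, promotion does not remove overflow risk in general: if the promoted operands produce a value that does not fit in int, the signed arithmetic is undefined behaviour. A sketch of the classic trap, assuming a 16-bit unsigned short and a 32-bit int:

#include <cstdio>

int main() {
    unsigned short x = 65535;

    // x is promoted to (signed) int, and 65535 * 65535 = 4294836225
    // does not fit in a 32-bit int, so the plain multiplication below
    // would be undefined behaviour despite both operands being unsigned:
    // unsigned int bad = x * x;

    unsigned int good = (unsigned int)x * x;  // force unsigned arithmetic instead
    std::printf("%u\n", good);                // prints 4294836225
}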

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow