Question

I have some calculation that involves negative values:

row = (stagePosition - col) / PHNumRow;

Say stagePosition is -7 and col is 1. Both are NSInteger, as is row. PHNumRow is 8.

If PHNumRow is NSInteger, I get the result I expect: -1. But if PHNumRow is NSUInteger, the result is garbage.

Why should it matter if the divisor is unsigned or signed? I'm not putting the result in an unsigned int.


Solution

Because of the usual arithmetic conversions (often loosely called integer promotion). When the right-hand side is evaluated, the operands of each binary operator are converted to a common type; since PHNumRow is unsigned, the signed result of the subtraction is converted to an unsigned integer before the division. The compiler does something similar to the following:

(NSUInteger)(stagePosition - col) / PHNumRow;

Since stagePosition is negative, the intermediate result wraps around to a huge unsigned value and your computation goes boom! On a 64-bit platform, -8 becomes 2^64 - 8 = 18446744073709551608, and dividing that by 8 gives 2305843009213693951 instead of -1.
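
Here is a minimal, self-contained sketch that reproduces the difference, assuming a 64-bit build where NSInteger and NSUInteger are 64 bits wide. The variable names mirror the question; rowFixed shows one possible fix, casting the divisor back to NSInteger so the whole expression stays signed:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSInteger stagePosition = -7;
        NSInteger col = 1;

        NSInteger signedRows = 8;     // divisor declared as NSInteger
        NSUInteger unsignedRows = 8;  // divisor declared as NSUInteger

        // Signed division: (-7 - 1) / 8 == -1, as expected.
        NSInteger rowSigned = (stagePosition - col) / signedRows;

        // Unsigned divisor: the signed -8 is converted to NSUInteger
        // before the division, yielding a huge positive quotient.
        NSInteger rowUnsigned = (stagePosition - col) / unsignedRows;

        // Casting the divisor keeps the arithmetic signed.
        NSInteger rowFixed = (stagePosition - col) / (NSInteger)unsignedRows;

        NSLog(@"signed divisor:   %ld", (long)rowSigned);    // -1
        NSLog(@"unsigned divisor: %ld", (long)rowUnsigned);  // 2305843009213693951
        NSLog(@"cast divisor:     %ld", (long)rowFixed);     // -1
    }
    return 0;
}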

Licensed under: CC-BY-SA with attribution