Question

While debugging I've come across this interesting behavior:

The hex value string of a is twice as long as the others.

(screenshot of the debugger's Watch window showing the hex values)

Can you tell why this is happening?


Solution

You are asking the debugger to evaluate the expression for you. It now acts like a compiler, converting the watch expression you entered into code and running that code to display the result. It thinks that 0xff000000 is a literal of type long, which is a fair call since int cannot store that value; it is larger than Int32.MaxValue. So it evaluates the >> operator with long arguments, converting the i value to long first. The result is of course long as well.
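The exact watch expressions aren't visible in the screenshot, but assuming they have the usual mask-and-shift shape, the C# compiler itself shows the same widening. A minimal sketch (the variable names, mask values, and sample value are assumptions):

    using System;

    class WidthDemo
    {
        static void Main()
        {
            int i = unchecked((int)0xAABBCCDD);   // arbitrary sample value

            // 0xff000000 does not fit in int, so the expression is evaluated
            // with 64-bit operands and the inferred type of a is long.
            var a = (i & 0xff000000) >> 24;

            // 0x00ff0000 fits in int, so the expression stays 32 bits wide.
            var b = (i & 0x00ff0000) >> 16;

            Console.WriteLine(a.GetType());   // System.Int64
            Console.WriteLine(b.GetType());   // System.Int32
        }
    }

The hexadecimal display in the watch window simply follows the width of that result type.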

Since you didn't cast the result to a smaller type in the watch expression, as you did in your code, the debugger displays it (when switched to hexadecimal output) as a long with 64 bits: 16 hex digits.
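To illustrate the difference in digit count, the fragment below formats the same hypothetical result once as the 64-bit value the evaluator produces and once cast back down to 32 bits (again assuming the mask-and-shift shape from the sketch above):

    using System;

    int i = unchecked((int)0xAABBCCDD);

    long asLong = (i & 0xff000000) >> 24;         // evaluated with 64-bit operands
    int asInt   = (int)((i & 0xff000000) >> 24);  // cast back down to 32 bits

    Console.WriteLine("0x" + asLong.ToString("x16")); // 0x00000000000000aa -> 16 hex digits
    Console.WriteLine("0x" + asInt.ToString("x8"));   // 0x000000aa         -> 8 hex digits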

The other expressions don't behave that way; the literals used in them are smaller than Int32.MaxValue, so they are evaluated with int arguments, producing a 32-bit result: 8 hex digits.

Notable perhaps is that the debugger's expression evaluator is close to, but not identical to, the C# compiler's. That is not an issue here, but it can matter in some cases. This may change some day when the Roslyn project finally ships.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow