Since the types of `a` and `c` have the same conversion rank, but `a` is signed and `c` is unsigned, `a` is converted to unsigned int before the division, in both `a / c` and `c / a`.
The compiler thus emits the unsigned division instruction `div` for these cases (as well as for `c / d`, where both operands are unsigned).
The multiplication `a * c` is also an unsigned multiplication. In this case the compiler can get away with using the signed multiplication instruction `imull`, because the truncated result is identical regardless of whether `mull` or `imull` is used; only the flags differ, and the generated code doesn't test them.