Question

Could someone explain to me why the following code:

#include <iostream>
#include <bitset>

int main()
{
    unsigned char i = 2;
    std::cout<<std::bitset<8>((~static_cast<unsigned char>(0)) << i)<<std::endl;
    std::cout<<std::bitset<8>((~static_cast<unsigned char>(0)) >> i)<<std::endl;
    return 0;
}

Produces:

11111100
11111111

and not:

11111100
00111111

Solution

Before ~ is applied, static_cast<unsigned char>(0) undergoes integer promotion to int, so ~ produces an int with all bits set (the value -1). That int is what gets shifted, and bitset<8> then keeps only the low 8 bits. For the left shift the low 8 bits are 11111100, as expected. For the right shift, however, the promoted int is negative, and on typical implementations a right shift of a negative value is arithmetic: it fills from the left with one-bits (this behavior is implementation-defined before C++20). The low 8 bits therefore remain all ones, giving 11111111 instead of 00111111.

OTHER TIPS

On right shifts, signed values are sign-extended on typical implementations: zero-filled if the most significant bit is 0, and one-filled if the most significant bit is 1. (Strictly speaking, right-shifting a negative value is implementation-defined before C++20.)

Using unsigned values forces zero-filling on right shifts.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow