Question

I have read that int32_t is exactly 32 bits long and int_least32_t only at least 32 bits, yet both have the same typedef in my stdint.h:

typedef int  int_least32_t;

and

typedef int  int32_t;

So where is the difference? They are exactly the same...

Solution

int32_t is a signed integer type with a width of exactly 32 bits, no padding bits, and two's complement representation for negative values; it is provided only if the implementation directly supports such a type. int_least32_t is the smallest signed integer type with a width of at least 32 bits, and the standard requires it to exist on every implementation.
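
For instance, both guarantees can be checked at compile time. A minimal sketch, assuming a C11 compiler for _Static_assert:

#include <stdint.h>
#include <limits.h>

/* int32_t has no padding bits, so its storage size in bits
   equals its width, which must be exactly 32. */
_Static_assert(sizeof(int32_t) * CHAR_BIT == 32,
               "int32_t must be exactly 32 bits");

/* int_least32_t may be wider (and may contain padding bits),
   so only a lower bound can be asserted. */
_Static_assert(sizeof(int_least32_t) * CHAR_BIT >= 32,
               "int_least32_t must be at least 32 bits");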

The typedefs that you are seeing simply mean that in your environment both of these requirements happen to be satisfied by the standard int type itself. That need not be the case in a different environment.
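
You can confirm this on your own machine by printing the storage size of each type. A minimal sketch, assuming a C99-or-later compiler for the %zu format specifier:

#include <stdint.h>
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* On an implementation where both typedefs resolve to int,
       both lines print 32; other implementations may differ. */
    printf("int32_t:       %zu bits\n", sizeof(int32_t) * CHAR_BIT);
    printf("int_least32_t: %zu bits\n", sizeof(int_least32_t) * CHAR_BIT);
    return 0;
}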

OTHER TIPS

Why do you think that on another computer, with a different processor, a different OS, and a different version of the C standard library, you would see exactly these typedefs?

These two types are exactly what you wrote: one is exactly 32 bits, the other at least 32 bits. One possible situation is that both happen to be 32 bits, and in your particular case that is what you see in stdint.h. On another system they may well be different; a hypothetical example is sketched below.
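
As a purely hypothetical illustration, consider a machine with a 36-bit word (such machines existed historically): it has no type that is exactly 32 bits wide, so its stdint.h could not define int32_t at all, while int_least32_t would still be required. This excerpt is illustrative, not from any real header:

/* Hypothetical stdint.h excerpt for a 36-bit-word machine.
   No type is exactly 32 bits wide, so int32_t is simply absent. */
typedef int int_least32_t;   /* int is 36 bits on this machine */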

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow