Question

I've got a problem here that's probably something that I'm just overlooking, but I can't understand why it's happening...

The problem I'm having is that I'm using BitConverter to get an Int16 from a 2-byte array, but for some reason whenever I do this I get the number I should get, with 0xFFFF prepended to it.

Example...

byte[] ourArray = { 0x88, 0xA3, 0x67, 0x3D };
Int16 CreationDate = BitConverter.ToInt16(new byte[] {ourArray[2], ourArray[3]} , 0);
Int16 CreationTime = BitConverter.ToInt16(new byte[] {ourArray[0], ourArray[1]}, 0);

That returns CreationDate as 0x3d67 (correct), but CreationTime as 0xffffa388.

Would anyone happen to know why this is happening, and a way to correct this?

Solution

0xA388 is a negative Int16, so converting it to Int32 gives a sign-extended negative int with the same value. The 0xFFFF you see is the sign extension (padding with '1' bits). Better to use UInt16 and UInt32.
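
To see the effect concretely, here's a minimal sketch (variable names are illustrative, not from the original post). Formatting the signed value at its own 16-bit width shows A388, but widening it to Int32 first sign-extends the bits, which is where the FFFF prefix comes from. Reading the bytes as UInt16 avoids the problem entirely:

byte[] ourArray = { 0x88, 0xA3, 0x67, 0x3D };

// Signed read: the high bit of 0xA388 is set, so the Int16 value is negative (-23672).
short signedTime = BitConverter.ToInt16(ourArray, 0);
Console.WriteLine(signedTime.ToString("X"));        // "A388" -- formatted at 16-bit width
Console.WriteLine(((int)signedTime).ToString("X")); // "FFFFA388" -- widening sign-extends

// Unsigned read: same bytes, but no sign bit to extend.
ushort unsignedTime = BitConverter.ToUInt16(ourArray, 0);
Console.WriteLine(((uint)unsignedTime).ToString("X")); // "A388" even after widening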

OTHER TIPS

0xffffa388 is not an Int16. Are you sure you're not casting it to some 32-bit type?
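
If you do end up holding the signed Int16 and need the raw 16-bit pattern back, masking after the implicit widening drops the sign-extension bits. A small sketch along those lines (variable names are hypothetical):

short creationTime = BitConverter.ToInt16(new byte[] { 0x88, 0xA3 }, 0);
// The & operator promotes creationTime to Int32 (sign-extended to 0xFFFFA388),
// then the mask clears the upper 16 bits, leaving 0x0000A388.
int masked = creationTime & 0xFFFF;
Console.WriteLine(masked.ToString("X")); // "A388"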
