Question

I've run into a problem that's probably something I'm just overlooking, but I can't understand why it's happening.

The problem is that I'm using BitConverter to get an Int16 from a 2-byte array, but whenever I do this, I get the number I expect, with 0xFFFF prepended to it.

Example...

byte[] ourArray = { 0x88, 0xA3, 0x67, 0x3D };
Int16 CreationDate = BitConverter.ToInt16(new byte[] {ourArray[2], ourArray[3]} , 0);
Int16 CreationTime = BitConverter.ToInt16(new byte[] {ourArray[0], ourArray[1]}, 0);

That returns CreationDate as 0x3D67 (correct), but CreationTime as 0xFFFFA388.

Does anyone know why this is happening, and how to correct it?


Solution

0xA388 is a negative Int16 (-23672). When it is widened to an Int32, it is sign-extended: the upper 16 bits are filled with copies of the sign bit ('1' bits), which is the 0xFFFF you see. If these fields are meant to be unsigned values, use UInt16 (and UInt32) instead.
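A minimal sketch of the unsigned fix, using the questioner's byte array. `BitConverter.ToUInt16` reads the same two bytes but as an unsigned value, so no sign extension can occur when the result is later widened or formatted:

```csharp
using System;

class Program
{
    static void Main()
    {
        byte[] ourArray = { 0x88, 0xA3, 0x67, 0x3D };

        // ToUInt16 interprets the bytes (little-endian) as unsigned,
        // so widening the result never replicates a sign bit.
        UInt16 creationDate = BitConverter.ToUInt16(ourArray, 2); // 0x3D67
        UInt16 creationTime = BitConverter.ToUInt16(ourArray, 0); // 0xA388

        Console.WriteLine("0x{0:X4}", creationDate); // prints 0x3D67
        Console.WriteLine("0x{0:X4}", creationTime); // prints 0xA388
    }
}
```

Note that the overloads taking a start index also avoid allocating the temporary two-element arrays in the original code.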

Other tips

0xFFFFA388 is not an Int16; an Int16 only has 16 bits. Are you sure you're not casting it to a 32-bit type somewhere?
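This sketch reproduces the symptom: the Int16 itself formats as four hex digits, and the extra 0xFFFF only appears once the value is implicitly widened to a 32-bit int:

```csharp
using System;

class SignExtensionDemo
{
    static void Main()
    {
        // The raw bytes 0x88, 0xA3 read little-endian as Int16 0xA388,
        // which is the negative value -23672.
        short creationTime = BitConverter.ToInt16(new byte[] { 0x88, 0xA3 }, 0);

        // Implicit widening to int sign-extends: the upper 16 bits are
        // filled with copies of the sign bit (1).
        int widened = creationTime;

        Console.WriteLine(creationTime.ToString("X4")); // prints A388
        Console.WriteLine(widened.ToString("X8"));      // prints FFFFA388
    }
}
```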

License: CC-BY-SA with attribution
Not affiliated with StackOverflow