I was passing an integer rather than a long. An integer is half the number of bytes, so the output was doubled: the same set of bytes was being written twice!
Decoding DataOutputStream Byte array - Bit-Shift producing odd results
Question
I've been looking at the Java source in order to work out how Java encodes and decodes byte arrays when using a DataOutputStream in conjunction with a DataInputStream. (I'm writing a framework in C# which will be able to decode the output of a Java DataOutputStream.)
The code for encoding a long into a stream of bytes is:
public final void writeLong(long v) throws IOException {
    writeBuffer[0] = (byte)(v >>> 56);
    writeBuffer[1] = (byte)(v >>> 48);
    writeBuffer[2] = (byte)(v >>> 40);
    writeBuffer[3] = (byte)(v >>> 32);
    writeBuffer[4] = (byte)(v >>> 24);
    writeBuffer[5] = (byte)(v >>> 16);
    writeBuffer[6] = (byte)(v >>> 8);
    writeBuffer[7] = (byte)(v >>> 0);
    out.write(writeBuffer, 0, 8);
    incCount(8);
}
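As a sanity check on the method above, here is a short self-contained sketch (the class name WriteLongDemo and the test value are my own) that encodes a known long and prints the resulting bytes, showing that each byte of the value lands in order, most significant first:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class WriteLongDemo {
    // Encode a long with DataOutputStream and return the raw big-endian bytes.
    static byte[] encode(long v) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream dos = new DataOutputStream(bos);
        dos.writeLong(v);
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Each byte of the value appears in order, most significant first.
        for (byte b : encode(0x0102030405060708L)) {
            System.out.print((b & 255) + " "); // mask so the byte prints unsigned
        }
        System.out.println(); // prints: 1 2 3 4 5 6 7 8
    }
}
```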
I am aware that this uses a series of zero-fill right shifts (>>>) to split the long into 8 bytes.
To convert these back, DataInputStream does the following:
public final long readLong() throws IOException {
    readFully(readBuffer, 0, 8);
    return (((long)readBuffer[0] << 56) +
            ((long)(readBuffer[1] & 255) << 48) +
            ((long)(readBuffer[2] & 255) << 40) +
            ((long)(readBuffer[3] & 255) << 32) +
            ((long)(readBuffer[4] & 255) << 24) +
            ((readBuffer[5] & 255) << 16) +
            ((readBuffer[6] & 255) << 8) +
            ((readBuffer[7] & 255) << 0));
}
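For the read side, a minimal round-trip sketch (the class name ReadLongDemo is my own) that mirrors the formula above and checks it against DataOutputStream's output:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class ReadLongDemo {
    // Rebuild a long from 8 big-endian bytes, mirroring DataInputStream.readLong.
    // The (long) casts on the top five terms matter: without them the shifts
    // would be performed in 32-bit int arithmetic and overflow.
    static long decode(byte[] b) {
        return (((long) b[0] << 56) +
                ((long) (b[1] & 255) << 48) +
                ((long) (b[2] & 255) << 40) +
                ((long) (b[3] & 255) << 32) +
                ((long) (b[4] & 255) << 24) +
                ((b[5] & 255) << 16) +
                ((b[6] & 255) << 8) +
                (b[7] & 255));
    }

    public static void main(String[] args) throws IOException {
        long v = -1234567890123456789L;
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new DataOutputStream(bos).writeLong(v);
        System.out.println(decode(bos.toByteArray()) == v); // prints: true
    }
}
```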
Implementing these two methods (seen here in my ideone script: code) seems to double the long. (You can probably see that I have removed the five (long) casts from the read method, as these were causing the bit shift to happen in the wrong place.)
My question is: as I am struggling to get my head around these bit-shift operations, what is happening here? Is the way I have implemented it somehow different from the Java source?
Solution
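A minimal sketch of the pitfall (class and method names are my own): in both Java and C#, the shift count for a 32-bit int is reduced modulo 32, so if the value arrives as an int instead of a long, v >>> 56 actually behaves as v >>> 24 and the same four low bytes are emitted twice:

```java
public class ShiftPitfall {
    // Bug reproduction: if the value arrives as a 32-bit int, a shift count of
    // 56 is reduced modulo 32 (both Java and C# do this), so it really shifts
    // by 24 and the writer emits the low four bytes twice.
    static int wrongTopByte(long v) {
        int truncated = (int) v;   // only the low 32 bits survive: 0x05060708
        return truncated >>> 56;   // 56 % 32 == 24, so this is really >>> 24
    }

    static int rightTopByte(long v) {
        return (int) (v >>> 56);   // 64-bit shift keeps the true top byte
    }

    public static void main(String[] args) {
        long v = 0x0102030405060708L;
        System.out.println(Integer.toHexString(wrongTopByte(v))); // prints: 5
        System.out.println(Integer.toHexString(rightTopByte(v))); // prints: 1
    }
}
```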