Question

I'm having trouble communicating between two devices over a socket (UDP, but that's not the issue here). One device is Android, the other is iOS.

Here's the Android code that builds the data I'm sending:

int part = 1;
int num = -1;

ByteBuffer bb = ByteBuffer.allocate(20);
bb.order(ByteOrder.nativeOrder());
bb.putChar('R');
bb.putInt(part);
bb.putInt(num);
bb.flip();

byte[] toSend = new byte[bb.remaining()]; // bb.remaining() = 10
bb.get(toSend);

send(toSend);

On iOS, here's the code I use to parse the data I get from the socket:

char *sData = (char *)[data bytes];
if (sData[0] == 'R') {
    sData += sizeof(char);
    int part = 0;
    memcpy(&part, sData, sizeof(int));
    sData += sizeof(int);
    int num = 0;
    memcpy(&num, sData, sizeof(int));
}

The data I get is :

part = 256
num = -256

The weird thing is if I change value from the Android device I get :

part = 0 & num = -1 ---> part = 0 & num = -256
part = 1 & num = -1 ---> part = 256 & num = -256
part = 2 & num = -1 ---> part = 512 & num = -256
part = 3 & num = -1 ---> part = 768 & num = -256

As you can see, the values are correct, they are just multiplied by 256, and I can't understand why...

To avoid compatibility issues between Android and iOS (32-bit vs. 64-bit) I avoided the long type, because its size differs:

Android: long = 8 bytes
iOS 32-bit: long = 4 bytes
iOS 64-bit: long = 8 bytes

Can anybody see why my value is being multiplied by 256?

EDIT 1 :

An iOS-to-iOS data transfer works fine, so my guess is that the bug is on the Android side.


Solution

In Java, a char is 2 bytes (UTF-16), so your buffer contains this:

char 2 bytes | int 4 bytes | int 4 bytes

Since a C char on iOS is one byte, you are reading the buffer as if it were:

char 1 byte | int 4 bytes | int 4 bytes

On a little-endian device, putChar('R') writes the bytes 0x52 0x00. After consuming a single byte for 'R', every subsequent read starts one byte early, on that stray 0x00. Shifting a little-endian int one byte toward the low end multiplies it by 256, which is exactly the pattern you observed: part = 1 becomes 256, and num = -1 (0xFFFFFFFF) becomes 0xFFFFFF00, i.e. -256.

This also explains why iOS -> iOS works.
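You can reproduce the off-by-one read entirely in Java. The sketch below hard-codes LITTLE_ENDIAN (an assumption: that is what nativeOrder() resolves to on ARM Android devices) and then re-reads the buffer the way the iOS code does, skipping only one byte for the marker:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class LayoutDemo {
    public static void main(String[] args) {
        ByteBuffer bb = ByteBuffer.allocate(20);
        bb.order(ByteOrder.LITTLE_ENDIAN); // what nativeOrder() gives on ARM
        bb.putChar('R'); // Java char is 2 bytes: 0x52 0x00 in little-endian
        bb.putInt(1);    // part
        bb.putInt(-1);   // num
        bb.flip();

        byte[] sent = new byte[bb.remaining()]; // 10 bytes: 2 + 4 + 4
        bb.get(sent);

        // Re-read the way the iOS parser does: skip only ONE byte for 'R'
        ByteBuffer read = ByteBuffer.wrap(sent).order(ByteOrder.LITTLE_ENDIAN);
        read.get();               // consumes 0x52 ('R'), leaves the stray 0x00
        int part = read.getInt(); // starts one byte early
        int num = read.getInt();  // also shifted by one byte
        System.out.println(part + " " + num); // prints "256 -256"
    }
}
```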

[Edit]

To fix it, you could write the marker as a single byte in Java.

byte b = (byte) 'R';

Like this:

ByteBuffer bb = ByteBuffer.allocate(20);
bb.order(ByteOrder.nativeOrder());
bb.put((byte) 'R');
bb.putInt(part);
bb.putInt(num);
bb.flip();

Note, though, that casting char to byte "loses" the high 8 bits, so this will only work for ASCII characters.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow