Question

I'm developing a network application using the Winsock API. Whenever I test the application by sending a few bytes, for example in an array:

char buf[5];
buf[0] = '1';
buf[1] = '2';
buf[2] = '3';
buf[3] = '4';
buf[4] = '\0';

I use the send and recv socket calls, and when I receive the bytes in another application running on localhost, the byte order doesn't seem to change at all. Is that just because I'm testing against a localhost server? Will network byte order change the array to:

buf[0] = '\0';
buf[1] = '4';
buf[2] = '3';
buf[3] = '2';
buf[4] = '1';

if I receive it over the Internet?


Solution

8-bit character arrays and strings are not affected by network byte order issues, because each individual char is a single byte and the chars are already sequenced one after another in the array. It's the shorts, ints, and longs that are impacted.
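For instance, here is a minimal sketch (assuming a connected Winsock SOCKET named s, with error handling omitted) of sending the questioner's char array as-is, with no conversion needed:

// A char buffer is just a sequence of single bytes, so it can be sent
// unmodified; each byte arrives in the same position it was sent from.
char buf[5] = { '1', '2', '3', '4', '\0' };
send(s, buf, sizeof(buf), 0);   // s is assumed to be a connected SOCKET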

Also, it's not the send and recv calls that do any byte swapping. Byte swapping is related to how the processor lays out integers in memory:

To understand byte order issues, consider the following code on an Intel chip:

// Code that demonstrates how integer values are laid out in memory
#include <stdio.h>

int main(void)
{
    int msg = 0x01020304;
    char* buffer = (char*)&msg;
    printf("%d %d %d %d\n", buffer[0], buffer[1], buffer[2], buffer[3]);
    return 0;
}

The result that gets printed out is:

4 3 2 1

What this shows is that x86 processors order the bytes of an integer in memory in the opposite order you might expect: the least significant byte of the integer comes first in memory. This is Little Endian.

On a SPARC chip, such as in an old Sun workstation, that same code prints out:

1 2 3 4

This is Big Endian.

It's only when you write data out to disk or to the network that the endianness of the memory layout matters. When another computer reads that byte sequence, the CPU reading the integer will interpret it based on its own endianness.

Endianness typically matters when writing integers to a file or streaming them over the network. Without a conversion by the application developer, the raw buffer passed to send() gets transmitted exactly as the processor lays the bytes out in memory. See htonl and friends for converting ints and shorts to and from Big Endian (a.k.a. network byte order).
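As a minimal sketch (again assuming a connected Winsock SOCKET named s and omitting error checks), the conversion on both ends looks something like this:

#include <winsock2.h>

// Convert a 32-bit value to network (big-endian) order before sending,
// then back to the host's native order after receiving.
u_long wire = htonl(0x01020304);
send(s, (const char*)&wire, sizeof(wire), 0);

u_long incoming;
recv(s, (char*)&incoming, sizeof(incoming), 0);
u_long value = ntohl(incoming);   // value == 0x01020304 regardless of endianness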

Other tips

Endianness only comes into play when you look at a byte stream and decide to interpret the data.

For example if you've got the byte stream:

'1' '2' '3' '4' '\0'

And you decide to interpret the data as a stream of characters, then, assuming one byte per character, you've got an array containing the string "1234".

However, if you decide to interpret the data as an int, then endianness comes into play, because more than one byte is used to encode the data type (int). The bytes of the above stream have the hex values:

0x31 0x32 0x33 0x34 0x00

(0x31 is the ASCII value of '1' in hex, and so on)

This is where the endianness comes into play. You need to know the endianness of the data in the stream so that you can successfully convert it back to the right value when you deserialize it.
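As a small sketch (assuming the sender serialized the value in network byte order), deserializing the first four bytes of such a stream back into an integer might look like this:

#include <winsock2.h>
#include <string.h>

// Rebuild a 32-bit integer from four raw bytes received off the wire,
// assuming the sender wrote them in network (big-endian) order.
unsigned long read_u32(const char* buf)
{
    u_long net;
    memcpy(&net, buf, sizeof(net));   // copy the raw bytes
    return ntohl(net);                // convert to the host's byte order
}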

License: CC-BY-SA with attribution