Question

I am sending data between a C TCP socket server and a C# TCP client.

From the C# client, the data being sent is an array of the .NET Framework type System.Byte, which is an unsigned 8-bit integer.

From the C server, the data being sent is an array of the C type char, which is also a single 8-bit byte (though, unlike System.Byte, whether plain char is signed or unsigned is implementation-defined in C).

From what I have read, endianness is an issue when dealing with integers of 16 bits or more, i.e. when a value spans more than one byte, the order of those bytes is either little-endian or big-endian.

Since I am only transmitting 8-bit arrays, do I need to worry about endianness? I haven't been able to find a clear answer so far.

Thank you.

Solution

Your intuition is correct: endianness is irrelevant for 8-bit integers; it only comes into play for types that are wider than one byte.
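
To make that concrete, here is a minimal C sketch of the sending side, assuming a POSIX sockets environment and an already-connected descriptor sock (the names send_bytes and send_u32 are illustrative, not from the question). Byte arrays go out as-is; only the multi-byte value needs a byte-order conversion.

    #include <stddef.h>
    #include <stdint.h>
    #include <sys/socket.h>  /* send() */
    #include <arpa/inet.h>   /* htonl() */

    /* An array of 8-bit values has one byte per element, so the
     * receiver sees the bytes in exactly the order they were sent.
     * No endianness conversion is needed. */
    void send_bytes(int sock, const uint8_t *data, size_t len)
    {
        send(sock, data, len, 0);          /* error handling omitted */
    }

    /* A 32-bit value spans four bytes, so byte order matters.
     * Convert to network byte order (big-endian) before sending. */
    void send_u32(int sock, uint32_t value)
    {
        uint32_t wire = htonl(value);      /* host -> network order */
        send(sock, &wire, sizeof wire, 0); /* error handling omitted */
    }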

OTHER TIPS

The hardware takes care of bit endianness; more precisely, bit endianness is defined at the physical layer. For instance, if your packets are transmitted over Ethernet, each octet is transmitted least-significant bit first. Either way, once the data has been received, the physical layer reassembles the bytes exactly as you sent them.

Since you are working at a higher layer of the protocol stack, you only have to care about byte endianness, which means an 8-bit integer cannot give you any problems.
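
As a sketch of what that means on the receiving side (again assuming POSIX sockets and a connected descriptor sock; recv_u16 is an illustrative name): an 8-bit array can be used exactly as it arrives, while a wider value must be converted back from network byte order.

    #include <stdint.h>
    #include <sys/socket.h>  /* recv() */
    #include <arpa/inet.h>   /* ntohs() */

    /* A 16-bit value arrives in network byte order (big-endian);
     * convert it to the host's byte order before using it.
     * For a buffer of single bytes there is no such step: each
     * byte is already complete and order-independent. */
    uint16_t recv_u16(int sock)
    {
        uint16_t wire = 0;
        recv(sock, &wire, sizeof wire, 0); /* error handling omitted */
        return ntohs(wire);                /* network -> host order */
    }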

There are two kinds of endianness:

  • byte-endianness: the order of bytes within a multi-byte value
  • bit-endianness: the order of bits within a byte

Most of the time, when people say endianness, they mean byte-endianness. That is because bit-endianness is handled below the level at which software operates, whereas byte-endianness varies across systems.

With byte-endianness, you don't have to worry about data that is one byte wide, just as you suspected.
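
A short, self-contained C sketch can show why: copying a multi-byte integer into a byte array exposes the host's byte order, while a single byte has no internal order to expose (the program below is illustrative only).

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    int main(void)
    {
        uint32_t value = 0x11223344;
        uint8_t bytes[sizeof value];

        /* Copy the 4-byte integer into a byte array to observe
         * how this particular host orders its bytes in memory. */
        memcpy(bytes, &value, sizeof value);

        if (bytes[0] == 0x44)
            puts("little-endian host: least significant byte first");
        else if (bytes[0] == 0x11)
            puts("big-endian host: most significant byte first");

        /* A uint8_t occupies exactly one byte, so there is no
         * byte order to disagree about: it reads back the same
         * on every host. */
        return 0;
    }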

I can't say exactly what role bit-endianness plays in the TCP protocol itself, but from the other answers here it looks like you don't have to worry about bit-endianness at the transport layer.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow