Question

My issue is that when I'm streaming a continuous stream of data over a local LAN, random bytes sometimes get lost in the process.

As it is right now, the code is set up to stream about 1027 bytes roughly 40 times a second over the LAN, and sometimes (very rarely) one or more of the bytes are lost.

The thing that baffles me is that the byte isn't actually "lost" — it is just set to 0 regardless of the original data. (I'm using TCP, by the way.)

Here's the sending code:

    public void Send(byte[] data)
    {
        if (!server)
        {
            if (CheckConnection(serv))
            {
                serv.Send(BitConverter.GetBytes(data.Length));
                serv.Receive(new byte[1]);
                serv.Send(data);
                serv.Receive(new byte[1]);
            }
        }
    }

and the receiving code:

    public byte[] Receive()
    {
        if (!server)
        {
            if (CheckConnection(serv))
            {
                byte[] TMP = new byte[4];
                serv.Receive(TMP);
                TMP = new byte[BitConverter.ToInt32(TMP, 0)];
                serv.Send(new byte[1]);
                serv.Receive(TMP);
                serv.Send(new byte[1]);
                return TMP;
            }
            else return null;
        }
        else return null;
    }

The sending and receiving of the empty bytes is just to keep the system roughly in sync. Personally, I think the problem lies on the receiving side of the system, but I haven't been able to prove that yet.


Solution

Just because you give Receive(TMP) a 4-byte array does not mean it is going to fill that array with 4 bytes. The Receive call is allowed to put anywhere between 1 and TMP.Length bytes into the array. You must check the returned int to see how many bytes of the array were actually filled.

Network connections are stream-based, not message-based. Any bytes you put on the wire are simply concatenated into one big queue and read on the other side as they become available. So if you send the two arrays 1,1,1,1 and 2,2,2,2, it is entirely possible that on the receiving side you call Receive three times with a 4-byte array and get:

  • 1,1,0,0 (Receive returned 2)
  • 1,1,2,2 (Receive returned 4)
  • 2,2,0,0 (Receive returned 2)

So what you need to do is look at the value returned from Receive and keep looping until your byte array is full.

byte[] TMP = new byte[4];

//loop till all 4 bytes are read
int offset = 0;
while(offset < TMP.Length)
{
    offset += serv.Receive(TMP, offset, TMP.Length - offset, SocketFlags.None);
}
TMP = new byte[BitConverter.ToInt32(TMP, 0)];

//I don't understand why you are doing this, it is not necessary.
serv.Send(new byte[1]); 

//Reset the offset then loop till TMP.Length bytes are read.
offset = 0;
while(offset < TMP.Length)
{
    offset += serv.Receive(TMP, offset, TMP.Length - offset, SocketFlags.None);
}

//I don't understand why you are doing this, it is not necessary.
serv.Send(new byte[1]);

return TMP;

Lastly, you said "the network stream confuses you". I am willing to bet the above issue is one of the things that confused you, and going to a lower level will not remove those complexities. If you want these complex parts handled for you so you don't have to deal with them, you will need to use a 3rd-party library that takes care of them internally.
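As a sketch of how the read-until-full loop can be reused for both the 4-byte length prefix and the payload, here is a small extension method. The name `ReceiveExact` is my own invention, not part of the original code or the .NET API:

```csharp
using System;
using System.Net.Sockets;

static class SocketExtensions
{
    // Keep calling Receive until exactly count bytes have been read.
    // Receive returning 0 means the remote side closed the connection,
    // so we surface that as an error instead of looping forever.
    public static byte[] ReceiveExact(this Socket socket, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = socket.Receive(buffer, offset, count - offset, SocketFlags.None);
            if (read == 0)
                throw new SocketException((int)SocketError.ConnectionReset);
            offset += read;
        }
        return buffer;
    }
}
```

With that in place, the receive side collapses to two calls and the sync bytes can be dropped entirely:

```csharp
byte[] lengthPrefix = serv.ReceiveExact(4);
byte[] data = serv.ReceiveExact(BitConverter.ToInt32(lengthPrefix, 0));
```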

Licensed under: CC-BY-SA with attribution