Question

I am implementing a C# application which reads binary data from a microcontroller at a high baud rate (8 MegaBaud) through a USB-serial adapter (FTDI FT232H). The problem is that when the stream contains 0x1A, a large chunk of data (thousands of bytes) is sometimes lost after this byte.

I found on the forums that 0x1A is a special character (EOFCHAR) and that Windows handles it specially. However, the suggested workaround of decompiling SerialStream and changing EOFCHAR to another byte value doesn't help me, because I need to use the entire byte range (0..255).

I created a small test application to reproduce the problem. It repeatedly sends the same byte between two serial adapters connected to the same computer, with the FT232R's TX line wired to the FT232H's RX line:

using System;
using System.IO.Ports;
using System.Collections.Generic;
using System.Text;

namespace SerialPort_0x1A_Loss_Test
{
    class Program
    {
        static void Main(string[] args)
        {
            const byte BYTE_FILL = 0x1A;    // 0x1A is EOFCHAR
            const int  BAUD_RATE = 3000000; // 3 MegaBaud
            const int  BUFF_SIZE = 1000000;

            SerialPort sp1_FT232R = 
               new SerialPort("COM3", BAUD_RATE, Parity.None, 8, StopBits.One);
            SerialPort sp2_FT232H = 
               new SerialPort("COM6", BAUD_RATE, Parity.None, 8, StopBits.One);

            sp1_FT232R.Encoding = Encoding.GetEncoding(1252);
            sp1_FT232R.WriteBufferSize        = 20000000;
            sp1_FT232R.Open();

            sp2_FT232H.Encoding = Encoding.GetEncoding(1252);
            sp2_FT232H.ReadBufferSize         = 20000000;
            sp2_FT232H.ReceivedBytesThreshold = 20000000;
            sp2_FT232H.Open();

            byte[] bufferTx = new byte[BUFF_SIZE];
            for (int i = 0; i < BUFF_SIZE; i++)
            {
                bufferTx[i] = BYTE_FILL;
            }

            Console.WriteLine("Sending ...");
            sp1_FT232R.Write(bufferTx, 0, BUFF_SIZE);
            Console.WriteLine("Sending finished. " +
                "Press a key to view status, ESC to exit.");

            // Receiving may not have finished yet,
            // so query the status with a keypress
            while (Console.ReadKey(true).Key != ConsoleKey.Escape)
            {
                Console.WriteLine("TOTAL RX = " + sp2_FT232H.BytesToRead);
            }

            // A second test, using .Read() call
            // This will be executed after pressing ESC for the previous test
            int totalRX_Read_Call = 0;
            var listBufferRx = new List<byte>();
            int btr; // BytesToRead
            while ( (btr = sp2_FT232H.BytesToRead) > 0)
            {
                var bufferRx = new byte[btr];
                totalRX_Read_Call += sp2_FT232H.Read(bufferRx, 0, btr);
                listBufferRx.AddRange(bufferRx);
                Console.WriteLine("totalRX_Read_Call = " + totalRX_Read_Call + 
                    ";  listBufferRx.Count = " + listBufferRx.Count);
            }
            Console.ReadKey();

            sp1_FT232R.Close();
            sp2_FT232H.Close();
        }
    }
}

Test results (BUFF_SIZE = 1000000 for all tests):

1.  BYTE_FILL = 0x55; BAUD_RATE = 3000000; TOTAL RX = 1000000 ( no loss)
2.  BYTE_FILL = 0x1A; BAUD_RATE = 3000000; TOTAL RX =  333529 (66% loss)
3.  BYTE_FILL = 0x1A; BAUD_RATE = 2000000; TOTAL RX =  627222 (37% loss)
4.  BYTE_FILL = 0x1A; BAUD_RATE = 1000000; TOTAL RX = 1000000 ( no loss)

Also, the CPU load (on an i7-4770K at 4 GHz) is high (over 30%) for tests 2, 3, and 4, but low (3%) for test 1. I also repeated test 1 with every other byte value (0x00..0x19, 0x1B..0xFF) and there is no loss.

Do you know if there is a solution for this? Thank you very much!


Solution

First of all, I can't explain why receiving that character would cause bytes to be lost. While you might think the problem is setting the EOF character, that doesn't quite make sense.

The documentation for the DCB (device control block) structure says that EofChar is "The value of the character used to signal the end of data", but it does not say what that means, and I can find no other reference to the mysterious EofChar. Furthermore, the same page says this about the fBinary member: "If this member is TRUE, binary mode is enabled. Windows does not support nonbinary mode transfers, so this member must be TRUE."

What's the connection? Well, kb101419 states how it used to work in 16-bit Windows:

    fBinary - If fBinary is set to zero, reception of the EofChar
        character indicates the end of the input stream. ReadComm()
        will not return any characters past EofChar. If any characters
        are received after EofChar, it will be treated as overflowing
        the receive queue (CE_RXOVER). The reception of EofChar is
        indicated in the COMSTAT status flag CSTF_EOF. If fBinary is
        set to one, the EofChar character has no special meaning.

In other words, EofChar is only used when fBinary is zero, but Windows no longer supports that mode, so it seems that EofChar is simply ignored.
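If you want to verify these DCB fields on your own machine, you can read the DCB back from an open port handle with GetCommState. Here is a minimal sketch; the struct is hand-translated from winbase.h (the C bitfield that contains fBinary is collapsed into a single Flags value, bit 0), and the handle can be obtained via reflection as shown at the end of this answer:

using System;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class DcbInspector
{
    // Win32 DCB (winbase.h); the bitfield (fBinary, fParity, ...) is collapsed
    // into a single 32-bit Flags member, where bit 0 is fBinary.
    [StructLayout(LayoutKind.Sequential)]
    struct DCB
    {
        public uint DCBlength;
        public uint BaudRate;
        public uint Flags;
        public ushort wReserved;
        public ushort XonLim;
        public ushort XoffLim;
        public byte ByteSize;
        public byte Parity;
        public byte StopBits;
        public byte XonChar;
        public byte XoffChar;
        public byte ErrorChar;
        public byte EofChar;
        public byte EvtChar;
        public ushort wReserved1;
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GetCommState(SafeFileHandle hFile, ref DCB lpDCB);

    // Pass the SafeFileHandle of an open serial port (see the reflection
    // snippet at the end of this answer for how to obtain it).
    public static void Dump(SafeFileHandle portHandle)
    {
        var dcb = new DCB { DCBlength = (uint)Marshal.SizeOf(typeof(DCB)) };
        if (!GetCommState(portHandle, ref dcb))
            throw new InvalidOperationException(
                "GetCommState failed, error " + Marshal.GetLastWin32Error());

        Console.WriteLine("fBinary = {0}", dcb.Flags & 1);    // always 1 on Win32
        Console.WriteLine("EofChar = 0x{0:X2}", dcb.EofChar); // reportedly 0x1A when opened via SerialPort
        Console.WriteLine("EvtChar = 0x{0:X2}", dcb.EvtChar); // reportedly 0x1A when opened via SerialPort
    }
}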

So what is causing 0x1A to be treated as a special character? The DCB has another member called EvtChar, defined as "The value of the character used to signal an event." When this character is received on the port, the port's event is signaled and the EV_RXFLAG bit is set for that port. The SerialData enum defines Eof = NativeMethods.EV_RXFLAG, which is why there is some confusion as to what EOF means.
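You can observe this mapping from managed code without touching any native APIs: a DataReceived handler reports SerialData.Eof whenever a 0x1A byte arrives, even though the stream is pure binary. A minimal sketch, assuming the receiving port from the test program (COM6):

using System;
using System.IO.Ports;

class RxFlagDemo
{
    static void Main()
    {
        // COM6 is the receiving FT232H port from the question's test program.
        using (var port = new SerialPort("COM6", 3000000, Parity.None, 8, StopBits.One))
        {
            port.DataReceived += (sender, e) =>
            {
                // SerialData.Eof is the managed name for EV_RXFLAG: it fires when
                // the EvtChar (0x1A) is received, not at any real "end of file".
                if (e.EventType == SerialData.Eof)
                    Console.WriteLine("EV_RXFLAG: received a 0x1A byte");
                else if (e.EventType == SerialData.Chars)
                    Console.WriteLine("EV_RXCHAR: {0} byte(s) buffered", port.BytesToRead);
            };

            port.Open();
            Console.WriteLine("Listening; press any key to exit.");
            Console.ReadKey(true);
        }
    }
}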

OK, but that doesn't explain why that character causes data loss. I couldn't find any documentation about it, but my guess is that when the EvtChar is received on the port, the event is signaled and no more data is buffered until that event is cleared. At low data rates the event is cleared before another byte is received, so nobody ever noticed the problem before. At high data rates, potentially thousands of bytes may be received during this period. And if this is actually an aspect of the serial port driver, the behavior may be hard to reproduce on other systems.

Now the problem is how to disable this behavior. The SerialStream class always sets the EvtChar to 0x1A, but that doesn't matter, because changing it to a different byte value just moves the problem rather than fixing it. I believe the actual problem is caused by calling SetCommMask with the EV_RXFLAG bit (0x0002) set. Unfortunately, SerialStream always sets this bit regardless of whether you are listening for the event, which means the driver always has to signal it.

I suspect you can solve your problem by calling SetCommMask with that bit cleared. The default mask set by SerialStream is 0x1FB (all bits except EV_TXEMPTY). Since EV_RXFLAG is 0x0002, you can pass in 0x1F9 to clear it.
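For reference, here is the arithmetic written out with the standard event-mask bits (the constant values are taken from winbase.h; the class wrapper and names like WITHOUT_RXFLAG are just for this sketch):

static class CommEventBits
{
    // Event mask bits from winbase.h (SetCommMask / WaitCommEvent)
    public const int EV_RXCHAR  = 0x0001;  // any character was received
    public const int EV_RXFLAG  = 0x0002;  // the event character (EvtChar) was received
    public const int EV_TXEMPTY = 0x0004;  // the last character was sent
    public const int EV_CTS     = 0x0008;
    public const int EV_DSR     = 0x0010;
    public const int EV_RLSD    = 0x0020;
    public const int EV_BREAK   = 0x0040;
    public const int EV_ERR     = 0x0080;
    public const int EV_RING    = 0x0100;

    // SerialStream's default mask: every bit above except EV_TXEMPTY
    public const int SERIALSTREAM_DEFAULT = 0x1FB;

    // The mask to request instead: the same set minus EV_RXFLAG
    public const int WITHOUT_RXFLAG = SERIALSTREAM_DEFAULT & ~EV_RXFLAG;  // 0x1F9
}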

The P/Invoke signature for SetCommMask is:

using Microsoft.Win32.SafeHandles;
using System.Reflection;
using System.Runtime.InteropServices;

[DllImport("Kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
static extern bool SetCommMask(
    SafeFileHandle hFile,
    int dwEvtMask
);

To get the hFile you will have to use reflection to read the private _handle field of the receiving port's BaseStream (sp2_FT232H in the test program; the port must already be open, otherwise BaseStream throws):

var _handle = (SafeFileHandle)sp2_FT232H.BaseStream.GetType()
              .GetField("_handle", BindingFlags.NonPublic | BindingFlags.Instance)
              .GetValue(sp2_FT232H.BaseStream);
SetCommMask(_handle, 0x1F9);
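Putting it together, one way to package the workaround is a small extension method applied to the receiving port right after Open(). The method name is mine, and _handle is an internal implementation detail of SerialStream, so treat this as a sketch rather than a supported API; it also needs re-applying every time the port is reopened, because Open() creates a new BaseStream (and a new handle):

using System;
using System.IO.Ports;
using System.Reflection;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class SerialPortExtensions
{
    [DllImport("Kernel32.dll", SetLastError = true)]
    static extern bool SetCommMask(SafeFileHandle hFile, int dwEvtMask);

    const int EV_RXFLAG    = 0x0002;
    const int DEFAULT_MASK = 0x1FB;   // SerialStream's default (all events except EV_TXEMPTY)

    // Hypothetical helper: clears EV_RXFLAG so the driver no longer signals an
    // event for every 0x1A byte. Call it on an OPEN port, because BaseStream
    // (and therefore the handle) only exists after Open().
    public static void DisableRxFlagEvent(this SerialPort port)
    {
        var baseStream = port.BaseStream;
        var handle = (SafeFileHandle)baseStream.GetType()
            .GetField("_handle", BindingFlags.NonPublic | BindingFlags.Instance)
            .GetValue(baseStream);

        if (!SetCommMask(handle, DEFAULT_MASK & ~EV_RXFLAG))   // 0x1F9
            throw new InvalidOperationException(
                "SetCommMask failed, error " + Marshal.GetLastWin32Error());
    }
}

In the test program above that would be:

sp2_FT232H.Open();
sp2_FT232H.DisableRxFlagEvent();   // hypothetical helper from the sketch above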

OTHER TIPS

You might consider using a different encoding. I like to use the windows-1252 encoding, as I frequently handle small (100-byte) data streams from 8-bit MCUs.

http://msdn.microsoft.com/en-us/library/aa332096(v=vs.71).aspx
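For what it's worth, the Encoding property only affects the string-based APIs (ReadExisting, ReadLine, Write(string)); the byte-oriented Read/Write calls used in the test program ignore it entirely. A small sketch of the string path, assuming the same receiving port (COM6):

using System;
using System.IO.Ports;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        using (var port = new SerialPort("COM6", 115200, Parity.None, 8, StopBits.One))
        {
            // windows-1252 is a single-byte code page, so each received byte
            // decodes to a single char in the string returned by ReadExisting().
            port.Encoding = Encoding.GetEncoding(1252);
            port.Open();

            Console.WriteLine("Press any key to read what has arrived so far.");
            Console.ReadKey(true);

            string text = port.ReadExisting();
            Console.WriteLine("Received {0} characters", text.Length);
        }
    }
}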

Licensed under: CC-BY-SA with attribution