First of all, I can't explain why receiving that character would cause bytes to be lost. While you might think the problem is setting the EOF character, that doesn't quite make sense.
The documentation for the DCB (device control block) structure says that EofChar is "The value of the character used to signal the end of data", but it does not explain what that means, and I can find no other reference to the mysterious EofChar anywhere. Furthermore, the same page says this about the fBinary member: "If this member is TRUE, binary mode is enabled. Windows does not support nonbinary mode transfers, so this member must be TRUE."
What's the connection? Well, kb101419 states how it used to work in 16-bit Windows:
fBinary - If fBinary is set to zero, reception of the EofChar character indicates the end of the input stream. ReadComm() will not return any characters past EofChar. If any characters are received after EofChar, it will be treated as overflowing the receive queue (CE_RXOVER). The reception of EofChar is indicated in the COMSTAT status flag CSTF_EOF. If fBinary is set to one, the EofChar character has no special meaning.
In other words, EofChar is only used when fBinary is zero, yet Windows no longer supports that mode, so it seems that EofChar is ignored.
So what is causing 0x1A to be treated as a special character? The DCB has another member called EvtChar, defined as "The value of the character used to signal an event". When this character is received on the port, the port's event is signaled and the EV_RXFLAG bit is set for that port. The SerialData enum defines Eof = NativeMethods.EV_RXFLAG, which is why there is some confusion as to what EOF means here.
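You can see this naming leak from managed code: the DataReceived event reports SerialData.Eof (numerically 0x02, the same value as EV_RXFLAG) whenever the EvtChar arrives, even though no end-of-file is involved. A small sketch, assuming port stands in for any open SerialPort such as sp1_FT232R:

using System;
using System.IO.Ports;

// Hypothetical handler, only to show where SerialData.Eof surfaces.
port.DataReceived += (sender, e) =>
{
    if (e.EventType == SerialData.Eof)   // Eof == EV_RXFLAG == 0x02
    {
        // The EvtChar (0x1A by default) was just received.
        Console.WriteLine("EvtChar received - reported as 'Eof'");
    }
};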
OK, but that doesn't explain why that character causes data loss. I couldn't find any documentation about it, but my guess is that when the EvtChar is received on the port, the event is signaled and no more data is buffered until that event is cleared. At low data rates the event is cleared before another byte is received, so nobody ever noticed the problem before. At high data rates, potentially thousands of bytes may be received during this period. And if this is actually an aspect of the serial port driver, the behavior may be hard to reproduce on other systems.
Now the problem is how to disable this behavior. The SerialStream class always sets the EvtChar to 0x1A, but that doesn't matter: changing it to a different byte would just move the problem rather than fix it. I believe the actual problem is caused by calling SetCommMask with the EV_RXFLAG bit (0x0002) set. Unfortunately this flag is always set regardless of whether you are listening for the event, which means the driver always has to signal it.
I suspect that you can solve your problem by calling SetCommMask with that bit cleared. The default mask for SerialStream is 0x1FB (all bits except EV_TXEMPTY). Since EV_RXFLAG is 0x002, you can pass in 0x1F9 to clear it.
The P/Invoke signature for SetCommMask is:
using Microsoft.Win32.SafeHandles;
using System.Runtime.InteropServices;

// SetCommMask replaces the port's entire event mask, so pass every bit
// you still want signaled, minus EV_RXFLAG.
[DllImport("Kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
static extern bool SetCommMask(
    SafeFileHandle hFile,
    int dwEvtMask
);
To get the hFile you will have to use reflection to get the _handle field of sp1_FT232R.BaseStream:
using System.Reflection;

// _handle is the SerialStream's private SafeFileHandle for the COM port.
var _handle = (SafeFileHandle)sp1_FT232R.BaseStream.GetType()
    .GetField("_handle", BindingFlags.NonPublic | BindingFlags.Instance)
    .GetValue(sp1_FT232R.BaseStream);
SetCommMask(_handle, 0x1F9);
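If this works for you, it may be worth wrapping up so it can be reused. This is only a sketch of how I would package it (the class and method names are mine, and _handle is an undocumented implementation detail that could change between framework versions):

using System;
using System.IO.Ports;
using System.Reflection;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class SerialPortEvtCharWorkaround
{
    // 0x1FB (SerialStream's default mask) with EV_RXFLAG (0x002) removed.
    const int MaskWithoutRxFlag = 0x1F9;

    [DllImport("Kernel32.dll", SetLastError = true)]
    static extern bool SetCommMask(SafeFileHandle hFile, int dwEvtMask);

    // Call this after port.Open(); BaseStream does not exist before then.
    public static void DisableRxFlagEvent(SerialPort port)
    {
        var handle = (SafeFileHandle)port.BaseStream.GetType()
            .GetField("_handle", BindingFlags.NonPublic | BindingFlags.Instance)
            .GetValue(port.BaseStream);

        if (!SetCommMask(handle, MaskWithoutRxFlag))
            throw new InvalidOperationException(
                "SetCommMask failed, error " + Marshal.GetLastWin32Error());
    }
}

You would call it right after opening the port, e.g. SerialPortEvtCharWorkaround.DisableRxFlagEvent(sp1_FT232R);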