You could certainly put that in a sequential file and load it at startup. 50 MB will come off the disk in less than a second. And even if you had to parse it as a text file, you should be able to create the table in another second. 5 million records just isn't that large when you're processing them with a 2 GHz (or faster) processor.
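As a rough sketch, the startup load might look something like this. The `start,end` CSV-style layout, the `Range` struct, and the `RangeLoader` name are all placeholders for whatever your file actually contains:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Assumed record shape: one "start,end" pair per line.
public readonly record struct Range(int Start, int End);

public static class RangeLoader
{
    public static List<Range> Load(string path)
    {
        // Pre-size the list so 5 million Adds don't trigger repeated re-allocations.
        var ranges = new List<Range>(5_000_000);
        foreach (var line in File.ReadLines(path)) // streams the file; no 50 MB string in memory
        {
            var parts = line.Split(',');
            ranges.Add(new Range(int.Parse(parts[0]), int.Parse(parts[1])));
        }
        // Binary search requires sorted data; skip this if the file is already sorted.
        ranges.Sort((a, b) => a.Start.CompareTo(b.Start));
        return ranges;
    }
}
```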
Binary search of the list is O(log n), so the maximum number of probes you'll do per search is ⌈log₂ 5,000,000⌉ = 23. That's gonna be pretty darned quick.
It should be easy enough to load test something like this. Just spin it up and then see how long it takes to do, say, 1,000,000 lookups. Something like:
var clock = Stopwatch.StartNew();
for (int i = 0; i < NumIterations; ++i)
{
    int val = GetRandomValueToSearchFor(); // however you do that
    Ranges.BinarySearch(val, RangeComparer);
}
clock.Stop();
// time per iteration is clock.Elapsed.TotalMilliseconds / NumIterations
That'll let you figure out the absolute fastest you can query the thing. I suspect you'll be fine with thousands of transactions per second.