Question

I'm using Observable.Interval to test how well a particular piece of client/server code performs at different loads.

But it seems to have some odd behaviour.

  • Observable.Interval(timespan = 0) produces events as quickly as possible, e.g. 8 million events per second. This seems ok.
  • Observable.Interval(0 < timespan < 1ms) produces only 1 event, and then nothing.
  • Observable.Interval(1ms <= timespan) produces events at approximately the requested rate, but heavily quantised and capped at a maximum of about 64 events per second.

I can appreciate that it's not necessarily using high-resolution timers underneath, but what is confusing is that the behaviour is so completely different in the three regions.

Is this expected behaviour, or am I using it wrong? If it is expected, then is there an alternative to Observable.Interval for simulating high frequency event sources in Rx, or should I just roll my own...?

A short program that demonstrates the behaviour is below:

using System;
using System.Reactive.Linq;
using System.Threading;

static void Main(string[] args)
{
    const int millisecsPerTest = 10000;

    var intervals = new[]
    {
        TimeSpan.FromTicks(0),          // 0 -> rate of 8M messages per second
        TimeSpan.FromTicks(1000),       // 0.1ms -> 1 event, then nothing (rate ~0)
        TimeSpan.FromTicks(20000),      // 2ms -> rate of 64 messages per second (not 500 as expected)
        TimeSpan.FromTicks(1000000),    // 100ms -> rate of 9 messages per second
    };

    foreach(var interval in intervals)
    {
        long msgs = 0;
        using (Observable.Interval(interval).Subscribe(
            l => { ++msgs; },
            e => Console.WriteLine("Error {0}", e.Message),
            () => Console.WriteLine("Completed")))
        {
            Thread.Sleep(millisecsPerTest);
        }

        Console.WriteLine("Interval: {0} ticks, Events: {1}, Rate: {2} events per second", interval.Ticks, msgs, (int)(msgs/(double)millisecsPerTest*1000));
    }
}

Solution

Yes, I think the clock used by .NET Framework timers only ticks at around 16 ms intervals (the default Windows timer resolution is roughly 15.6 ms), so your results do not surprise me. Your sub-millisecond (1,000-tick) interval test does sound like a bug, though. Which OS, Rx version, and .NET version are you using? I'll see if I can reproduce the problem.
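
You can see that quantisation directly by timing the callbacks from a plain System.Threading.Timer asked for a 1 ms period; on a default Windows clock they tend to arrive roughly 15.6 ms apart. A minimal sketch (the method name is mine, not part of any library):

using System;
using System.Diagnostics;
using System.Threading;

static void MeasureTimerResolution()
{
    var sw = Stopwatch.StartNew();
    long last = 0;

    // Request a 1 ms period; on a default Windows clock the callbacks
    // typically arrive quantised to roughly 15.6 ms instead.
    using (new Timer(_ =>
    {
        long now = sw.ElapsedMilliseconds;
        Console.WriteLine("Since last callback: {0} ms", now - last);
        last = now;
    }, null, 0, 1))
    {
        Thread.Sleep(200);
    }
}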

For the 0-tick case, I think you are getting high throughput because the Rx scheduler detects that the work is due "now" and launches it immediately, bypassing the .NET timers entirely.

It is fairly easy to use Observable.Create to build your own version of Interval that uses a higher-resolution timer; a sketch follows below. A bit more complex, but ultimately more useful, would be to write a new IScheduler implementation backed by a high-resolution timer. You could then pass that scheduler to all of the existing time-based Rx operators.
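
For example, here is a minimal sketch of the Observable.Create approach, using Stopwatch and a busy-wait loop on a dedicated thread to get sub-millisecond spacing. The HighResInterval name and the spin-wait strategy are my own choices rather than anything built into Rx, and burning a core like this really only makes sense in a test harness:

using System;
using System.Diagnostics;
using System.Reactive.Disposables;
using System.Reactive.Linq;
using System.Threading.Tasks;

static IObservable<long> HighResInterval(TimeSpan period)
{
    return Observable.Create<long>(observer =>
    {
        var cancel = new CancellationDisposable();
        var token = cancel.Token;

        Task.Factory.StartNew(() =>
        {
            var sw = Stopwatch.StartNew();
            long count = 0;

            while (!token.IsCancellationRequested)
            {
                // Spin until the next tick is due; Stopwatch has far finer
                // resolution than the ~16 ms OS timer clock.
                var due = TimeSpan.FromTicks(period.Ticks * (count + 1));
                while (sw.Elapsed < due && !token.IsCancellationRequested)
                {
                    // busy-wait
                }

                if (token.IsCancellationRequested) break;
                observer.OnNext(count++);
            }
        }, token, TaskCreationOptions.LongRunning, TaskScheduler.Default);

        return cancel;
    });
}

Substituting HighResInterval for Observable.Interval in the test program above should give rates much closer to the requested ones, at the cost of one spinning thread per subscription.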

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow