Question

I have a Worker Role which processes items off a queue. It is basically an infinite loop which pops items off of the queue and asynchronously processes them.

I have two configuration settings (PollingInterval and MessageGetLimit) which I want the worker role to pick up when changed (so with no restart required).

private TimeSpan PollingInterval 
{
    get
    {
        return TimeSpan.FromSeconds(Convert.ToInt32(RoleEnvironment.GetConfigurationSettingValue("PollingIntervalSeconds")));
    }
}

private int MessageGetLimit 
{ 
    get 
    {
        return Convert.ToInt32(RoleEnvironment.GetConfigurationSettingValue("MessageGetLimit"));
    } 
}

public override void Run()
{
    while (true)
    {
        var messages = queue.GetMessages(MessageGetLimit);

        if (messages.Count() > 0)
        {
            ProcessQueueMessages(messages);
        }
        else
        {
            Task.Delay(PollingInterval).Wait(); // Task.Delay alone returns a Task without blocking; Wait() makes the loop actually pause
        }
    }
}

Problem:

During peak hours, the while loop could be running a couple of times per second. This means that it would be querying the config items up to 100,000 times per day.

Is this detrimental or inefficient?


Solution 2

Upfront disclaimer: I haven't used RoleEnvironment myself.

The MSDN documentation for GetConfigurationSettingValue states that the configuration is read from disk: http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.serviceruntime.roleenvironment.getconfigurationsettingvalue.aspx. So calling it this frequently is bound to be slow.

The MSDN documentation also shows that an event is fired when a setting changes: http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.serviceruntime.roleenvironment.changed.aspx. You can use this event to reload the settings only when they have actually changed.

Here is one (untested, not compiled) approach.

private TimeSpan mPollingInterval;
private int mMessageGetLimit;

public override void Run()
{
    // Refresh the configuration members only when they change.
    RoleEnvironment.Changed += RoleEnvironmentChanged;

    // Initialize them for the first time
    RefreshRoleEnvironmentSettings();

    while (true)
    {
        var messages = queue.GetMessages(mMessageGetLimit);

        if (messages.Count() > 0)
        {
            ProcessQueueMessages(messages);
        }
        else
        {
            Task.Delay(mPollingInterval).Wait(); // block until the delay elapses so the loop actually pauses
        }
    }
}

private void RoleEnvironmentChanged(object sender, RoleEnvironmentChangedEventArgs e)
{
    RefreshRoleEnvironmentSettings();    
}

private void RefreshRoleEnvironmentSettings()
{
    mPollingInterval = TimeSpan.FromSeconds(Convert.ToInt32(RoleEnvironment.GetConfigurationSettingValue("PollingIntervalSeconds")));
    mMessageGetLimit = Convert.ToInt32(RoleEnvironment.GetConfigurationSettingValue("MessageGetLimit"));
}

Other Tips

John's answer is a good one: using the Environment Changing/Changed events lets you modify your settings without restarts. But I think an even better method is to use an exponential back-off policy to make your polling more efficient. By making the code's behavior smarter on its own, you reduce how often you need to go in and tweak it. Remember that each time you update these environment settings, the change has to be rolled out to all of the instances, which can take a little time depending on how many instances you have running. You are also adding a step that requires a human to be involved.

You are using Windows Azure Storage Queues, which means each time GetMessages executes it makes a call to the service and retrieves 0 or more messages (up to your MessageGetLimit). Each of those calls is billed as a transaction. Now, understand that transactions are really cheap. Even 100,000 transactions a day is $0.01/day. However, don't underestimate the speed of a loop. :) You may get more throughput than that, and if you have multiple worker role instances this adds up (though it will still be a really small amount of money compared to actually running the instances themselves).

A more efficient path is to adopt an exponential back-off approach to reading messages off the queue. Check out this post by Maarten for a simple example: http://www.developerfusion.com/article/120619/advanced-scenarios-with-windows-azure-queues/. Couple a back-off approach with auto-scaling of the worker roles based on queue depth and you'll have a solution that relies far less on a human adjusting settings. Put in minimum and maximum values for instance counts, adjust the number of messages to pull based on whether the previous poll actually returned any, and so on. There are a lot of options here that will reduce your involvement while keeping the system efficient.
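For illustration, here is a minimal, untested sketch of that kind of back-off loop, written in the same style as the code above and reusing the mMessageGetLimit field from John's answer. The MinimumDelay and MaximumDelay values are assumptions for the example, not recommendations; the idea is simply to double the wait after every empty poll and reset it as soon as work appears.

private static readonly TimeSpan MinimumDelay = TimeSpan.FromSeconds(1); // assumed lower bound
private static readonly TimeSpan MaximumDelay = TimeSpan.FromMinutes(5); // assumed upper bound

public override void Run()
{
    var currentDelay = MinimumDelay;

    while (true)
    {
        var messages = queue.GetMessages(mMessageGetLimit);

        if (messages.Count() > 0)
        {
            ProcessQueueMessages(messages);

            // Work was found, so poll again immediately next time around.
            currentDelay = MinimumDelay;
        }
        else
        {
            // Queue was empty: wait, then back off exponentially up to the maximum.
            Task.Delay(currentDelay).Wait();
            currentDelay = TimeSpan.FromTicks(Math.Min(currentDelay.Ticks * 2, MaximumDelay.Ticks));
        }
    }
}

With something like this in place, a busy queue is drained as fast as before, while an idle queue is only hit once every few minutes instead of a couple of times per second.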

Also, you might look at Windows Azure Service Bus Queues: they implement long polling, so waiting for work to hit the queue results in far fewer transactions.
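For a rough idea of what that looks like, here is a sketch assuming the older Microsoft.ServiceBus.Messaging client library; the connection string and queue name are placeholders you would substitute with your own. A single Receive call blocks on the server for up to the timeout you pass, so an idle queue costs one request per timeout window rather than one per loop iteration.

using Microsoft.ServiceBus.Messaging;

// Placeholder connection string and queue name -- substitute your own values.
var client = QueueClient.CreateFromConnectionString(
    "Endpoint=sb://yournamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
    "work-items");

while (true)
{
    // Long poll: the service holds the request open for up to 60 seconds
    // and returns null only if nothing arrived in that window.
    BrokeredMessage message = client.Receive(TimeSpan.FromSeconds(60));

    if (message != null)
    {
        ProcessMessage(message); // hypothetical handler for your work item
        message.Complete();      // remove the message from the queue
    }
}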

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow