Question

While I realise there are a lot of questions like this out there, I couldn't find one that applies specifically to my case, so I figured I'd take the risk of duplicating to get the answer to my question...

Bit of background:

I have a master object (ActionsMaster) that contains a list of child objects as a property and handles the high-level methods. My "Processing Form" is going to handle the code that applies each action in this list to an environment, but since the calls to apply an action can be multi-threaded/support more than one call at a time, I figured I'd be efficient and spawn about 5 or 10 threads at once (think thousands of actions here... I want this to be fast).

So in order to support that, I've added a Status property to my child object type, to say whether it's been handed off to a thread for action or is free to use, and I've added an object to my ActionsMaster to use for locking purposes.
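
Roughly, the relevant pieces look like this (simplified down to just the members that matter for the question):

    internal enum Action_Status
    {
        Free,
        Queued
    }

    internal class ActionFileAction
    {
        // Nullable so an action that has never been touched still reads as "free"
        internal Action_Status? Status { get; set; }
        // ...the actual action details are omitted here
    }

    internal class ActionsMaster
    {
        // Dedicated lock object used by GetNextFreeAction below
        private readonly object _IsLocked = new object();

        internal List<ActionFileAction> Actions { get; set; }
        // GetNextFreeAction() is shown in the next snippet
    }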

The code so far:

Here is the code I use to determine whether an action is available, and this is where my question comes from: can I return the object inside the lock block, or do I have to account for the fact that it'll never reach the end of the lock block if I return there? I need to lock before I run the LINQ statement, otherwise two threads may get the same result... here's what I have:

    internal ActionFileAction GetNextFreeAction()
    {
        lock (_IsLocked)
        {
            // FirstOrDefault returns null when no free action is left
            ActionFileAction action = this.Actions.FirstOrDefault(a => a.Status == null || a.Status == Action_Status.Free);
            if (action != null)
                action.Status = Action_Status.Queued;

            return action;
        }
    }

I would call this inside a background worker like so:

    void bkg_Automatic_DoWork(object sender, DoWorkEventArgs e)
    {
        //check for user cancel...
        if (this.CancelJobs || this.IsAbort)
        {
            e.Cancel = true;
        }
        else
        {
            if (!this.IsFinished)
            {
                ActionFileAction action = this.TheFile.GetNextFreeAction();
                if (action == null)
                {
                    this.IsFinished = true;
                }
                else
                {
                    //PROCESS THE ACTION HERE...
                }
            }
        }
        e.Result = true;
    }

And just to be thorough, here's my RunWorkerCompleted code:

    void bkg_Automatic_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
    {
        if (e.Cancelled)
        {
            GenericLogEntry ent = new GenericLogEntry();
            ent.Area = "SYSTEM";
            ent.Action = "USER ABORT";
            ent.Description = this.CancelJobs == true ? "CLOSED FORM" : "PUSHED ABORT BUTTON";
            ent.Stamp = DateTime.Now;
            ent.Status = LogEntryStatus.Fail;
            UpdateLogView(ent);
        }
        else
        {
            if (this.IsFinished)
            {
                //add all processed actions to the deployment results, if not already done.
            }
            else
            {
                //spawn new thread from this one's END... 
                BackgroundWorker bkg1 = new BackgroundWorker();
                bkg1.DoWork += new DoWorkEventHandler(bkg_Automatic_DoWork);
                bkg1.RunWorkerCompleted += new RunWorkerCompletedEventHandler(bkg_Automatic_RunWorkerCompleted);
                bkg1.WorkerSupportsCancellation = true;
                bkg1.RunWorkerAsync();
            }
        }
    }

Is this the proper way to do what i am trying to accomplish?

Solution

The Task Parallel Library comes with v4 of the .NET Framework and will handle all of this for you. There's a little overhead in learning how to use it, but ultimately I think you'll prefer learning a tried, tested and standard library to spending time making it yourself.

Basically you can just hand it tasks to execute and it will deal with queuing them and running them on threads in a pool.
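
For instance, something along these lines could replace the manual worker chaining entirely (just a sketch: ProcessAction is a hypothetical method standing in for the per-action work, and it assumes the ActionsMaster exposes its Actions list; Parallel.ForEach lives in System.Threading.Tasks):

    // Sketch only: ProcessAction stands in for whatever the per-action work is
    void RunAllActions(ActionsMaster theFile)
    {
        var options = new ParallelOptions
        {
            // Cap the number of concurrent workers at the core count
            MaxDegreeOfParallelism = Environment.ProcessorCount
        };

        Parallel.ForEach(theFile.Actions, options, action =>
        {
            // Each action is handed to exactly one worker, so the Status
            // flag and the manual lock are no longer needed for dispatching
            ProcessAction(action);
        });
    }

Parallel.ForEach only returns once every action has been processed, which also does away with the IsFinished bookkeeping.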

Here is an MSDN tutorial on using the TPL, although the code in it isn't so clear. Here's another article that might be clearer.

Other tips

Offloading work to multiple threads isn't always the answer. When trying to move work to background threads, you should always consider how many cores your machine has: firing up 5-10 threads on a machine with 2 cores will cause a lot of context switching, which may end up degrading the performance of your code.

Now, to the solution. If you're targeting .NET 4.0 or above, I'd suggest you go with a ConcurrentQueue in combination with the TPL, and your code can look like this:

    private ConcurrentQueue<ActionFileAction> _actionFileQueue = new ConcurrentQueue<ActionFileAction>();

    internal ActionFileAction GetNextFreeAction(CancellationToken cancellationToken = default(CancellationToken))
    {
        // Check whether cancellation was requested
        cancellationToken.ThrowIfCancellationRequested();

        ActionFileAction actionFileToHandle;
        _actionFileQueue.TryDequeue(out actionFileToHandle);

        // May return null if nothing could be dequeued
        return actionFileToHandle;
    }

    // C# method names should be PascalCase, without underscores
    void HandleAllActionFiles()
    {
        try
        {
            var cancellationToken = new CancellationTokenSource().Token;
            Parallel.Invoke(
                new ParallelOptions
                {
                    MaxDegreeOfParallelism = Environment.ProcessorCount,
                    CancellationToken = cancellationToken
                },
                () => GetNextFreeAction());
        }
        catch (OperationCanceledException)
        {
            // Do stuff with cancellation
        }
    }
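
One thing the snippet above doesn't show is how the queue gets filled in the first place. ConcurrentQueue<T> has a constructor that accepts an IEnumerable<T>, so seeding it before the parallel work starts is a one-liner (this.Actions is just an assumption about where the original list lives):

    // Copy the existing action list into the queue before starting the workers
    _actionFileQueue = new ConcurrentQueue<ActionFileAction>(this.Actions);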

Of course you still have to handle retries and keep checking the queue to see whether an item is waiting, but this could get you started :)
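
If you would rather not poll the queue, one alternative (not part of the answer above) is to wrap it in a BlockingCollection<T>, which blocks consumers until an item arrives or the collection is marked complete. A rough sketch, assuming the same ActionFileAction type:

    // BlockingCollection wraps a ConcurrentQueue by default
    private BlockingCollection<ActionFileAction> _actionFiles = new BlockingCollection<ActionFileAction>();

    void ConsumeActionFiles(CancellationToken cancellationToken)
    {
        // GetConsumingEnumerable blocks while the collection is empty and
        // ends once CompleteAdding() has been called and everything is drained
        foreach (ActionFileAction action in _actionFiles.GetConsumingEnumerable(cancellationToken))
        {
            // PROCESS THE ACTION HERE...
        }
    }

Start one consumer per core (for example with Task.Factory.StartNew) and call _actionFiles.CompleteAdding() once every action has been enqueued.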

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow