Question

Main aspect of the question: It's about iOS. Can I somehow dispatch code blocks so that they will all (a) run in the background and (b) run on the same thread? I want to run some time-consuming operations in the background, but these have to run on the same thread, because they involve resources that must not be shared among threads.

Further technical details, if required: It's about implementing an sqlite plugin for Apache Cordova, a framework for HTML5 apps on mobile platforms. The plugin should be an implementation of WebSQL in terms of Cordova's plugin API. (That means it's not possible to wrap entire transactions within single blocks, which would make everything easier.)

Here is some code from Cordova's Docs:

- (void)myPluginMethod:(CDVInvokedUrlCommand*)command
{
    // Check command.arguments here.
    [self.commandDelegate runInBackground:^{
        NSString* payload = nil;
        // Some blocking logic...
        CDVPluginResult* pluginResult = [CDVPluginResult resultWithStatus:CDVCommandStatus_OK messageAsString:payload];
        // The sendPluginResult method is thread-safe.
        [self.commandDelegate sendPluginResult:pluginResult callbackId:command.callbackId];
    }];
}

But as far as I know, there is no guarantee that those dispatched code blocks (see runInBackground) will run on the same thread.

Solution

GCD makes no guarantee that two blocks run on the same thread, even if they belong to the same queue (with the exception of the main queue, of course). However, if you're using a serial queue (DISPATCH_QUEUE_SERIAL), this isn't a problem, as you then know that there is no concurrent access to your data.

The man page for dispatch_queue_create says:

Queues are not bound to any specific thread of execution and blocks submitted to independent queues may execute concurrently.

I'm not aware of any way to bind a queue to a specific thread (after all, not needing to care about threads is a major point of GCD). The reason why you can use a serial queue without worrying about the actual thread is this promise:

All memory writes performed by a block dispatched to a serial queue are guaranteed to be visible to subsequent blocks dispatched to the same queue.

That is, a memory barrier seems to be used.

When dealing with threading issues, your main concern is usually to prevent two threads from accessing something concurrently. If you're using a serial queue, you don't have this problem. It usually doesn't really matter which thread is accessing your resources. For example, we're using serial queues to manage Core Data access without a problem.
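
A hedged sketch of that approach (the queue label and the sharedCache resource are made up for illustration, not part of the original answer):

dispatch_queue_t resourceQueue = dispatch_queue_create("com.example.resource", DISPATCH_QUEUE_SERIAL);
NSMutableArray *sharedCache = [NSMutableArray array];

// Both blocks run in the background and never concurrently with each
// other, although GCD may use a different thread for each of them.
dispatch_async(resourceQueue, ^{
    [sharedCache addObject:@"some result"];
});
dispatch_async(resourceQueue, ^{
    NSLog(@"%lu cached entries", (unsigned long)[sharedCache count]);
});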

Edit:

It seems you really found a rare case where you need to be working on the same thread. You could implement your own worker thread:

  • Prerequisites:
    • An NSMutableArray (let's call it blockQueue).
    • An NSCondition (let's call it queueCondition).
  • Create a new NSThread.
    • The thread's method has an endless loop in which it locks the condition, waits on it while the queue is empty (and the thread hasn't been cancelled), dequeues a block and executes it.
  • A method that locks the condition and enqueues the block.

Due to the condition, the thread will simply sleep while there's no work to do.

So, roughly (untested, assuming ARC):

- (void)startWorkerThread
{
    workerThread = [[NSThread alloc]
        initWithTarget:self
        selector:@selector(threadMain)
        object:nil
    ];
    [workerThread start];
}

- (void)threadMain
{
    void (^block)();
    NSThread *currentThread;

    currentThread = [NSThread currentThread];

    while (1) {
        [queueCondition lock];
        {
            while ([blockQueue count] == 0 && ![currentThread isCancelled]) {
                [queueCondition wait];
            }

            if ([currentThread isCancelled]) {
                [queueCondition unlock];
                return;
            }

            block = [blockQueue objectAtIndex:0];
            [blockQueue removeObjectAtIndex:0];
        }
        [queueCondition unlock];

        // Execute block outside the condition, since it's also a lock!
        // We want to give other threads the possibility to enqueue
        // a new block while we're executing a block.
        block();
    }
}

- (void)enqueue:(void(^)())block
{
    [queueCondition lock];
    {
        // Copy the block! IIRC you'll get strange things or
        // even crashes if you don't.
        [blockQueue addObject:[block copy]];
        [queueCondition signal];
    }
    [queueCondition unlock];
}

- (void)stopThread
{
    [queueCondition lock];
    {
        [workerThread cancel];
        [queueCondition signal];
    }
    [queueCondition unlock];
}
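
Hypothetical usage (assuming a sqlite3 *db instance variable and that sqlite3.h is imported; none of this is in the original answer): everything enqueued runs on the single worker thread, so a thread-bound handle is safe to use.

[self startWorkerThread];

[self enqueue:^{
    // Opens the handle on the worker thread; "plugin.db" is a placeholder path.
    sqlite3_open("plugin.db", &db);
}];

[self enqueue:^{
    // Runs afterwards on the same worker thread.
    sqlite3_exec(db, "PRAGMA user_version;", NULL, NULL, NULL);
}];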

Untested Swift 5 port:

var workerThread: Thread?
var blockQueue = [() -> Void]()
let queueCondition = NSCondition()

func startWorkerThread() {
    workerThread = Thread() {
        let currentThread = Thread.current
        while true {
            self.queueCondition.lock()
            while self.blockQueue.isEmpty && !currentThread.isCancelled {
                self.queueCondition.wait()
            }

            if currentThread.isCancelled {
                self.queueCondition.unlock()
                return
            }

            let block = self.blockQueue.remove(at: 0)
            self.queueCondition.unlock()

            // Execute block outside the condition, since it's also a lock!
            // We want to give other threads the possibility to enqueue
            // a new block while we're executing a block.
            block()
        }
    }
    workerThread?.start()
}

func enqueue(_ block: @escaping () -> Void) {
    queueCondition.lock()
    blockQueue.append(block)
    queueCondition.signal()
    queueCondition.unlock()
}

func stopThread() {
    queueCondition.lock()
    workerThread?.cancel()
    queueCondition.signal()
    queueCondition.unlock()
}

OTHER TIPS

In GCD: no, that's not possible with the current libdispatch.

Blocks can be executed by libdispatch on whatever thread is available, no matter which queue they have been dispatched to.

One exception is the main queue, which always executes its blocks on the main thread.

Please file a feature request with Apple, since it seems justified and sound. But I fear it's not feasible; otherwise it would already exist ;)

You can use NSOperationQueue. Setting its maxConcurrentOperationCount to 1 makes the queue serial, so operations run one at a time in the background (note that, like GCD, this doesn't pin them to one particular thread).
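
A minimal sketch (the block contents are placeholders):

NSOperationQueue *queue = [[NSOperationQueue alloc] init];
queue.maxConcurrentOperationCount = 1; // serial: one operation at a time

[queue addOperationWithBlock:^{
    // First operation, runs in the background.
}];
[queue addOperationWithBlock:^{
    // Starts only after the first operation has finished.
}];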

Create a serial dispatch queue and dispatch all the calls to it. All the calls will be performed in the background and strictly one after another (though, as noted above, not necessarily on the same thread).
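
Applied to the Cordova method from the question, a rough sketch could look like this (dbQueue is an assumed ivar created with dispatch_queue_create and DISPATCH_QUEUE_SERIAL):

- (void)myPluginMethod:(CDVInvokedUrlCommand*)command
{
    dispatch_async(dbQueue, ^{
        // Runs in the background, never concurrently with other blocks
        // on dbQueue -- but not necessarily on one fixed thread.
        CDVPluginResult* pluginResult = [CDVPluginResult resultWithStatus:CDVCommandStatus_OK];
        [self.commandDelegate sendPluginResult:pluginResult callbackId:command.callbackId];
    });
}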

If you want to perform a selector on the main thread, you can use

- (void)performSelectorOnMainThread:(SEL)aSelector withObject:(id)arg waitUntilDone:(BOOL)wait

and if you want it to run on a background thread

- (void)performSelectorInBackground:(SEL)aSelector withObject:(id)object
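
For example, with hypothetical -refreshUI and -loadData methods (names assumed for illustration):

[self performSelectorOnMainThread:@selector(refreshUI)
                       withObject:nil
                    waitUntilDone:NO];

[self performSelectorInBackground:@selector(loadData)
                       withObject:nil];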

and if you want to run it on any other queue, use GCD (Grand Central Dispatch), for example:

double delayInSeconds = 2.0;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInSeconds * NSEC_PER_SEC));
dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
    // code to be executed on the main queue after the delay
});

You can either use NSOperationQueue with a maxConcurrentOperationCount of 1 or go the manual way by using the NSThread model (rather than Grand Central Dispatch). With the latter, I would recommend implementing a worker method which runs in a thread and pulls work packages (or commands) from a queue or pool that is fed from outside the thread. Just make sure you use locks / mutexes / synchronisation.

Never tried this, but it might do the trick: use a separate atomic dispatch queue property for each operation.

    @property (strong, atomic) dispatch_queue_t downloadQueue;

Queue/Thread 1 for first operation

    downloadQueue = dispatch_queue_create("operation1", NULL);

etc.

Since atomic properties are thread-safe, downloadQueue should not be accessed by other threads. So it makes sure that there will be only a single thread per operation and that other threads will not access it.

Just like this:

dispatch_async(dispatch_get_current_queue(), ^{
    // work to run on the current queue
});
Licensed under: CC-BY-SA with attribution