Question

In my app, I use drawViewHierarchyInRect:afterScreenUpdates: in order to obtain a blurred image of my view (using Apple’s UIImage category UIImageEffects).

My code looks like this:

UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *im = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Use im */
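
For reference, im is then fed to one of the blur methods that Apple's UIImageEffects category adds to UIImage. A minimal sketch, assuming the applyLightEffect method from that sample category:

UIImage *blurred = [im applyLightEffect];
/* Display blurred, e.g. in an overlaid UIImageView */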

I noticed during development that many of my animations were delayed after using my app for a bit; that is, my views began their animations after a noticeable (but sub-second) pause compared to a fresh launch of the app.

After some debugging, I noticed that the mere act of using drawViewHierarchyInRect:afterScreenUpdates: with screen updates set to YES caused this delay. If this message was never sent during a session of usage, the delay never appeared. Using NO for the screen updates parameter also made the delay disappear.

The strange thing is that this blurring code is completely unrelated (as far as I can tell) to the delayed animations. The animations in question do not use drawViewHierarchyInRect:afterScreenUpdates:, they are CAKeyframeAnimation animations. The mere act of sending this message (with screen updates set to YES) seems to have globally affected animations in my app.

What’s going on?

(I have created videos illustrating the effect: with and without an animation delay. Note the delay in the appearance of the "Check!" speech bubble in the navigation bar.)

UPDATE

I have created an example project to illustrate this potential bug. https://github.com/timarnold/AnimationBugExample

UPDATE No. 2

I received a response from Apple verifying that this is a bug. See answer below.


Solution

I used one of my Apple developer support tickets to ask Apple about my issue.

It turns out it is a confirmed bug (radar number 17851775). Their hypothesis for what is happening is below:

The method drawViewHierarchyInRect:afterScreenUpdates: performs its operations on the GPU as much as possible, and much of this work will probably happen outside of your app’s address space in another process. Passing YES as the afterScreenUpdates: parameter to drawViewHierarchyInRect:afterScreenUpdates: will cause Core Animation to flush all of its buffers in your task and in the rendering task. As you may imagine, there’s a lot of other internal stuff that goes on in these cases too. Engineering theorizes that it may very well be a bug in this machinery related to the effect you are seeing.

In comparison, the method renderInContext: performs its operations inside of your app’s address space and does not use the GPU-based process for performing the work. For the most part, this is a different code path and, if it is working for you, then that is a suitable workaround. This route is not as efficient, as it does not use the GPU-based task. Also, it is not as accurate for screen captures, as it may exclude blurs and other Core Animation features that are managed by the GPU task.

And they also provided a workaround. They suggested that instead of:

UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *im = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Use im */

I should do this:

UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *im = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Use im */

Hopefully this is helpful for someone!

OTHER TIPS

I tried all the latest snapshot methods in Swift. The other methods didn't work for me on a background thread, but taking a snapshot this way did.

Create an extension that takes the view's layer and bounds as parameters:

extension UIView {
    /// Renders the given layer into an image. The layer and bounds are passed in
    /// explicitly so they can be captured on the main thread ahead of time.
    func asImage(viewLayer: CALayer, viewBounds: CGRect) -> UIImage {
        if #available(iOS 10.0, *) {
            // UIGraphicsImageRenderer handles scale and color management for us.
            let renderer = UIGraphicsImageRenderer(bounds: viewBounds)
            return renderer.image { rendererContext in
                viewLayer.render(in: rendererContext.cgContext)
            }
        } else {
            // Fallback for iOS 9 and earlier: render into a bitmap context at screen scale.
            UIGraphicsBeginImageContextWithOptions(viewBounds.size, false, 0)
            viewLayer.render(in: UIGraphicsGetCurrentContext()!)
            let image = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            return image!
        }
    }
}

Usage

DispatchQueue.main.async {
    // Capture the layer and bounds on the main thread.
    let layer = self.selectedView.layer
    let bounds = self.selectedView.bounds
    DispatchQueue.global(qos: .background).async {
        // Render on a background queue using the captured values.
        let image = self.selectedView.asImage(viewLayer: layer, viewBounds: bounds)
        // ... use image ...
    }
}

The layer and bounds must be read on the main thread; the rendering itself can then run on a background thread. This gives a smooth user experience without any lag or interruption in the UI.

Why do you have this line (from your sample app):

animation.beginTime = CACurrentMediaTime();

Just remove it, and everything will be as you want it to be.

By setting the animation's begin time explicitly to CACurrentMediaTime() you ignore possible time transformations that may be present in the layer tree. Either don't set it at all (by default animations start now) or use the time-conversion method:

animation.beginTime = [view.layer convertTime:CACurrentMediaTime() fromLayer:nil];

UIKit adds time transformations to the layer tree when you call afterScreenUpdates:YES to prevent jumps in ongoing animations that would otherwise be caused by intermediate Core Animation commits. If you want to start an animation at a specific time (not now), use the time-conversion method mentioned above.
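
For illustration, here is a minimal sketch (the key path, values, and duration are placeholders) of scheduling a CAKeyframeAnimation with its begin time converted into the layer's own timebase:

CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"position"];
animation.values = @[[NSValue valueWithCGPoint:CGPointMake(0, 0)],
                     [NSValue valueWithCGPoint:CGPointMake(100, 100)]];
animation.duration = 0.5;
// Convert "now" from the global timebase into the layer's local time,
// so any time transformation UIKit has installed is respected.
animation.beginTime = [self.view.layer convertTime:CACurrentMediaTime() fromLayer:nil];
[self.view.layer addAnimation:animation forKey:@"position"];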

And while at it, strongly prefer -[UIView snapshotViewAfterScreenUpdates:] and friends over the -[UIView drawViewHierarchyInRect:afterScreenUpdates:] family (preferably passing NO for the afterScreenUpdates part). In most cases you don't really need a persistent image; a view snapshot is what you actually want (see the sketch after the list below). Using a view snapshot instead of a rendered image has the following benefits:

  • 2x-10x faster
  • Uses 2x-3x less memory
  • It will always use the correct color space and buffer format (e.g. on devices with a wide-color screen)
  • It will use the correct scale and orientation, so you don't need to think about how to position your image so it looks good
  • It works better with accessibility features (e.g. Smart Invert Colors)
  • A view snapshot will also capture out-of-process and secure views correctly (while drawViewHierarchyInRect will render them black or white)
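
As a minimal sketch of that approach (assuming you only need a temporary visual copy of the view, not a bitmap):

// Take a live snapshot view instead of rendering a UIImage.
UIView *snapshot = [self.view snapshotViewAfterScreenUpdates:NO];
snapshot.frame = self.view.bounds;
[self.view addSubview:snapshot];
/* Use the snapshot view, e.g. as a backdrop for a transition */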

When the afterScreenUpdates parameter is set to YES, the system has to wait until all pending screen updates have happened before it can render the view.

If you're kicking off animations at the same time then perhaps the rendering and the animations are trying to happen together and this is causing a delay.

It may be worth experimenting with kicking off your animations slightly later to prevent this. Obviously not too much later, because that would defeat the purpose, but a small dispatch_after interval would be worth trying.
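
A minimal sketch of that experiment (the 0.1-second delay and the startAnimations method are placeholders, not part of the original code):

dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    // Kick off the animations slightly after the snapshot has been taken.
    [self startAnimations];
});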

Have you tried running your code on a background thread? Here's an example using GCD:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    //background thread
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
    UIImage *im = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    dispatch_sync(dispatch_get_main_queue(), ^(void) {
        //update ui on main thread
    });
});
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow