Question

I'm trying to build an eraser tool using Core Graphics, and I'm finding it incredibly difficult to make a performant eraser - it all comes down to:

CGContextSetBlendMode(context, kCGBlendModeClear)

If you google around for how to "erase" with Core Graphics, almost every answer comes back with that snippet. The problem is it only (apparently) works in a bitmap context. If you're trying to implement interactive erasing, I don't see how kCGBlendModeClear helps you - as far as I can tell, you're more or less locked into erasing on an off-screen UIImage/CGImage and drawing that image in the famously non-performant [UIView drawRect].
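
For reference, the full bitmap-context version of that snippet typically looks something like this (a sketch - `size`, `image`, `erasePath`, and the stroke width are placeholders, not names from my code):

// Canonical bitmap-context erase: this works because the context comes from
// UIGraphicsBeginImageContextWithOptions, i.e. it is backed by a bitmap.
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[image drawAtPoint:CGPointZero];
CGContextSetLineCap(ctx, kCGLineCapRound);
CGContextSetLineWidth(ctx, 20.0);
CGContextSetBlendMode(ctx, kCGBlendModeClear);
CGContextAddPath(ctx, erasePath);
CGContextStrokePath(ctx);
UIImage *erased = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();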

Here's the best I've been able to do:

-(void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        if (eraseModeOn) {
            // Render into an offscreen bitmap context, since kCGBlendModeClear
            // only clears pixels there.
            UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
            CGContextRef context = UIGraphicsGetCurrentContext();
            [eraseImage drawAtPoint:CGPointZero];
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, ERASE_WIDTH);
            CGContextSetBlendMode(context, kCGBlendModeClear);
            CGContextStrokePath(context);
            curImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [curImage drawAtPoint:CGPointZero];
        } else {
            [curImage drawAtPoint:CGPointZero];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, lineWidth);
            CGContextSetBlendMode(context, kCGBlendModeNormal);
            CGContextSetStrokeColorWithColor(context, lineColor.CGColor);
            CGContextStrokePath(context);
        }
    } else {
        [curImage drawAtPoint:CGPointZero];
    }
}

Drawing a normal line (!eraseModeOn) is acceptably performant; I'm blitting my off-screen drawing buffer (curImage, which contains all previously drawn strokes) to the current CGContext, and rendering the line (path) currently being drawn on top. It's not perfect, but hey, it works, and it's reasonably performant.

However, because kCGBlendModeClear apparently does not work outside of a bitmap context, I'm forced to:

  1. Create a bitmap context (UIGraphicsBeginImageContextWithOptions).
  2. Draw my offscreen buffer (eraseImage, which is derived from curImage when the eraser tool is turned on - so effectively the same as curImage, for argument's sake).
  3. Render the "erase line" (path) currently being drawn to the bitmap context (using kCGBlendModeClear to clear pixels).
  4. Extract the entire image into the offscreen buffer (curImage = UIGraphicsGetImageFromCurrentImageContext();)
  5. And then finally blit the offscreen buffer to the view's CGContext

That's horrible, performance-wise. Using Instruments' Time Profiler, it's painfully obvious where the problems with this method are:

  • UIGraphicsBeginImageContextWithOptions is expensive
  • Drawing the offscreen buffer twice is expensive
  • Extracting the entire image into an offscreen buffer is expensive

So naturally, the code performs horribly on a real iPad.

I'm not really sure what to do here. I've been trying to figure out how to clear pixels in a non-bitmap context, but as far as I can tell, relying on kCGBlendModeClear is a dead-end.

Any thoughts or suggestions? How do other iOS drawing apps handle erasing?


Additional Info

I've been playing around with a CGLayer approach, as it does appear that CGContextSetBlendMode(context, kCGBlendModeClear) will work in a CGLayer, based on a bit of googling I've done.

However, I'm not super hopeful that this approach will pan out. Drawing the layer in drawRect (even using setNeedsDisplayInRect) is hugely non-performant; Core Graphics chokes while rendering each path in the layer in CGContextDrawLayerAtPoint (according to Instruments). As far as I can tell, a bitmap context is definitely preferable here in terms of performance - the only problem, of course, being the above question (kCGBlendModeClear not working after I blit the bitmap context to the main CGContext in drawRect).
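
For what it's worth, here's roughly the shape of the CGLayer variant I tried (a sketch - eraseLayer would be created once with CGLayerCreateWithContext and reused across strokes):

// Erase strokes go into the layer's own context, where kCGBlendModeClear
// does clear pixels...
CGContextRef layerContext = CGLayerGetContext(eraseLayer);
CGContextAddPath(layerContext, currentPath);
CGContextSetLineCap(layerContext, kCGLineCapRound);
CGContextSetLineWidth(layerContext, ERASE_WIDTH);
CGContextSetBlendMode(layerContext, kCGBlendModeClear);
CGContextStrokePath(layerContext);

// ...and drawRect: composites the layer, which is where Instruments
// shows the time going.
CGContextDrawLayerAtPoint(UIGraphicsGetCurrentContext(), CGPointZero, eraseLayer);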

Solution

I've managed to get good results by using the following code:

- (void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        if (eraseModeOn) {
            CGContextRef context = UIGraphicsGetCurrentContext();
            // Group the snapshot and the erase stroke into one transparency
            // layer, so kCGBlendModeClear clears the image's pixels rather
            // than punching through the view itself.
            CGContextBeginTransparencyLayer(context, NULL);
            [eraseImage drawAtPoint:CGPointZero];

            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, ERASE_WIDTH);
            CGContextSetBlendMode(context, kCGBlendModeClear);
            CGContextSetStrokeColorWithColor(context, [[UIColor clearColor] CGColor]);
            CGContextStrokePath(context);
            CGContextEndTransparencyLayer(context);
        } else {
            [curImage drawAtPoint:CGPointZero];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, self.lineWidth);
            CGContextSetBlendMode(context, kCGBlendModeNormal);
            CGContextSetStrokeColorWithColor(context, self.lineColor.CGColor);
            CGContextStrokePath(context);
        }
    } else {
        [curImage drawAtPoint:CGPointZero];
    }

    self.empty = NO;
}

The trick was to wrap the following into CGContextBeginTransparencyLayer / CGContextEndTransparencyLayer calls:

  • Blitting the erase background image to the context
  • Drawing the "erase" path on top of the erase background image, using kCGBlendModeClear

Since both the erase background image's pixel data and the erase path are rendered into the same transparency layer, the Clear blend mode punches transparent holes in the image within the layer; when the layer ends, the result is composited into the view's context as a single group, which has the effect of clearing the pixels.
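
Note that the transparency layer only affects what's drawn to screen; curImage itself is untouched while the stroke is in progress. When the stroke ends, the erase still needs to be baked into curImage with one bitmap pass - but that now happens once per stroke instead of once per frame. A sketch of that commit step (assuming a touchesEnded-style hook calls it):

// Commit the finished erase stroke into the offscreen buffer. This runs
// once when the finger lifts, not on every drawRect:.
- (void)commitEraseStroke
{
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [curImage drawAtPoint:CGPointZero];
    CGContextAddPath(context, currentPath);
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, ERASE_WIDTH);
    CGContextSetBlendMode(context, kCGBlendModeClear);
    CGContextStrokePath(context);
    curImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}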

Other suggestions

2D graphics APIs follow a painting paradigm. When you're painting, it's hard to remove paint you've already put on the canvas, but very easy to add more paint on top. Blend modes in a bitmap context give you a way to do the hard thing (scrape paint off the canvas) in a few lines of code - but a few lines of code don't make it a cheap operation, which is why it performs slowly.

The easiest way to fake clearing out pixels without having to do the offscreen bitmap buffering is to paint the background of your view over the image.

-(void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        CGColorRef lineCgColor = lineColor.CGColor;
        if (eraseModeOn) {
            // Use a concrete background color to simulate erasing. You could
            // use the view's backgroundColor property, or define a color here.
            lineCgColor = [[self backgroundColor] CGColor];
        } 
        [curImage drawAtPoint:CGPointZero];
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextAddPath(context, currentPath);
        CGContextSetLineCap(context, kCGLineCapRound);
        CGContextSetLineWidth(context, lineWidth);
        CGContextSetBlendMode(context, kCGBlendModeNormal);
        CGContextSetStrokeColorWithColor(context, lineCgColor);
        CGContextStrokePath(context);
    } else {
        [curImage drawAtPoint:CGPointZero];
    }
}
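
Keep in mind that this fakes erasing rather than performing it: the stroke paints over the pixels instead of clearing them, so it only holds up if the view is opaque and sits on a solid background. If the canvas needs real transparency (for compositing over other content, or exporting an image with alpha), you still need one of the true clearing approaches above.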

The more difficult (but more correct) way is to do the image editing on a background serial queue in response to an editing event. When a new action arrives, you render it into the bitmap buffer in the background; when the buffered image is ready, you call setNeedsDisplay so the view is redrawn during the next update cycle. This is more correct because drawRect: should display the content of your view as quickly as possible, not process the editing action.

@interface ImageEditor : UIView

@property (nonatomic, strong) UIImage * imageBuffer;
@property (nonatomic, strong) dispatch_queue_t serialQueue;
@end

@implementation ImageEditor

- (dispatch_queue_t) serialQueue
{
    if (_serialQueue == nil)
    {
        _serialQueue = dispatch_queue_create("com.example.com.imagebuffer", DISPATCH_QUEUE_SERIAL);
    }
    return _serialQueue;
}

- (void)editingAction
{
    dispatch_async(self.serialQueue, ^{
        CGSize bufferSize = [self.imageBuffer size];

        UIGraphicsBeginImageContext(bufferSize);

        CGContextRef context = UIGraphicsGetCurrentContext();

        // Draw the existing buffer; UIImage's drawInRect: handles the UIKit
        // coordinate flip that a raw CGContextDrawImage would get upside down.
        [self.imageBuffer drawInRect:CGRectMake(0, 0, bufferSize.width, bufferSize.height)];

        //Do editing action, draw a clear line, solid line, etc

        self.imageBuffer = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        dispatch_async(dispatch_get_main_queue(), ^{
            [self setNeedsDisplay];
        });
    });
}
-(void)drawRect:(CGRect)rect
{
    [self.imageBuffer drawAtPoint:CGPointZero];
}

@end
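
A caller would look something like this (a sketch - a real tool would capture the current path and pen/eraser state before dispatching):

// Forward touch movement to the background editing pass; drawRect: stays
// a cheap blit of the latest buffered image.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self editingAction];
}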

The key is CGContextBeginTransparencyLayer: use clearColor for the stroke and set CGContextSetBlendMode(context, kCGBlendModeClear).
