Question

I am setting the .contents of a CALayer to a CGImage derived from drawing into an NSBitmapImageRep.

As far as I understand from the docs and WWDC videos, setting the layer's .contentsCenter to an NSRect like {{0.5, 0.5}, {0, 0}}, in combination with a .contentsGravity of kCAGravityResize, should lead to Core Animation resizing the layer by stretching the middle pixel to fill the interior, stretching the top and bottom edges horizontally, and stretching the left and right edges vertically.
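
As I understand it, .contentsCenter is given in the unit coordinate space of the image, so the zero-size rect is just the degenerate case of the usual nine-part cap rect. A minimal sketch (hypothetical layer and image sizes, not my actual code):

    // Hypothetical 30 x 30 point image with fixed 10-point caps on every side;
    // contentsCenter is expressed as fractions of the image's size.
    layer.contentsGravity = kCAGravityResize;
    layer.contentsCenter  = CGRectMake(10.0/30.0, 10.0/30.0, 10.0/30.0, 10.0/30.0);

    // A zero-size rect at (0.5, 0.5) collapses the centre region to a single
    // pixel, which is the stretch-from-the-middle-pixel case described above.
    layer.contentsCenter  = CGRectMake(0.5, 0.5, 0, 0);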

This very nearly works, but not quite. The layer resizes more-or-less correctly, but if I draw lines at the edge of the bitmap, as I resize the window the lines can be seen to fluctuate in thickness very slightly. It's subtle enough to be barely a problem until the resizing gets down to around 1/4 of the original layer's size, below which point the lines can thin and disappear altogether. If I draw the bitmaps multiple times at different sizes, small differences in line thickness are very apparent.

I originally suspected a pixel-alignment issue, but it can't be that, because the thickness of the stationary left-hand edge (for example) will fluctuate as I resize the right-hand edge. It happens on both 1x and 2x screens.


Here's some test code. It's the updateLayer method from a layer-backed NSView subclass (I'm using the alternative non-drawRect: draw path):

- (void)updateLayer {

    id image = [self imageForCurrentScaleFactor]; // CGImage 

    self.layer.contents = image;
    // self.backingScaleFactor is set from the window's backingScaleFactor
    self.layer.contentsScale = self.backingScaleFactor; 
    self.layer.contentsCenter = NSMakeRect(0.5, 0.5, 0, 0);
    self.layer.contentsGravity = kCAGravityResize;

}

And here's some test drawing code (creating the image supplied by imageForCurrentScaleFactor above):

    CGFloat width = rect.size.width;
    CGFloat height = rect.size.height;
    NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes: NULL
                                                                         pixelsWide: width * scaleFactor
                                                                         pixelsHigh: height * scaleFactor
                                                                      bitsPerSample: 8
                                                                    samplesPerPixel: 4
                                                                           hasAlpha: YES
                                                                           isPlanar: NO
                                                                     colorSpaceName: NSCalibratedRGBColorSpace
                                                                        bytesPerRow: 0
                                                                       bitsPerPixel: 0];

    [imageRep setSize:rect.size];

    [NSGraphicsContext saveGraphicsState];

    NSGraphicsContext *ctx = [NSGraphicsContext graphicsContextWithBitmapImageRep:imageRep];
    [NSGraphicsContext setCurrentContext:ctx];

    [[NSColor whiteColor] setFill];
    [NSBezierPath fillRect:rect];

    [[NSColor blackColor] setStroke];
    [NSBezierPath setDefaultLineWidth:1.0f];
    [NSBezierPath strokeRect:insetRect];

    [NSGraphicsContext restoreGraphicsState];

    // image for CALayer.contents is now [imageRep CGImage]

Solution 2

I have found a practical answer, but would be interested in comments filling in detail from anyone who knows how this works.

The problem did prove to be related to how the CALayer was being stretched. I was drawing into a bitmap of arbitrary size, on the basis that (as the CALayer docs suggest) use of a .contentsCenter with zero width and height would in effect do a nine-part-image stretch, selecting the single centre pixel as the central stretching portion. With this bitmap as a layer's .contents, I could then resize the CALayer to any desired size (down or up).

Turns out that the 'arbitrary size' was the problem. Something odd happens in the way CALayer stretches the edge portions (at least when resizing down). By instead making the initial frame for drawing tiny (i.e. just big enough to fit my outline drawing plus a couple of pixels for the central stretching portion), nothing spurious makes its way into the edges during stretching.

The bitmap stretches properly if created with a rect just big enough to fit the contents and the stretchable centre pixel, i.e.:

NSRect rect = NSMakeRect(0, 0, lineWidth * 2 + 2, lineWidth * 2 + 2);

This tiny image stretches to any larger size perfectly.
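
For completeness, here's a sketch of the whole thing put together. It follows the question's drawing code, just with the tiny source rect; lineWidth, scaleFactor and the stroke inset are assumptions standing in for the question's own values, and the layer assignments run inside the layer-backed view (e.g. in updateLayer):

    // Assumed values, standing in for the question's context.
    CGFloat lineWidth   = 1.0;
    CGFloat scaleFactor = self.window.backingScaleFactor;

    // Smallest bitmap that holds the border plus a small stretchable centre.
    NSRect rect = NSMakeRect(0, 0, lineWidth * 2 + 2, lineWidth * 2 + 2);

    NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes: NULL
                                                                         pixelsWide: rect.size.width * scaleFactor
                                                                         pixelsHigh: rect.size.height * scaleFactor
                                                                      bitsPerSample: 8
                                                                    samplesPerPixel: 4
                                                                           hasAlpha: YES
                                                                           isPlanar: NO
                                                                     colorSpaceName: NSCalibratedRGBColorSpace
                                                                        bytesPerRow: 0
                                                                       bitsPerPixel: 0];
    [imageRep setSize:rect.size];

    [NSGraphicsContext saveGraphicsState];
    [NSGraphicsContext setCurrentContext:
        [NSGraphicsContext graphicsContextWithBitmapImageRep:imageRep]];

    [[NSColor whiteColor] setFill];
    [NSBezierPath fillRect:rect];

    [[NSColor blackColor] setStroke];
    [NSBezierPath setDefaultLineWidth:lineWidth];
    // Inset by half the line width so the stroke sits fully inside the bitmap.
    [NSBezierPath strokeRect:NSInsetRect(rect, lineWidth / 2.0, lineWidth / 2.0)];

    [NSGraphicsContext restoreGraphicsState];

    // The tiny image then stretches cleanly to any layer size.
    self.layer.contents        = (__bridge id)[imageRep CGImage];
    self.layer.contentsScale   = scaleFactor;
    self.layer.contentsGravity = kCAGravityResize;
    self.layer.contentsCenter  = CGRectMake(0.5, 0.5, 0, 0);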

Other tips

The solution (if you're talking about the problem I think you're talking about) is to have a margin of transparent pixels forming the outside edges of the image. One pixel thick, all the way around, will do it. The reason is that the problem (if it's the problem I think it is) arises only with visible pixels that touch the outside edge of the image. Therefore the idea is to have no visible pixels touch the outside edge of the image.
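
In code terms, a sketch of that idea (assuming the same rect, bitmap and graphics-context setup as the question's drawing code; only the fill/stroke portion changes, and visibleRect is a name introduced here for illustration):

    // Clear the whole bitmap to transparent first ...
    NSRectFillUsingOperation(rect, NSCompositingOperationClear);

    // ... then keep every visible pixel away from the image edge.
    // A one-point inset is at least one pixel at any scale factor.
    NSRect visibleRect = NSInsetRect(rect, 1.0, 1.0);

    [[NSColor whiteColor] setFill];
    [NSBezierPath fillRect:visibleRect];

    [[NSColor blackColor] setStroke];
    [NSBezierPath strokeRect:NSInsetRect(visibleRect, 0.5, 0.5)];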
