Question

I've written an iOS app in which I'm using CGLayer quite successfully. While researching ways to squeeze a bit more performance out of this app, I came across this blog post: http://iosptl.com/posts/cglayer-no-longer-recommended/ in which the author very broadly states that CGLayer is to never be used. An individual post alone is not cause for concern, but I've also found people referring to this post as something to abide by.

No real specifics are offered. For instance, the author states that "sometimes it's faster, sometimes it's slower". This makes me wonder if the concern is that, in general, programmers will not use this object correctly.

I suppose this question is for the seasoned Cocoa/Cocoa Touch developers. Is there any merit to this? Is CGLayer indeed something to avoid and if so, are there specific, measurable reasons as to why?

Solution

To begin my answer, I would like to say up front that whether you use CGLayer in your app is entirely your own design decision.

The reality is that if you are drawing something on-screen, CGLayer will likely buy you nothing on iOS. On iOS, the basic screen-composition block is a CALayer. A CALayer draws on screen through a Quartz (CG) graphics context, which might even be a context created from a CGLayer. Since CALayer is itself hardware accelerated, it tries to cache any graphics content on the graphics card and reuse it, which is exactly the purpose CGLayer used to serve.
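To make that concrete, here is a minimal sketch (the `BadgeView` name is invented for illustration): whatever you draw in a UIView's draw method already lands in a CGContext that Core Animation manages for the view's backing CALayer, and the rendered result is cached for you without any CGLayer involvement:

```swift
import UIKit

// Hypothetical view used only for illustration.
class BadgeView: UIView {
    override func draw(_ rect: CGRect) {
        // This context is provided by the view's backing CALayer;
        // the rendered bitmap is cached and composited by Core Animation.
        guard let ctx = UIGraphicsGetCurrentContext() else { return }
        ctx.setFillColor(UIColor.red.cgColor)
        ctx.fillEllipse(in: rect.insetBy(dx: 4, dy: 4))
    }
}
```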

As for offscreen rendering, a CALayer can do that when shouldRasterize is set to YES, and under some other circumstances. Keep in mind, however, that offscreen composition is yet another task performed by the CPU before the rendered content is handed over to the GPU, so again there is no clear winner.
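For instance, rasterizing a layer is roughly the CALayer-era counterpart of that caching idea. This is only a sketch, with `complexView` standing in for whatever view is expensive to redraw:

```swift
import UIKit

// `complexView` is a placeholder for any view that is expensive to redraw.
func enableRasterization(for complexView: UIView) {
    // Cache the layer's rendered output in an offscreen bitmap...
    complexView.layer.shouldRasterize = true
    // ...at the screen's scale, so the cached bitmap isn't blurry on Retina displays.
    complexView.layer.rasterizationScale = UIScreen.main.scale
}
```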

CGLayer is particularly handy with a CG context that won't be drawn on screen, such as a PDF context.
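To illustrate (the file URL, page size, and the "stamp" drawing are invented for this example), a CGLayer lets you record a drawing once and replay it cheaply on every page of a PDF context:

```swift
import UIKit

// Sketch: draw a reusable "stamp" into a CGLayer once, then replay it on each PDF page.
func writePDF(to url: URL) {
    var mediaBox = CGRect(x: 0, y: 0, width: 612, height: 792)  // US Letter in points
    guard let pdfContext = CGContext(url as CFURL, mediaBox: &mediaBox, nil),
          let layer = CGLayer(pdfContext, size: CGSize(width: 100, height: 100), auxiliaryInfo: nil),
          let layerContext = layer.context else { return }

    // Pay the drawing cost once, into the layer.
    layerContext.setFillColor(UIColor.blue.cgColor)
    layerContext.fillEllipse(in: CGRect(x: 0, y: 0, width: 100, height: 100))

    for _ in 0..<3 {
        pdfContext.beginPDFPage(nil)
        // Replay the cached layer instead of re-issuing the drawing commands.
        pdfContext.draw(layer, at: CGPoint(x: 50, y: 50))
        pdfContext.endPDFPage()
    }
    pdfContext.closePDF()
}
```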

I'm not sure why the Apple development team would ask anyone to avoid CGLayer completely. Perhaps there is some underlying architectural flaw, but it remains undocumented to date. Until we know that for sure, and as long as we have existing apps built around CGLayer, I don't see any specific reason to abandon it completely.

Other Tips

For me the most relevant point is what the author wrote in one of his follow-up comments:

As I understand it from the Core Graphics team, they basically haven’t touched it [CGLayer] since before the iPhone came out. It was one of those things that sounded really good, but didn’t work out in practice. But it’s not actually broken, so there’s no reason to deprecate it. And as I mentioned, if you have awesome CGLayer code, I don’t see any reason to replace it. CGLayer isn’t bad. It’s just not maintained like other parts of CG.

It would be helpful if Apple's Quartz 2D Programming Guide (updated 2014) didn't contain the following prominent comment box:

Note: Bitmap graphics contexts are sometimes used for drawing offscreen. Before you decide to use a bitmap graphics context for this purpose, see Core Graphics Layer Drawing. CGLayer objects (CGLayerRef) are optimized for offscreen drawing because, whenever possible, Quartz caches layers on the video card.
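For comparison, the "bitmap graphics context" route that the note tells you to weigh against CGLayer would look roughly like this today (the 20-point dot is just a stand-in for some expensive drawing): render once into a UIImage and reuse that image, then profile it against a CGLayer cache.

```swift
import UIKit

// A stand-in "expensive" drawing, cached once as a UIImage via a bitmap context.
func makeCachedDot() -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: CGSize(width: 20, height: 20))
    return renderer.image { rendererContext in
        let cg = rendererContext.cgContext
        cg.setFillColor(UIColor.yellow.cgColor)
        cg.fillEllipse(in: CGRect(x: 0, y: 0, width: 20, height: 20))
    }
}
```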

No. Ignore the "Never" cited in the blog, unless you never profile the impact CGLayer has on your app.

Consider CGLayers a potential optimization for your program. CGLayers can affect your program's performance and resource consumption in both positive and negative ways (a tradeoff in many cases). In the abstract, a CGLayer is much like a cache, and caches have their own costs. Alternative caching mechanisms have their own associated costs too, and CGLayer may or may not be the best caching implementation for your program.
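As a sketch of that cache analogy (the `StarFieldView` name and the star count are invented for illustration), the expensive content is drawn into a CGLayer once and then stamped many times. Whether this beats redrawing directly, or an image/CALayer cache, is exactly what profiling should decide:

```swift
import UIKit

// Hypothetical view: one CGLayer is built lazily and then replayed many times per draw.
class StarFieldView: UIView {
    private var starLayer: CGLayer?

    override func draw(_ rect: CGRect) {
        guard let ctx = UIGraphicsGetCurrentContext() else { return }
        if starLayer == nil,
           let layer = CGLayer(ctx, size: CGSize(width: 20, height: 20), auxiliaryInfo: nil),
           let layerCtx = layer.context {
            // Pay the drawing cost once, into the cached layer.
            layerCtx.setFillColor(UIColor.yellow.cgColor)
            layerCtx.fillEllipse(in: CGRect(x: 0, y: 0, width: 20, height: 20))
            starLayer = layer
        }
        guard let layer = starLayer else { return }
        for _ in 0..<200 {
            // Replay the cached content instead of re-issuing the drawing commands.
            let point = CGPoint(x: CGFloat.random(in: 0...rect.maxX),
                                y: CGFloat.random(in: 0...rect.maxY))
            ctx.draw(layer, at: point)
        }
    }
}
```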
