Question

I have a drawing app and I would like my users to be able to use particle effects as part of their drawings. The point of the app is custom drawing that can be saved to the Camera Roll or shared on the web.

I encountered the CAEmitterLayer class recently, which I reckon would be a simple and effective way to add particle effects.

I have been able to draw the particles onscreen in the app using the CAEmitterLayer implementation. So rendering onscreen works fine.

When I go about rendering the contents of the drawing using

// Set up an offscreen image context, then grab its CGContext
UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
CGContextRef context = UIGraphicsGetCurrentContext();

// The instance drawingView has a CAEmitterLayer instance in its layer/view hierarchy
[drawingView.layer renderInContext:context];


// Note: I have also tried using the layer.presentationLayer and still nada

....
// Get the image from the current image context here for saving to Camera Roll or sharing

....the particles are never rendered in the image.
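For reference, a minimal Swift sketch of the same offscreen-render attempt (assuming drawingView is the UIView whose layer hierarchy contains the CAEmitterLayer); the resulting image still lacks the particles:

import UIKit

// Failing approach: render the layer tree into an offscreen image context.
UIGraphicsBeginImageContextWithOptions(drawingView.bounds.size, false, 0)
if let context = UIGraphicsGetCurrentContext() {
    // The emitter's particles never make it into this render.
    drawingView.layer.render(in: context)
}
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
// `image` could now be saved to the Camera Roll or shared, minus the particles.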

What I think is happening

The CAEmitterLayer is in a constant state of "animating" the particles. That's why when I attempt to render the layer (I have also tried rendering the layer's presentationLayer and modelLayer), the animations are never committed, and so the offscreen image render does not contain the particles.

Question: Has anyone rendered the contents of a CAEmitterLayer offscreen? If so, how did you do it?

Alternate question: Does anyone know of any particle effect libraries that don't use OpenGL and aren't Cocos2D?


Solution

-[CALayer renderInContext:] is useful in a few simple cases, but will not work as expected in more complicated situations. You will need to find some other way to do your drawing.

The documentation for -[CALayer renderInContext:] says:

The Mac OS X v10.5 implementation of this method does not support the entire Core Animation composition model. QCCompositionLayer, CAOpenGLLayer, and QTMovieLayer layers are not rendered. Additionally, layers that use 3D transforms are not rendered, nor are layers that specify backgroundFilters, filters, compositingFilter, or a mask values. Future versions of Mac OS X may add support for rendering these layers and properties.

(These limitations apply to iOS, too.)

The header CALayer.h also says:

 * WARNING: currently this method does not implement the full
 * CoreAnimation composition model, use with caution. */

Other tips

I was able to get my CAEmitterLayer rendered as an image correctly in its current animation state with

Swift

func drawViewHierarchyInRect(_ rect: CGRect,
          afterScreenUpdates afterUpdates: Bool) -> Bool



Objective-C

- (BOOL)drawViewHierarchyInRect:(CGRect)rect
             afterScreenUpdates:(BOOL)afterUpdates

called within a current image context

UIGraphicsBeginImageContextWithOptions(size, false, 0)

with afterScreenUpdates set to true / YES.
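Putting that together, a minimal Swift sketch (assuming drawingView is the UIView whose layer hierarchy contains the CAEmitterLayer; in current Swift the method is spelled drawHierarchy(in:afterScreenUpdates:)):

import UIKit

// Snapshot the view hierarchy into an image context; afterScreenUpdates: true
// forces pending updates to be committed so the emitter's current particles are captured.
UIGraphicsBeginImageContextWithOptions(drawingView.bounds.size, false, 0)
drawingView.drawHierarchy(in: drawingView.bounds, afterScreenUpdates: true)
let snapshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
// `snapshot` now includes the particles and can be saved or shared.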

Good luck with that one :D

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow