Question

I'm trying to apply a blend filter to two images.
I've recently updated GPUImage to the latest version. To keep things simple, I've modified the SimpleImageFilter example.
Here is the code:

    UIImage *image1 = [UIImage imageNamed:@"PGSImage_0000.jpg"];
    UIImage *image2 = [UIImage imageNamed:@"PGSImage_0001.jpg"];
    twoinputFilter = [[GPUImageColorBurnBlendFilter alloc] init];
    sourcePicture1 = [[GPUImagePicture alloc] initWithImage:image1];
    sourcePicture2 = [[GPUImagePicture alloc] initWithImage:image2];
    [sourcePicture1 addTarget:twoinputFilter];
    [sourcePicture1 processImage];
    [sourcePicture2 addTarget:twoinputFilter];
    [sourcePicture2 processImage];
    UIImage *image = [twoinputFilter imageFromCurrentFramebuffer];

The returned image is nil. Setting some breakpoints, I can see that the filter fails inside the method - (CGImageRef)newCGImageFromCurrentlyProcessedOutput: the framebufferForOutput is nil. I'm running on the simulator.
I don't understand why it isn't working.


Solution

It turns out I was missing this call, which is described in the documentation for still image processing:

Note that for a manual capture of an image from a filter, you need to set -useNextFrameForImageCapture in order to tell the filter that you'll be needing to capture from it later. By default, GPUImage reuses framebuffers within filters to conserve memory, so if you need to hold on to a filter's framebuffer for manual image capture, you need to let it know ahead of time.

[twoinputFilter useNextFrameForImageCapture];
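
For reference, here is a sketch of the full corrected sequence, assuming the same image names and filter as in the question; the key point is that -useNextFrameForImageCapture must be sent to the filter before the sources are told to processImage:

    UIImage *image1 = [UIImage imageNamed:@"PGSImage_0000.jpg"];
    UIImage *image2 = [UIImage imageNamed:@"PGSImage_0001.jpg"];

    GPUImageColorBurnBlendFilter *twoinputFilter = [[GPUImageColorBurnBlendFilter alloc] init];
    GPUImagePicture *sourcePicture1 = [[GPUImagePicture alloc] initWithImage:image1];
    GPUImagePicture *sourcePicture2 = [[GPUImagePicture alloc] initWithImage:image2];

    // Attach both sources to the two-input blend filter.
    [sourcePicture1 addTarget:twoinputFilter];
    [sourcePicture2 addTarget:twoinputFilter];

    // Tell the filter to hold on to its framebuffer so it can be read back manually.
    [twoinputFilter useNextFrameForImageCapture];

    // Process the sources; the blend renders once both inputs have arrived.
    [sourcePicture1 processImage];
    [sourcePicture2 processImage];

    // With the framebuffer retained, this no longer returns nil.
    UIImage *blendedImage = [twoinputFilter imageFromCurrentFramebuffer];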
Licensed under: CC-BY-SA with attribution