Question

I am currently using three of the GPUImage supplied filters (Amatorka, Miss Etikate, and Soft Elegance), and would ideally like a user to be able to apply the filters to their photo as soon as it's taken (for example, the user could swipe left or right on the photo immediately afterwards to see how it looks with each filter applied).

My current problem is that, from start to finish, the three filters take more than 1.5 seconds to complete processing (on an iPhone 5). I have tried to speed up the process by storing the filters as strong properties and instantiating them in viewDidLoad, but all this produces is memory warnings and the app crashes. I was wondering if there is a good workaround for "pre-populating" the filters with the taken image so that they can be applied quickly without the user having to wait, or if this is simply not how they are meant to be used.

Help is much appreciated. I have pasted below a sample method that I use to process the Amatorka filter:

- (void)processAmatorkaFilter
{
    // Serial background queue keeps the filter work off the main thread
    dispatch_queue_t backgroundQueue = dispatch_queue_create("queue1", DISPATCH_QUEUE_SERIAL);

    dispatch_async(backgroundQueue, ^{
        //do filter work
        UIImage *imageShown = self.totalOriginalImage;
        GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:imageShown];
        GPUImageAmatorkaFilter *amoFilter = [[GPUImageAmatorkaFilter alloc] init];

        [stillImageSource addTarget:amoFilter];
        [amoFilter useNextFrameForImageCapture]; // must precede -processImage so the output framebuffer is held for capture
        [stillImageSource processImage];

        UIImage *currentFilteredVideoFrame = [amoFilter imageFromCurrentFramebuffer];
        UIImage *revampedImage = [self orientationAdjustment:currentFilteredVideoFrame];

        // For front-camera shots, re-apply the mirrored orientation
        if (isFrontFacing){
            revampedImage = [UIImage imageWithCGImage:revampedImage.CGImage scale:revampedImage.scale orientation:UIImageOrientationLeftMirrored];
        }

        dispatch_async(dispatch_get_main_queue(), ^{
            [self.filteredImageArray addObject:revampedImage];
            NSLog(@"\n\nDone Amatorka\n\n");
        });
    });
}

And this is how I allow the filters to be applied:

- (void)handleLeftSwipe:(UIGestureRecognizer*)recognizer {
    NSLog(@"Swiped left");

    if (rotatingNumber == 3){ // 3 filters plus the original image, so indices wrap at 3
        rotatingNumber = 0;
    } else {
        rotatingNumber++;
    }

    UIImage *swipedImage = [self.filteredImageArray objectAtIndex:rotatingNumber];
    self.imageView.image = swipedImage;
}

Solution

The Amatorka, Miss Etikate and Soft Elegance filters are all subclasses of GPUImageLookupFilter. The subclassed lookup filters are a little different from others, in that they use an internal instance of GPUImagePicture to pull in the lookup table used for those filters. Initializing and uploading these lookup images can take a little longer on the first instantiation of a lookup like this.

One way you can accelerate this process without having to hang on to filters (although you should be able to do that without the memory consequences you describe, using my latest cached framebuffer optimizations) is to manually replicate these lookups.

If you look inside those lookup filter subclasses, you'll see the images they use for the lookups ("lookup_miss_etikate.png", etc.). Manually create a GPUImagePicture instance from each of these images and hold on to that. When you need to create a lookup filter of a particular type, simply add the lookup image at the second input position for the lookup filter:

[lookupImage addTarget:lookupFilter atTextureLocation:1];

and you will have recreated that particular lookup filter subclass. It will behave just as the subclass would, only you avoid having to create and upload the lookup image each time.

When done, remove the lookup filter as the target for your lookup image and dispose of the filter as needed.
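
To make that concrete, here is a minimal sketch of the full wiring for the Amatorka lookup. It assumes the lookup image that filter uses ("lookup_amatorka.png"); lookupImage is a hypothetical strong property that keeps the cached GPUImagePicture alive between photos, and photo stands for the captured image:

// Created once (e.g. in viewDidLoad) and retained by a strong property,
// so the lookup texture is decoded and uploaded to the GPU only once
UIImage *lookupSourceImage = [UIImage imageNamed:@"lookup_amatorka.png"];
self.lookupImage = [[GPUImagePicture alloc] initWithImage:lookupSourceImage];

// For each captured photo:
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:photo];
GPUImageLookupFilter *lookupFilter = [[GPUImageLookupFilter alloc] init];

[stillImageSource addTarget:lookupFilter];                       // input 0: the photo
[self.lookupImage addTarget:lookupFilter atTextureLocation:1];   // input 1: the lookup table

[lookupFilter useNextFrameForImageCapture];
[stillImageSource processImage];
[self.lookupImage processImage]; // both inputs must process before capturing

UIImage *filteredImage = [lookupFilter imageFromCurrentFramebuffer];

// Detach the filter so the cached lookup picture can be reused next time
[self.lookupImage removeTarget:lookupFilter];

Because the same GPUImagePicture is reused for every photo, the expensive step (decoding and uploading the lookup image) happens only once, which should bring the per-photo filtering time down considerably.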
