Question

I am trying to capture a portion of the screen so I can post the image on social media.

I am using the following code to capture the screen.

- (UIImage *) imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];

    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();

    UIGraphicsEndImageContext();
    return img;
}

The above code works perfectly for capturing the screen.

Problem:

My UIView contains a GPUImageView showing the filtered image. When I try to capture the screen using the above code, the portion occupied by the GPUImageView does not contain the filtered image.

I am using GPUImageSwirlFilter with a static image (no camera). I have also tried

UIImage *outImage = [swirlFilter imageFromCurrentFramebuffer];

but it does not return an image.

Note: the following code works and gives the correct swirl effect on screen, but I want the same image in a UIImage object.

 dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{

        GPUImageSwirlFilter *swirlFilter = [[GPUImageSwirlFilter alloc] init];
        swirlLevel = 4;
        [swirlFilter setAngle:(float)swirlLevel/10];
        UIImage *inputImage = [UIImage imageNamed:gi.wordImage];
        GPUImagePicture  *swirlSourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage];
        inputImage = nil;
        [swirlSourcePicture addTarget:swirlFilter];
        dispatch_async(dispatch_get_main_queue(), ^{
            [swirlFilter addTarget:imgSwirl];
            [swirlSourcePicture processImage];
            // This works perfectly and I have the filtered image in my imgSwirl.
            // But I want the filtered image in a UIImage so I can use it
            // elsewhere, e.g. for posting on social media.
            sharingImage = [swirlFilter imageFromCurrentFramebuffer];  // This also returns nothing.
        });
    });

1) Am I doing something wrong with GPUImage's imageFromCurrentFramebuffer?

2) And why does the screen capture code not include the GPUImageView portion in the output image?

3) How do I get the filtered image into a UIImage?


Solution

First, -renderInContext: won't work with a GPUImageView, because a GPUImageView renders using OpenGL ES. -renderInContext: does not capture from CAEAGLLayers, which are used to back views that present OpenGL ES content.
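If you still want a snapshot of the whole view hierarchy rather than just the filter output, one option (not part of this answer) is UIView's -drawViewHierarchyInRect:afterScreenUpdates: snapshot API, available on iOS 7 and later, in place of -renderInContext:. Whether it actually picks up the OpenGL ES content can vary by iOS version and device, so the filter-readback approach described below is the more reliable route. A minimal sketch:

    - (UIImage *)snapshotOfView:(UIView *)view
    {
        UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
        // Unlike -renderInContext:, this goes through the snapshot machinery
        // and may include GL-backed subviews on iOS 7 and later.
        [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
        UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return img;
    }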

Second, you're probably getting a nil image in the latter code because you've forgotten to set -useNextFrameForImageCapture on your filter before triggering -processImage. Without that, your filter won't hang on to its backing framebuffer long enough to capture an image from it. This is due to a recent change in the way that framebuffers are handled in memory (although this change did not seem to get communicated very well).
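A minimal sketch of the capture portion of your code with that call added, assuming the same swirlFilter, swirlSourcePicture, and imgSwirl objects as above; the only new line is -useNextFrameForImageCapture:

    [swirlSourcePicture addTarget:swirlFilter];
    [swirlFilter addTarget:imgSwirl];

    // Tell the filter to hold on to its framebuffer so it can be read back.
    [swirlFilter useNextFrameForImageCapture];
    [swirlSourcePicture processImage];

    // Now this returns the filtered image instead of nil.
    UIImage *sharingImage = [swirlFilter imageFromCurrentFramebuffer];

You can keep these calls inside the same dispatch blocks you already use; the nil result comes only from the missing -useNextFrameForImageCapture call.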

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow