Question

I am creating an app that converts an image to a binary image. For that I am using the GPUImage framework. First it converts the image to grayscale, then changes the contrast, and then binarizes it.

When I use the grayscale and contrast filters, it generates memory warnings, and sometimes if I try to convert multiple images (let's say 10) one by one, the app crashes.

Here is my code:

    - (UIImage *) doBinarize:(UIImage *)sourceImage
    {
        UIImage * grayScaledImg = [self grayImage:sourceImage];
        grayScaledImg = [self contrastImage:grayScaledImg];

        GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:grayScaledImg];
        GPUImageAdaptiveThresholdFilter *stillImageFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];
        stillImageFilter.blurRadiusInPixels = 8.0;

        [imageSource addTarget:stillImageFilter];
        [imageSource processImage];
        UIImage *retImage = [stillImageFilter imageByFilteringImage:grayScaledImg];

        UIImage * aretImage = [self sharpenImage:retImage];

        [imageSource removeAllTargets];

        return aretImage;
    }

    - (UIImage *) grayImage :(UIImage *)inputImage
    {
        GPUImageGrayscaleFilter *selectedFilter = [[GPUImageGrayscaleFilter alloc] init];
        UIImage *filteredImage = [selectedFilter imageByFilteringImage:inputImage];
        return filteredImage;
    }

    - (UIImage *) sharpenImage :(UIImage *)inputImage
    {
        GPUImageSharpenFilter *sharpenFilter = [[GPUImageSharpenFilter alloc] init];
        [sharpenFilter setSharpness:10];
        UIImage *quickFilteredImage = [sharpenFilter imageByFilteringImage: inputImage];
        return quickFilteredImage;
    }

    - (UIImage *) contrastImage :(UIImage *)inputImage
    {
        GPUImageContrastFilter *contrastfilter = [[GPUImageContrastFilter alloc] init];
        [contrastfilter setContrast:3];
        UIImage *ima = [contrastfilter imageByFilteringImage:inputImage];
        return ima;
    }

If I remove the grayscale and contrast code, the memory warning goes away, so the problem is in that code.


Solution

First, you're doing a lot of unnecessary work there. The adaptive threshold filter (along with all other edge detection or thresholding filters) automatically converts its input into greyscale, so there's no need for that.

You shouldn't be converting to and from UIImages, since each pass through one requires expensive Core Graphics access on the CPU. Also, you're going to build up a lot of huge temporary UIImages in memory, which could be the cause of your memory-related crashes if these are accumulated in a loop.

Instead, take your input image and chain it through your various filters in one pass:

    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:sourceImage];
    GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
    [contrastFilter setContrast:3];
    GPUImageAdaptiveThresholdFilter *stillImageFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];
    stillImageFilter.blurRadiusInPixels = 8.0;
    GPUImageSharpenFilter *sharpenFilter = [[GPUImageSharpenFilter alloc] init];
    [sharpenFilter setSharpness:10];

    // Chain the filters so intermediate results never leave the GPU.
    [imageSource addTarget:contrastFilter];
    [contrastFilter addTarget:stillImageFilter];
    [stillImageFilter addTarget:sharpenFilter];

    // Extract a UIImage only once, from the last filter in the chain.
    [sharpenFilter useNextFrameForImageCapture];
    [imageSource processImage];
    UIImage *outputImage = [sharpenFilter imageFromCurrentFramebuffer];

This will cause your image to stay on the GPU up until the last step, and with the new framebuffer caching mechanism in the framework, it will limit the memory usage of this processing.
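If you want a drop-in replacement for your existing doBinarize: method, here is a minimal sketch that simply packages the chain above into a method and then calls it in a loop. The imagesToConvert array in the usage loop is a placeholder for wherever your source images come from; wrapping each iteration in an @autoreleasepool block helps the temporary UIImage from each conversion get released promptly, which matters when you convert many images in a row.

    - (UIImage *)doBinarize:(UIImage *)sourceImage
    {
        GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:sourceImage];

        GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
        [contrastFilter setContrast:3];

        GPUImageAdaptiveThresholdFilter *thresholdFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];
        thresholdFilter.blurRadiusInPixels = 8.0;

        GPUImageSharpenFilter *sharpenFilter = [[GPUImageSharpenFilter alloc] init];
        [sharpenFilter setSharpness:10];

        // Single chain: contrast -> adaptive threshold -> sharpen, all on the GPU.
        [imageSource addTarget:contrastFilter];
        [contrastFilter addTarget:thresholdFilter];
        [thresholdFilter addTarget:sharpenFilter];

        [sharpenFilter useNextFrameForImageCapture];
        [imageSource processImage];
        return [sharpenFilter imageFromCurrentFramebuffer];
    }

    // Somewhere in the method that drives the conversion:
    for (UIImage *image in imagesToConvert) {
        @autoreleasepool {
            UIImage *binarized = [self doBinarize:image];
            // ... save or display binarized here ...
        }
    }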
