Question

I am trying to use the iOS 7 filters in a similar fashion to the iPhone 5 camera app, where I get nine squares previewing how each filter will look when applied to the picture (although the camera app does it in real time).

For some unknown reason it's not working on an iPhone 4, but it works perfectly on an iPhone 5 and 5s.

The code is very simple: I have a view controller that is given an image when initialized, configures the OpenGL filtering in viewDidLoad, and requests the drawing in viewWillAppear.

I am using GLKViews to control the drawing, so I only have to call display on each one to trigger a draw.

- (void)viewDidLoad
{
    [super viewDidLoad];

    glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    context = [CIContext contextWithEAGLContext:glContext];

    self.monoFilterGLKView.context = glContext;
    self.monoFilterGLKView.delegate = self;
    self.monoFilterLabel.text = NSLocalizedString(@"Mono Filter Label", nil);

    self.tonalFilterGLKView.context = glContext;
    self.tonalFilterGLKView.delegate = self;
    self.tonalFilterLabel.text = NSLocalizedString(@"Tonal Filter Label", nil);

    self.noirFilterGLKView.context = glContext;
    self.noirFilterGLKView.delegate = self;
    self.noirFilterLabel.text = NSLocalizedString(@"Noir Filter Label", nil);

    self.fadeFilterGLKView.context = glContext;
    self.fadeFilterGLKView.delegate = self;
    self.fadeFilterLabel.text = NSLocalizedString(@"Fade Filter Label", nil);

    self.noFilterGLKView.context = glContext;
    self.noFilterGLKView.delegate = self;
    self.noFilterLabel.text = NSLocalizedString(@"No Filter Label", nil);

    self.chromeFilterGLKView.context = glContext;
    self.chromeFilterGLKView.delegate = self;
    self.chromeFilterLabel.text = NSLocalizedString(@"Chrome Filter Label", nil);

    self.processFilterGLKView.context = glContext;
    self.processFilterGLKView.delegate = self;
    self.processFilterLabel.text = NSLocalizedString(@"Process Filter Label", nil);

    self.transferFilterGLKView.context = glContext;
    self.transferFilterGLKView.delegate = self;
    self.transferFilterLabel.text = NSLocalizedString(@"Transfer Filter Label", nil);

    self.instantFilterGLKView.context = glContext;
    self.instantFilterGLKView.delegate = self;
    self.instantFilterLabel.text = NSLocalizedString(@"Instant Filter Label", nil);

    selectedFilter = self.noFilterGLKView;
    self.selectedViewBorder.center = selectedFilter.center;

}

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];

    NSLog(@"Filters View will appear");

    image = [CIImage imageWithCGImage:self.imageToFilter.CGImage];

    [self.monoFilterGLKView display];
    [self.tonalFilterGLKView display];
    [self.noirFilterGLKView display];
    [self.fadeFilterGLKView display];
    [self.noFilterGLKView display];
    [self.chromeFilterGLKView display];
    [self.processFilterGLKView display];
    [self.transferFilterGLKView display];
    [self.instantFilterGLKView display];
}

- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    NSLog(@"DRAW");

    CGRect doubleRect = CGRectMake(rect.origin.x, rect.origin.y, rect.size.width*2, rect.size.height*2);

    if ([view isEqual:self.monoFilterGLKView]) {
        glClearColor(0.0, 0.0, 0.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectMono"];
        [filter setValue:image forKey:kCIInputImageKey];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CGRect extent = [result extent];

        [context drawImage:result inRect:doubleRect fromRect:extent];
    }
    if ([view isEqual:self.tonalFilterGLKView]) {
        glClearColor(0.0, 0.0, 1.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTonal"];
        [filter setValue:image forKey:kCIInputImageKey];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CGRect extent = [result extent];

        [context drawImage:result inRect:doubleRect fromRect:extent];
    }
    if ([view isEqual:self.noirFilterGLKView]) {
        glClearColor(0.0, 1.0, 0.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectNoir"];
        [filter setValue:image forKey:kCIInputImageKey];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CGRect extent = [result extent];

        [context drawImage:result inRect:doubleRect fromRect:extent];
    }
    if ([view isEqual:self.fadeFilterGLKView]) {
        glClearColor(0.0, 1.0, 1.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectFade"];
        [filter setValue:image forKey:kCIInputImageKey];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CGRect extent = [result extent];

        [context drawImage:result inRect:doubleRect fromRect:extent];
    }
    if ([view isEqual:self.noFilterGLKView]) {
        glClearColor(1.0, 0.0, 0.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        [context drawImage:image inRect:doubleRect fromRect:image.extent];
    }
    if ([view isEqual:self.chromeFilterGLKView]) {
        glClearColor(1.0, 0.0, 1.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectChrome"];
        [filter setValue:image forKey:kCIInputImageKey];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CGRect extent = [result extent];

        [context drawImage:result inRect:doubleRect fromRect:extent];
    }
    if ([view isEqual:self.processFilterGLKView]) {
        glClearColor(1.0, 1.0, 0.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectProcess"];
        [filter setValue:image forKey:kCIInputImageKey];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CGRect extent = [result extent];

        [context drawImage:result inRect:doubleRect fromRect:extent];
    }
    if ([view isEqual:self.transferFilterGLKView]) {
        glClearColor(1.0, 1.0, 1.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTransfer"];
        [filter setValue:image forKey:kCIInputImageKey];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CGRect extent = [result extent];

        [context drawImage:result inRect:doubleRect fromRect:extent];
    }
    if ([view isEqual:self.instantFilterGLKView]) {
        glClearColor(0.5, 0.5, 0.5, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);

        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"];
        [filter setValue:image forKey:kCIInputImageKey];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CGRect extent = [result extent];

        [context drawImage:result inRect:doubleRect fromRect:extent];
    }

}

But as I said before, it won't work on the iPhone 4. (The colored squares are just there so I can tell which drawing I am supposed to see; they are set simply by changing the glClearColor color.)

I did notice that the camera app on the iPhone 4 does not provide real-time filtering for the camera, but it does show a preview of the filters when applying a filter directly to an image in the camera roll.

Does anyone know what I am doing wrong?

The alternative, if I cannot find the reason for this, is to process the filter output on the CPU when running on the iPhone 4, but I am not sure how slow that would be.

I also tried disabling the filters and drawing the unfiltered image directly, using just:

glClearColor(1.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);

[context drawImage:image inRect:doubleRect fromRect:image.extent];

But nothing appeared either.

When the user selects one of the filters and presses "save", I filter the image on the CPU, so the image is saved perfectly fine on both the iPhone 4 and the iPhone 5.

PS: The two iPhones have different languages set, which is why the text in the screenshots differs.

Screenshot: Filters on iPhone 4

Screenshot: Filters on iPhone 5

Update: The solution I currently have is to resize the image to a thumbnail and use the CPU to create filtered versions of it, which does not take as long as expected. I would still like to know why this cannot be done through an OpenGL context as it can on the iPhone 5.
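For reference, here is a rough sketch of that CPU fallback (the helper name and the software-renderer option are my own choices, not from the original project): it runs a filter on a pre-scaled thumbnail through a software-rendered CIContext and returns a plain UIImage, so each of the nine previews can be an ordinary UIImageView instead of a GLKView.

- (UIImage *)filteredThumbnail:(UIImage *)thumbnail filterName:(NSString *)filterName
{
    // Software renderer: the GPU texture-size limit does not apply here.
    CIContext *cpuContext = [CIContext contextWithOptions:
                                @{kCIContextUseSoftwareRenderer : @YES}];

    CIImage *input = [CIImage imageWithCGImage:thumbnail.CGImage];

    CIFilter *filter = [CIFilter filterWithName:filterName];
    [filter setValue:input forKey:kCIInputImageKey];
    CIImage *output = [filter valueForKey:kCIOutputImageKey];

    // Render on the CPU into a CGImage and wrap it for UIKit.
    CGImageRef cgImage = [cpuContext createCGImage:output fromRect:output.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    return result;
}

Calling it once per filter name (e.g. @"CIPhotoEffectMono") against a thumbnail-sized image is what the workaround above amounts to.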

As explained before, I believe the issue is not in the filters and has nothing to do with them, but in rendering directly into an OpenGL context with this call:

[context drawImage:result inRect:doubleRect fromRect:extent];

But I haven't found any documentation explaining why.


Solution

Have you verified that the images you are processing are not larger than what is returned by -[CIContext inputImageMaximumSize]? This limit varies with the hardware you're running on, and the iPhone 4 uses a different GPU from the iPhone 5 and 5s: the iPhone 4's GPU reportedly caps input textures at 2048×2048, while the newer devices handle 4096×4096, so a full-resolution camera photo can exceed the iPhone 4's limit while still fitting comfortably on the iPhone 5.
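Below is a minimal sketch of that check, assuming the same context and image ivars as in the question; it would run in viewWillAppear: after creating the CIImage and before the display calls. If the source exceeds the context's input limit, it is scaled down with CILanczosScaleTransform first.

CGSize maxSize = [context inputImageMaximumSize];
CGRect extent = [image extent];

// If the CIImage is larger than the GPU-backed context accepts as input,
// downscale it so every drawImage:inRect:fromRect: call gets a legal size.
if (extent.size.width > maxSize.width || extent.size.height > maxSize.height) {
    CGFloat scale = MIN(maxSize.width / extent.size.width,
                        maxSize.height / extent.size.height);

    CIFilter *scaleFilter = [CIFilter filterWithName:@"CILanczosScaleTransform"];
    [scaleFilter setValue:image forKey:kCIInputImageKey];
    [scaleFilter setValue:@(scale) forKey:kCIInputScaleKey];
    [scaleFilter setValue:@(1.0) forKey:kCIInputAspectRatioKey];
    image = [scaleFilter valueForKey:kCIOutputImageKey];
}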

OTHER TIPS

Camera-app filters are not available to iPhone 4 users who upgrade to iOS 7. As with any other iOS release, Apple left some features out of older hardware; Panorama is another example that is not available on the iPhone 4.

So, if they are not available as a feature in the camera app, they cannot be accessed programmatically by developers either.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow