“CGBitmapContextCreate: invalid data bytes/row” … why is camera+ filtering messing up my image cropping routine?

StackOverflow https://stackoverflow.com/questions/11907666

Question

Original image: [image] Filtered image: [image]

I am trying to crop UIImages (photos in a phone's camera roll) into squares. Here is part of the code I am using, where 'image' is the image that is being cropped:

if( image.size.height > image.size.width )
{
    dimension = image.size.width;
    imageRef = CGImageCreateWithImageInRect([image CGImage], CGRectMake((image.size.height-dimension)/2, 0, dimension, dimension));

If I am using the original image, it looks like this at this point: [image]

Which is fine and what I expect - I have a rotation algorithm not shown here that sorts this out.

If I'm using the filtered image, it looks like this: [image]

...not square cropped, but weirdly zoomed in instead. So this seems to be where the problem lies, and I don't know why these filtered images are behaving differently.

}
else
{
    dimension = image.size.height;
    imageRef = CGImageCreateWithImageInRect([image CGImage], CGRectMake((image.size.width-dimension)/2, 0, dimension, dimension));
}

CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
CGColorSpaceRef colorSpaceInfo = CGImageGetColorSpace(imageRef);
CGContextRef bitmap;

bitmap = CGBitmapContextCreate(NULL, dimension, dimension, CGImageGetBitsPerComponent(imageRef), CGImageGetBytesPerRow(imageRef), colorSpaceInfo, bitmapInfo);

My problem is that at that last line, CGBitmapContextCreate, I sometimes get the following error:

<Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 7744 for 8 integer bits/component, 3 components, kCGImageAlphaNoneSkipLast.

What's odd is that this usually doesn't happen. So far, I've only encountered this error when the original image has height greater than width and has been filtered by another app called Camera+. The exact same photo before filtering causes no problems, and a filtered landscape photo also seems to be fine.

Can anyone guide me here or help me explain what's actually going on? I understand enough from the error message to know that if I replace CGImageGetBytesPerRow(imageRef) with some arbitrary number higher than 7744, the error no longer happens. But I don't understand enough about this CGImage stuff to know what effect that actually has, and it doesn't seem like much of a real solution. This code is based on other cropping examples I've seen on the web, so my understanding of these bitmap functions is limited.

Any thoughts would be hugely appreciated!

EDIT

I found this question on SO: In this CGBitmapContextCreate, why is bytesPerRow 0? and it prompted me to try setting the bytesPerRow parameter to 0. That eliminates the error, but my cropping routine still misbehaves in the same situations where the error occurred before. This might take a special person to answer, but does anyone know enough about image filtering to guess why portrait-oriented, Camera+ filtered photos are being treated differently by this code? I've updated the title since the question has changed slightly.

EDIT2

I've added example images into the code above, and in the end, after any necessary rotating, the final cropped images look like this:

with original image: [image] - perfect!

with filtered image: [image] - terrible!

The code that's used to create these final, supposedly cropped images is this:

CGContextDrawImage(bitmap, CGRectMake(0, 0, dimension, dimension), imageRef);
CGImageRef ref = CGBitmapContextCreateImage(bitmap);
image = [UIImage imageWithCGImage:ref];
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
// These are all Create-rule objects, so release them to avoid leaks:
CGContextRelease(bitmap);
CGImageRelease(ref);
CGImageRelease(imageRef);

Solution

In short: Camera+ filtered images, for some reason, come in with a different imageOrientation value than the original images. Somehow, this fact causes a line like this:

imageRef = CGImageCreateWithImageInRect([image CGImage], CGRectMake((image.size.height-dimension)/2, 0, dimension, dimension));

to behave differently depending on the imageOrientation of image. The original image's orientation was either right or left, so this line rotated it onto its side (which is why I was cropping with an x offset for both portrait and landscape sizes). The filtered image's orientation, however, is up. Because of this, I wasn't getting the rotation I expected, and so the filtered images were getting stretched. To solve this, I check the orientation of the image before calling CGImageCreateWithImageInRect, and if it is portrait-sized but has an up orientation, I crop with a y offset instead of x (like the line of code David H mentions below).

My guess is that [image CGImage] returns the underlying bitmap in its stored orientation: if the orientation is right, the stored pixels are effectively rotated 90 degrees counter-clockwise relative to how the image displays, but if the orientation is up, they aren't rotated at all. I still don't understand why filtered images end up with a different orientation than their originals; I assume it is just some sort of side effect of Camera+'s filtering code. All this orientation stuff could stand to be a lot simpler, but this seems to be the solution for now.

Other tips

A few comments:

1) you want to round the offset value when you divide by 2, so as not to land on a fractional pixel boundary (use roundf())

2) you do not handle the case of both dimensions the same

3) in the first create, you are setting the x offset, not the y - use this modified line:

imageRef = CGImageCreateWithImageInRect([image CGImage], CGRectMake(0, (image.size.height-dimension)/2, dimension, dimension));
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow