Question

I have an odd problem with captureStillImageAsynchronouslyFromConnection. If I save the image using jpegStillImageNSDataRepresentation while the video is mirrored, the image in the camera roll is rotated 90 degrees clockwise. However, if it's not mirrored, the orientation is fine. I'll post the code below; has anyone else had this problem or know of a fix?

Update: Just ran some tests. The heights and widths (640x480) are fine and reflect the device's orientation. When I take a picture in portrait, it reports UIImageOrientationLeft, and when mirrored, UIImageOrientationLeftMirrored.

Update 2: When I view the saved photo in the camera roll, the preview of the image has the right orientation, as does the image when you swipe between photos, but once the photo is fully loaded, it rotates 90 degrees. Could this be a camera roll problem? (I'm on iOS 4.3.3.)

- (void) captureImageAndSaveToCameraRoll
{
    AVCaptureConnection *stillImageConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo
                                                                        fromConnections:[[self stillImageOutput] connections]];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:[self orientation]];

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        // Report any failure from the asset-library write back to the delegate.
        ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) {
                if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
                    [[self delegate] captureManager:self didFailWithError:error];
                }
            }
        };

        if (imageDataSampleBuffer != NULL) {
            // Convert the sample buffer to JPEG data, decode it into a UIImage,
            // and save it, passing the UIImage's reported orientation through
            // to the assets library.
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

            UIImage *image = [[UIImage alloc] initWithData:imageData];
            [library writeImageToSavedPhotosAlbum:[image CGImage]
                                      orientation:(ALAssetOrientation)[image imageOrientation]
                                  completionBlock:completionBlock];
            [image release];
            [library release];
        }
        else {
            completionBlock(nil, error);
        }

        if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
            [[self delegate] captureManagerStillImageCaptured:self];
        }
    }];
}

Solution

I'm wondering whether the newly created image really has its orientation set the way you're assuming here:

[library writeImageToSavedPhotosAlbum:[image CGImage] orientation: (ALAssetOrientation) [image imageOrientation] completionBlock:completionBlock];

[image imageOrientation] in particular seems like the likely source of the problem, especially given the cast that's required...

You say you are seeing the image orientation somewhere; is it from [image imageOrientation] just after creating the image with:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation: imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];

I'm wondering if you are assuming that this orientation metadata is carried in the imageData returned from AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:, when that may in fact only contain the image pixel data itself.
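One quick way to narrow this down is to log the orientation right after decoding the JPEG data and compare it with the value the cast in your save call produces. A minimal sketch of that check, using nothing beyond what your handler already has plus the log:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[[UIImage alloc] initWithData:imageData] autorelease];

// Compare what UIImage reports with what the cast to ALAssetOrientation produces.
NSLog(@"UIImageOrientation = %d, cast to ALAssetOrientation = %d",
      (int)[image imageOrientation],
      (int)(ALAssetOrientation)[image imageOrientation]);

If the logged value doesn't match what the camera roll eventually shows, the orientation is being lost or misread somewhere between the capture and the save.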

OTHER TIPS

According to some reports, the iPhone's photos application back in 2009 didn't support any of the mirrored orientations at all; it's certainly possible that they partially fixed this but still have bugs in some cases (I know, for example, that UIImageView doesn't handle contentStretch correctly in some cases). This should be easy enough to test: just load the sample images from that blog into the camera roll and see what happens.

Create the UIImage category described in "iOS UIImagePickerController result image orientation after upload".

Call this method on the image before saving it to your library.
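The idea behind that category is to redraw the image so the pixels themselves are rotated or flipped and the orientation flag ends up as "up", which sidesteps the ALAssetOrientation cast entirely. A minimal sketch of that approach (the category and method names here are illustrative, not taken from the linked answer):

@interface UIImage (OrientationNormalization)
- (UIImage *)imageWithUpOrientation;
@end

@implementation UIImage (OrientationNormalization)
- (UIImage *)imageWithUpOrientation
{
    if (self.imageOrientation == UIImageOrientationUp)
        return self;

    // -drawInRect: honors imageOrientation, so the redrawn bitmap is
    // physically rotated/flipped and its orientation flag is simply "up".
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:CGRectMake(0.0f, 0.0f, self.size.width, self.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
@end

Used in the capture handler above, the save then no longer depends on the cast at all:

UIImage *image = [[[UIImage alloc] initWithData:imageData] autorelease];
UIImage *upright = [image imageWithUpOrientation];
[library writeImageToSavedPhotosAlbum:[upright CGImage]
                          orientation:ALAssetOrientationUp
                      completionBlock:completionBlock];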

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow