Question

I need to obtain a UIImage from the uncompressed image data in a CMSampleBufferRef. I'm using this code:

[captureStillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                              completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    // that famous function from Apple docs found on a lot of websites
    // does NOT work for still images
    UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer];
}];

The imageFromSampleBuffer: function is the one from Apple's Technical Q&A QA1702: http://developer.apple.com/library/ios/#qa/qa1702/_index.html

But it does not work properly. :(

There is a jpegStillImageNSDataRepresentation: method, but it gives compressed data (well, because it's JPEG).
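For comparison, the compressed path I'm trying to avoid looks roughly like this (assuming the output is configured for JPEG, which is the default):

// Compressed path: only valid when outputSettings request AVVideoCodecJPEG.
NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *compressedImage = [UIImage imageWithData:jpegData];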

How can I get a UIImage created from the most raw, non-compressed data after capturing a still image?

Maybe I should specify some settings for the video output? I'm currently using these:

captureStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
captureStillImageOutput.outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

I've noticed that the output has a default value for AVVideoCodecKey, which is AVVideoCodecJPEG. Can it be avoided in any way, or does it even matter when capturing a still image?

I found something related here: Raw image data from camera like "645 PRO", but I need just a UIImage, without using OpenCV, OpenGL ES, or other third-party libraries.


Solution

The method imageFromSampleBuffer: does work; in fact, I'm using a modified version of it. But, if I remember correctly, you need to set the outputSettings right: the key should be kCVPixelBufferPixelFormatTypeKey and the value kCVPixelFormatType_32BGRA.

So for example:

NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary *outputSettings = [NSDictionary dictionaryWithObject:value forKey:key];

[newStillImageOutput setOutputSettings:outputSettings];

EDIT

I am using those settings to take still images, not video. Is your sessionPreset set to AVCaptureSessionPresetPhoto? There may be problems with that:

AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
[newCaptureSession setSessionPreset:AVCaptureSessionPresetPhoto];
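In case the rest of the wiring matters, here is a minimal sketch of how I hook everything up; the camera and cameraInput names are just illustrative, not from your code:

// Minimal sketch; `camera` and `cameraInput` are illustrative names.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *inputError = nil;
AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera
                                                                          error:&inputError];

if (cameraInput && [newCaptureSession canAddInput:cameraInput]) {
    [newCaptureSession addInput:cameraInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}
[newCaptureSession startRunning];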

EDIT 2

The part about converting it to a UIImage is identical to the one from the documentation. That's why I was asking for other origins of the problem, but I guess that was just grasping at straws. There is another way I know of, but that requires OpenCV.

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
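One detail that may matter: kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst describes a BGRA memory layout, so this conversion only makes sense when the output settings request kCVPixelFormatType_32BGRA, as above.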

I guess that is of no help to you, sorry. I don't know enough to think of other origins for your problem.

OTHER TIPS

Here's a more efficient way:

UIImage *image = [UIImage imageWithData:[self imageToBuffer:sampleBuffer]];

- (NSData *) imageToBuffer:(CMSampleBufferRef)source {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return data;
}
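
A caveat: +[UIImage imageWithData:] expects data in an encoded image format such as JPEG or PNG, so this shortcut only yields a valid UIImage when the sample buffer already contains encoded bytes. For a raw kCVPixelFormatType_32BGRA buffer it will return nil, and the CGBitmapContext approach above is still required.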