Question

Hi, I want to set up an AV capture session to capture images at a specific resolution (and, if possible, at a specific quality) using the iPhone camera. Here's the code that sets up the AV session:

// Create and configure a capture session and start it running
- (void)setupCaptureSession 
{
    NSError *error = nil;

    // Create the session
    self.captureSession = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    NSArray *cameras=[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *device;
    if ([UserDefaults camera]==UIImagePickerControllerCameraDeviceFront)
    {
        device =[cameras objectAtIndex:1];
    }
    else
    {
        device = [cameras objectAtIndex:0];
    };

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input)
    {
        NSLog(@"PANIC: no media input");
    }
    [captureSession addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [captureSession addOutput:output];
    NSLog(@"connections: %@", output.connections);

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];


    // If you wish to cap the frame rate to a known value, such as 15 fps, set 
    // minFrameDuration.


    // Assign session to an ivar.
    [self setSession:captureSession];
    [self.captureSession startRunning];
}

and setSession:

-(void)setSession:(AVCaptureSession *)session
{
    NSLog(@"setting session...");
    self.captureSession=session;
    NSLog(@"setting camera view");
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    //UIView *aView = self.view;
    CGRect videoRect = CGRectMake(20.0, 20.0, 280.0, 255.0);
    previewLayer.frame = videoRect; // Position the preview layer within the view.
    [previewLayer setBackgroundColor:[[UIColor grayColor] CGColor]];
    [self.view.layer addSublayer:previewLayer];
    //[aView.layer addSublayer:previewLayer];
}

and the output methods:

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
   fromConnection:(AVCaptureConnection *)connection
{ 
    //NSLog(@"captureOutput: didOutputSampleBufferFromConnection");

    // Create a UIImage from the sample buffer data
    self.currentImage = [self imageFromSampleBuffer:sampleBuffer];

    //< Add your code here that uses the image >
}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
{
    //NSLog(@"imageFromSampleBuffer: called");
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);


    // Free up the context and color space
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

Everything is quite standard. But where and what should I change to specify the resolution of the captured image and its quality? Please help.


Solution

Refer to the Capturing Still Images section of Apple's AV Foundation Programming Guide for the sizes you'll get when you set one preset or another.

The parameter you should change is captureSession.sessionPreset.
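For example (a sketch; the exact presets available depend on the device, so it's worth checking with -canSetSessionPreset: before assigning):

```objectivec
// Pick the highest-resolution preset the device supports.
// AVCaptureSessionPreset1280x720 and AVCaptureSessionPreset640x480 are
// standard AVFoundation constants; availability varies by hardware.
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
} else if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    self.captureSession.sessionPreset = AVCaptureSessionPreset640x480;
}
```

With a video data output, the sample buffers delivered to your delegate will have the dimensions implied by the chosen preset.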

OTHER TIPS

Try something like this, where cx and cy are your custom width and height:

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:cx], AVVideoWidthKey,
                               [NSNumber numberWithInt:cy], AVVideoHeightKey,
                               nil];
_videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
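Note that AVAssetWriterInput applies when you are writing frames out to a movie file; it does not change what the session's video data output delivers. If by "quality" you mean the compression of the captured still, one option (a sketch, using the currentImage property from the question's delegate code) is to compress the UIImage yourself:

```objectivec
// 0.8f is an assumed compression quality
// (0.0 = maximum compression, 1.0 = best quality).
NSData *jpegData = UIImageJPEGRepresentation(self.currentImage, 0.8f);
```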
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow