Question

I am using OpenGL on iOS 7 to render the front camera's video capture to a UIView on the iPhone display (the same iPhone 5). I am using AVCaptureSessionPreset640x480, passing it to the AVCaptureSession method:

[captureSession setSessionPreset:AVCaptureSessionPreset640x480];

However, the rendered video appears to be at a lower resolution than the one set above; it looks like AVCaptureSessionPreset352x288. In fact, whichever of these constants I pass makes no difference; the resolution stays the same (a minimal supported-preset check is sketched after the list below):

NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
NSString *const AVCaptureSessionPresetInputPriority;
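
Since the session rejects presets the current device/input cannot deliver, a minimal guard, assuming captureSession is the session created above, can confirm the preset is accepted before applying it:

// Sketch: confirm the preset is supported before applying it.
// Assumes "captureSession" is the AVCaptureSession from the question.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    [captureSession setSessionPreset:AVCaptureSessionPreset640x480];
} else {
    NSLog(@"AVCaptureSessionPreset640x480 is not supported here");
}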

How can I check what resolution the camera is actually capturing at?

Thanks


Solution

Read the dimensions of the buffer being captured, as follows (for AVCaptureSessionPresetPhoto you would, of course, need to capture a still image rather than read video frames):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Grab the pixel buffer backing this video frame.
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Locking is only strictly required when reading the pixel data itself,
    // but it is harmless here.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t width  = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // "width" and "height" now hold the actual capture dimensions.
}
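
If you would rather not wait for a frame callback, iOS 7 also exposes the device's active format. As a rough sketch, assuming videoDevice is the front-camera AVCaptureDevice attached to your session, you can read the dimensions from its format description:

// Sketch: read the current capture resolution from the device itself.
// Assumes "videoDevice" is the AVCaptureDevice feeding the session.
CMVideoDimensions dims =
    CMVideoFormatDescriptionGetDimensions(videoDevice.activeFormat.formatDescription);
NSLog(@"Capturing at %dx%d", dims.width, dims.height);

Either way, the numbers reported are what the session is actually delivering, regardless of which preset string was requested.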