Question

My goal is to write a custom camera view controller that:

  1. Can take photos in all four interface orientations with both the back and, when available, front camera.
  2. Properly rotates and scales the preview "video" as well as the full resolution photo.
  3. Allows a (simple) effect to be applied to BOTH the preview "video" and full resolution photo.

My previous effort is documented in this question. My latest attempt was to modify Apple's GLVideoFrame sample code (from WWDC 2010). However, I have not been able to get the iPhone 4 to display the preview "video" properly when the session preset is AVCaptureSessionPresetPhoto.

Has anyone tried this or know why the example doesn't work with this preset?

Apple's example uses a preset with 640x480 video dimensions and a default texture size of 1280x720. The iPhone 4 back camera delivers only 852x640 when the preset is AVCaptureSessionPresetPhoto.

iOS device camera video/photo dimensions when preset is AVCaptureSessionPresetPhoto:

  • iPhone 4 back: video is 852x640 & photos are 2592x1936
  • iPhone 4 front: video & photos are 640x480
  • iPod Touch 4G back: video & photos are 960x720
  • iPod Touch 4G front: video & photos are 640x480
  • iPhone 3GS: video is 512x384 & photos are 2048x1536
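
For reference, those figures do not need to be hardcoded: the actual frame dimensions can be read from each sample buffer in the video data output callback. A minimal sketch of the delegate method (the logging is only illustrative):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        // These values change with the session preset and the device,
        // so query them per frame instead of assuming 640x480.
        size_t width       = CVPixelBufferGetWidth(pixelBuffer);
        size_t height      = CVPixelBufferGetHeight(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

        NSLog(@"frame: %zux%zu, bytes per row: %zu", width, height, bytesPerRow);

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }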

Update

I got the same garbled video result when switching Brad Larson's ColorTracking example (blog post) to use the AVCaptureSessionPresetPhoto preset.

Solution

The issue is that AVCaptureSessionPresetPhoto is now context-aware and runs at different resolutions depending on whether you are capturing live video frames or still images.

The live preview frames are different for this preset because each pixel row is padded with extra bytes, so the buffer's bytes-per-row is larger than width × 4 (for BGRA). I'm guessing this is some sort of hardware optimization.

In any case, you can see how I solved the problem here:

iOS CVImageBuffer distorted from AVCaptureSessionDataOutput with AVCaptureSessionPresetPhoto
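
To give a sense of the kind of adjustment involved, here is a minimal sketch, assuming a BGRA video data output whose frames are uploaded as an OpenGL ES texture; the variable names are illustrative, and the snippet belongs inside the sample buffer delegate callback:

    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    size_t width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    uint8_t *src       = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);

    if (bytesPerRow == width * 4) {
        // No padding: the buffer can be uploaded directly.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                     0, GL_BGRA, GL_UNSIGNED_BYTE, src);
    } else {
        // Padded rows (as with AVCaptureSessionPresetPhoto): repack each row
        // tightly before uploading, otherwise the texture comes out skewed.
        uint8_t *packed = malloc(width * height * 4);
        for (size_t row = 0; row < height; row++) {
            memcpy(packed + row * width * 4, src + row * bytesPerRow, width * 4);
        }
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                     0, GL_BGRA, GL_UNSIGNED_BYTE, packed);
        free(packed);
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

One alternative to the per-row copy is to treat bytesPerRow / 4 as the texture width and crop away the extra pixels with texture coordinates.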

OTHER TIPS

AVCaptureSessionPresetPhoto is intended for taking still pictures, not for capturing a live feed. You can read about it here: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html

(My belief is that these are effectively two different cameras or sensor configurations, since they behave very differently, and there is a delay of a couple of seconds just to switch between the Photo preset and, say, the 640x480 one.)

You can't even use both presets at the same time, and switching between them is a headache as well; see How to get both the video output and full photo resolution image in AVFoundation Framework.
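
For illustration only, here is a rough sketch of the switch-before-capture approach discussed in that question, assuming a session (self.session) that normally runs at a video preset for the live preview and an AVCaptureStillImageOutput (self.stillImageOutput) already added to it; the names and error handling are simplified:

    - (void)takePhoto
    {
        // Jump to the photo preset only for the still capture; expect a
        // noticeable delay while the session reconfigures itself.
        [self.session beginConfiguration];
        if ([self.session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
            self.session.sessionPreset = AVCaptureSessionPresetPhoto;
        }
        [self.session commitConfiguration];

        AVCaptureConnection *connection =
            [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];

        [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
            completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
                if (imageSampleBuffer != NULL) {
                    NSData *jpegData = [AVCaptureStillImageOutput
                        jpegStillImageNSDataRepresentation:imageSampleBuffer];
                    // ... hand jpegData to the rest of the pipeline ...
                }

                // Drop back to the video-friendly preset for the live preview.
                [self.session beginConfiguration];
                self.session.sessionPreset = AVCaptureSessionPreset640x480;
                [self.session commitConfiguration];
            }];
    }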

HTH, although not what you wanted to hear...

Oded.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow