Question

I'm wondering if it's possible to have an instance of an AVCaptureSession and UIImagePicker both accessing the camera simultaneously.

I want to create an app that shows an ambient light meter/indicator as an overlay view of a UIImagePicker when the camera is active. I previously implemented this using UIGetScreenImage(), but Apple is now disallowing use of this private API in favor of AVCaptureSession. In my experimentation, AVCaptureSession seems to become suspended when UIImagePicker displays the camera view. Any ideas? Thanks!


Solution

You shouldn't be using UIImagePicker for this at all. Instead you should be using an AVCaptureSession.

Here are two tutorials that will help you. The first shows how to set up a live camera view and overlay UI elements on top of it. You can find it here.
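To make the approach concrete, here is a minimal sketch of the setup the first tutorial describes: an AVCaptureSession feeding an AVCaptureVideoPreviewLayer, with an ordinary UILabel added on top as the overlay. The class and property names (`CameraViewController`, `lightLabel`) are mine, not from the tutorial, and this assumes a modern iOS SDK (iOS 10+):

```swift
import UIKit
import AVFoundation

// Hypothetical view controller: a live camera preview with a custom
// overlay, replacing UIImagePickerController entirely.
class CameraViewController: UIViewController {
    let session = AVCaptureSession()
    let lightLabel = UILabel()   // the ambient-light indicator overlay

    override func viewDidLoad() {
        super.viewDidLoad()

        session.sessionPreset = .photo
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // The preview layer renders the camera feed; overlays are just
        // ordinary views/layers added on top of it.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)

        lightLabel.frame = CGRect(x: 20, y: 40, width: 200, height: 30)
        lightLabel.textColor = .white
        view.addSubview(lightLabel)

        // On recent iOS versions, start the session off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}
```

For the original ambient-light-meter goal, one common technique (an assumption on my part, not from the tutorials) is to also attach an AVCaptureVideoDataOutput and read the EXIF `BrightnessValue` from each sample buffer's metadata, updating the overlay label from that.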

The second tutorial shows how to capture a still image from that live camera view. You can find it here.
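Still capture from the same session can be sketched like this, assuming iOS 11+ and the running `session` from the previous snippet; `PhotoCapturer` and `snap()` are hypothetical names for illustration:

```swift
import AVFoundation
import UIKit

// Hypothetical helper that adds an AVCapturePhotoOutput to an existing
// session and captures a still image through the delegate callback.
class PhotoCapturer: NSObject, AVCapturePhotoCaptureDelegate {
    let output = AVCapturePhotoOutput()

    func attach(to session: AVCaptureSession) {
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
    }

    func snap() {
        output.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        // Use the still image here (save it, show it, etc.).
        _ = image
    }
}
```

Keep a strong reference to the capturer; the delegate callback arrives asynchronously after `snap()` is called.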

OTHER TIPS

I don't think so; the two cannot access the camera simultaneously.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow