Problem

I'm wondering if it's possible to have an instance of an AVCaptureSession and UIImagePicker both accessing the camera simultaneously.

I want to create an app that shows an ambient light meter/indicator as an overlay view of a UIImagePicker when the camera is active. I previously implemented this using UIGetScreenImage(), but Apple is now disallowing use of this private API in favor of AVCaptureSession. In my experimentation, AVCaptureSession seems to become suspended when UIImagePicker displays the camera view. Any ideas? Thanks!


Solution

You shouldn't be using UIImagePicker for this at all. Instead you should be using an AVCaptureSession.

Here are two tutorials that will help you. The first one shows you how to set up a live camera view and overlay UI elements on top. You can find that here.
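Since the tutorial links don't survive in this copy, here is a rough sketch of that first technique using current AVFoundation/UIKit APIs in Swift (the original question predates Swift, so treat this as a modernized illustration, not the tutorial's code; the class and label names are made up for the example):

```swift
import AVFoundation
import UIKit

// Minimal sketch: a live camera preview with a UIView overlaid on top.
class CameraOverlayViewController: UIViewController {
    let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Attach the default camera as the session's input.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Display the live feed via a preview layer.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)

        // Any UIView added after the preview layer acts as an overlay,
        // e.g. a label that could display the light-meter reading.
        let meterLabel = UILabel(frame: CGRect(x: 20, y: 40, width: 200, height: 40))
        meterLabel.textColor = .white
        view.addSubview(meterLabel)

        // startRunning() blocks, so start the session off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}
```

Because you own the view hierarchy here, the overlay never gets suspended the way an AVCaptureSession does behind UIImagePicker's camera view.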

The second tutorial shows you how to capture an image from that live camera view. You can find that here.
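With that link also missing, here is a sketch of the second technique, capturing a still image from the running session. This uses the current AVCapturePhotoOutput API (the original answer's era would have used the now-deprecated AVCaptureStillImageOutput); the class name is made up for the example:

```swift
import AVFoundation
import UIKit

// Minimal sketch: grab a still image from a running AVCaptureSession.
class PhotoCapturer: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    // Call once while configuring the session, before startRunning().
    func attach(to session: AVCaptureSession) {
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
    }

    // Trigger a capture; the result arrives in the delegate callback below.
    func capture() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        // Use the captured UIImage here (save it, display it, etc.).
        _ = image
    }
}
```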

Other Tips

I don't think they can both access the camera simultaneously.

License: CC-BY-SA with attribution
Not affiliated with StackOverflow