Question

On Android, it is possible to make the camera write its output directly to an OpenGL texture (of type GL_TEXTURE_EXTERNAL_OES), avoiding buffers on the CPU altogether.

Is such a thing possible on iOS?


Solution

The output you get from the camera on iOS is a CMSampleBufferRef with a CVPixelBufferRef inside (see the AVCaptureVideoDataOutput documentation). Since iOS 5, the CoreVideo framework provides CVOpenGLESTextureCache, which lets you create an OpenGL ES texture backed directly by a CVPixelBufferRef, avoiding any copies.
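A minimal sketch of that approach in Swift, assuming your app already has an EAGLContext and an AVCaptureVideoDataOutput delivering BGRA frames (the class and property names here are illustrative, not from any Apple sample):

```swift
import AVFoundation
import CoreVideo
import OpenGLES

// Sketch: wrap camera frames as GL textures with zero CPU copies.
final class CameraTextureProvider {
    private var textureCache: CVOpenGLESTextureCache?

    init?(context: EAGLContext) {
        // Create one cache per EAGLContext and reuse it for every frame.
        guard CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil,
                                           context, nil,
                                           &textureCache) == kCVReturnSuccess else {
            return nil
        }
    }

    // Call this from captureOutput(_:didOutput:from:) for each frame.
    func texture(from sampleBuffer: CMSampleBuffer) -> CVOpenGLESTexture? {
        guard let cache = textureCache,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return nil
        }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)

        var texture: CVOpenGLESTexture?
        // Wraps the pixel buffer's backing store as a GL texture -- no copy.
        // Assumes the capture output is configured for kCVPixelFormatType_32BGRA.
        let status = CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, pixelBuffer, nil,
            GLenum(GL_TEXTURE_2D), GL_RGBA,
            GLsizei(width), GLsizei(height),
            GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE),
            0, &texture)
        guard status == kCVReturnSuccess, let texture = texture else {
            return nil
        }

        glBindTexture(CVOpenGLESTextureGetTarget(texture),
                      CVOpenGLESTextureGetName(texture))
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GL_CLAMP_TO_EDGE)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GL_CLAMP_TO_EDGE)
        return texture
    }
}
```

Keep the returned CVOpenGLESTexture alive until you are done drawing with it, and call CVOpenGLESTextureCacheFlush periodically so the cache can recycle textures.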

Check the RosyWriter sample on Apple's developer website; it's all there.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow