On Android, it is possible to make the camera write its output directly to an OpenGL texture (of type GL_TEXTURE_EXTERNAL_OES), avoiding buffers on the CPU altogether.

Is such a thing possible on iOS?

Solution

The output you get from the camera on iOS is a CMSampleBufferRef with a CVPixelBufferRef inside (see the documentation). Since iOS 5, the CoreVideo framework provides CVOpenGLESTextureCache, which lets you create an OpenGL ES texture directly from a CVPixelBufferRef, avoiding any copies.
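For illustration, here is a minimal sketch in Swift; it assumes your AVCaptureVideoDataOutput is configured for kCVPixelFormatType_32BGRA and that you already have an EAGLContext for your renderer. The class name CameraTextureProvider is hypothetical:

```swift
import CoreMedia
import CoreVideo
import OpenGLES

final class CameraTextureProvider {
    private var textureCache: CVOpenGLESTextureCache?

    // `context` is assumed to be the EAGLContext your renderer already uses.
    init?(context: EAGLContext) {
        guard CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context,
                                           nil, &textureCache) == kCVReturnSuccess
        else { return nil }
    }

    // Call per frame, e.g. from captureOutput(_:didOutput:from:). The returned
    // CVOpenGLESTexture wraps the pixel buffer's memory; no CPU copy is made.
    func texture(from sampleBuffer: CMSampleBuffer) -> CVOpenGLESTexture? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let cache = textureCache else { return nil }
        var texture: CVOpenGLESTexture?
        let status = CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, pixelBuffer,
            nil,                      // texture attributes
            GLenum(GL_TEXTURE_2D),    // on iOS the target is GL_TEXTURE_2D
            GL_RGBA,                  // internal format
            GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
            GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
            GLenum(GL_BGRA),          // matches kCVPixelFormatType_32BGRA output
            GLenum(GL_UNSIGNED_BYTE),
            0,                        // plane index (single-plane BGRA)
            &texture)
        return status == kCVReturnSuccess ? texture : nil
    }

    // Flush after drawing so recycled pixel buffers can be reused.
    func flush() {
        if let cache = textureCache { CVOpenGLESTextureCacheFlush(cache, 0) }
    }
}
```

Bind the result with glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture)) and draw as usual. Note that, unlike Android, the target here is plain GL_TEXTURE_2D rather than GL_TEXTURE_EXTERNAL_OES.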

Check the RosyWriter sample on Apple's developer website; it demonstrates this entire pipeline.
