Question

I am using an API which only gives me the integer id of the texture object, and I need to pass that texture's data to AVAssetWriter to create the video.

I know how to create CVOpenGLESTexture object from pixel buffer (CVPixelBufferRef), but in my case I have to somehow copy the data of a texture of which only the id is available.

In other words, I need to copy an opengl texture to my pixelbuffer-based texture object. Is it possible? If yes then how?

In my sample code I have something like:

    void encodeFrame(GLuint textureOb)
    {
        CVPixelBufferPoolCreatePixelBuffer(NULL, [assetWriterPixelBufferAdaptor pixelBufferPool], &pixelBuffer[0]);

        CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, pixelBuffer[0],
                                                     NULL, // texture attributes
                                                     GL_TEXTURE_2D,
                                                     GL_RGBA, // opengl format
                                                     (int)FRAME_WIDTH,
                                                     (int)FRAME_HEIGHT,
                                                     GL_BGRA, // native iOS format
                                                     GL_UNSIGNED_BYTE,
                                                     0,
                                                     &renderTexture[0]);

        CVPixelBufferLockBaseAddress(pixelBuffer[0], 0);

        // Creation of textureOb is not under my control.
        // All I have is the id of the texture.
        // Here the data of textureOb somehow needs to be appended as a video frame,
        // either by copying the data into pixelBuffer or by passing textureOb to the adaptor.

        [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer[0] withPresentationTime:presentationTime];
    }

Thanks for tips and answers.

P.S. glGetTexImage isn't available on iOS.

Update:

@Dr. Larson, I can't set the texture ID for the API. The third-party API creates its own texture object; I can't make it use one I've created.

After going through the answers what I understood is that I need to:

1- Attach the pixelbuffer-associated texture object as the color attachment of a texFBO. Then for each frame:

2- Bind the texture obtained from API

3- Bind texFBO and call drawElements

What am I doing wrong in this code?

P.S. I'm not familiar with shaders yet, so it is difficult for me to make use of them right now.

Update 2: With the help of Brad Larson's answer, using the correct shaders solved the problem. I had to use shaders, which are an essential requirement of OpenGL ES 2.0.


Solution

For reading back data from OpenGL ES on iOS, you basically have two routes: using glReadPixels(), or using the texture caches (iOS 5.0+ only).

The fact that you just have a texture ID and access to nothing else is a little odd, and limits your choices here. If you have no way of setting what texture to use in this third-party API, you're going to need to re-render that texture to an offscreen framebuffer to extract the pixels for it either using glReadPixels() or the texture caches. To do this, you'd use an FBO sized to the same dimensions as your texture, a simple quad (two triangles making up a rectangle), and a passthrough shader that will just display each texel of your texture in the output framebuffer.
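As an illustration of that passthrough setup (the attribute and uniform names here are my own, not from any particular framework), the shader pair and the quad geometry can be as small as:

```c
// Passthrough vertex shader: forward position and texture coordinate unchanged.
static const char *kVertexShader =
    "attribute vec4 position;\n"
    "attribute vec2 inputTextureCoordinate;\n"
    "varying vec2 textureCoordinate;\n"
    "void main() {\n"
    "    gl_Position = position;\n"
    "    textureCoordinate = inputTextureCoordinate;\n"
    "}\n";

// Passthrough fragment shader: emit each texel of the input texture as-is.
static const char *kFragmentShader =
    "varying highp vec2 textureCoordinate;\n"
    "uniform sampler2D inputImageTexture;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(inputImageTexture, textureCoordinate);\n"
    "}\n";

// Full-screen quad as a triangle strip (two triangles covering the FBO).
static const GLfloat kQuadVertices[]  = { -1.0f, -1.0f,  1.0f, -1.0f,  -1.0f, 1.0f,  1.0f, 1.0f };
static const GLfloat kQuadTexCoords[] = {  0.0f,  0.0f,  1.0f,  0.0f,   0.0f, 1.0f,  1.0f, 1.0f };
```

Compiled and linked into a program in the usual GL ES 2.0 way, this simply stamps the source texture across the whole render target.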

At that point, you can just use glReadPixels() to pull your bytes back into the internal byte array of your CVPixelBufferRef, or preferably use the texture caches to eliminate the need for that read. I describe how to set up the caching for that approach in this answer, as well as how to feed that into an AVAssetWriter. You'll need to set your offscreen FBO to use the CVPixelBufferRef's associated texture as a render target for this to work.
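Put together, a per-frame sequence might look roughly like this. This is a sketch only: `texFBO`, `passthroughProgram`, and the quad attributes are assumed to be set up already, and `renderTexture[0]` is the CVOpenGLESTextureRef created from the pixel buffer via the texture cache:

```c
// Render the third-party texture into the FBO backed by the pixel buffer's texture.
glBindFramebuffer(GL_FRAMEBUFFER, texFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(renderTexture[0]), 0);
glViewport(0, 0, FRAME_WIDTH, FRAME_HEIGHT);

glUseProgram(passthroughProgram);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureOb);   // the ID handed to us by the API
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glFinish();                                // make sure rendering has completed

// Texture-cache route: the pixels are already in pixelBuffer[0]; just append it.
// glReadPixels() route instead: lock the buffer and read straight into it.
CVPixelBufferLockBaseAddress(pixelBuffer[0], 0);
glReadPixels(0, 0, FRAME_WIDTH, FRAME_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddress(pixelBuffer[0]));
CVPixelBufferUnlockBaseAddress(pixelBuffer[0], 0);
```

Note that CVPixelBufferGetBytesPerRow() may report a row stride wider than FRAME_WIDTH * 4; if it does, reading directly into the base address will skew the image and you'll need a row-by-row copy instead.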

However, if you have the means of setting what ID to use for this rendered texture, you can avoid having to re-render it to grab its pixel values. Set up the texture caching like I describe in the above-linked answer and pass the texture ID for that pixel buffer into the third-party API you're using. It will then render into the texture that's associated with the pixel buffer, and you can record from that directly. This is what I use to accelerate the recording of video from OpenGL ES in my GPUImage framework (with the glReadPixels() approach as a fallback for iOS 4.x).
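For completeness, the texture-cache variant (only an option if the API lets you pick the destination texture) looks roughly like this; `coreVideoTextureCache` and `pixelBuffer[0]` are assumed to exist as in the question's code:

```c
// Create a texture whose backing store is the pixel buffer itself.
CVOpenGLESTextureRef cacheTexture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             coreVideoTextureCache, pixelBuffer[0],
                                             NULL, GL_TEXTURE_2D, GL_RGBA,
                                             (int)FRAME_WIDTH, (int)FRAME_HEIGHT,
                                             GL_BGRA, GL_UNSIGNED_BYTE, 0,
                                             &cacheTexture);

// Hand this ID to the third-party API as its render target; whatever it
// draws then lands directly in pixelBuffer[0], with no copy or readback.
GLuint sharedTextureID = CVOpenGLESTextureGetName(cacheTexture);
```

The appeal of this route is that appending `pixelBuffer[0]` to the AVAssetWriter then requires no pixel transfer at all.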

Other tips

Yeah, it's rather unfortunate that glGetTexImage isn't available on iOS. I struggled with that when I implemented my CCMutableTexture2D class for cocos2d.

Caching the image before pushing to the GPU

If you take a look at the source, you'll notice that in the end I kept the image's pixel buffer cached in my CCMutableTexture2D class, instead of taking the normal route of discarding it after it's pushed to the GPU.

http://www.cocos2d-iphone.org/forum/topic/2449

Using FBO's and glReadPixels

Sadly, I think this approach might not be appropriate for you: since you're creating a video from the texture data, holding onto every pixel buffer we've cached eats up a lot of memory. Another approach could be to create an FBO on the fly and use glReadPixels to populate your pixel buffer. I'm not sure how well that approach will work, but a good example was posted here: Read texture bytes with glReadPixels?
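The linked approach boils down to attaching the existing texture to a throwaway FBO and reading it back. A rough sketch, with error handling omitted and `textureID`, `width`, and `height` as assumed inputs:

```c
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

// Attach the texture we only have the ID of directly as the color attachment.
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, textureID, 0);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE) {
    GLubyte *pixels = (GLubyte *)malloc(width * height * 4);
    // RGBA / UNSIGNED_BYTE readback is always supported in GL ES 2.0.
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    // ... copy pixels into the CVPixelBufferRef, then free(pixels) ...
    free(pixels);
}
glDeleteFramebuffers(1, &fbo);
```

One caveat: attaching someone else's texture only works if its format is color-renderable; otherwise you're back to re-rendering it through a quad as in the accepted answer.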

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow