Question

I'm currently using OpenGL to display my QTMovie's frames in an NSOpenGLView. To do that, I'm creating a texture context with the following code:

// Called from a subclass of NSOpenGLView
QTOpenGLTextureContextCreate(kCFAllocatorDefault,
                             (CGLContextObj)[[self openGLContext] CGLContextObj],
                             (CGLPixelFormatObj)[[self pixelFormat] CGLPixelFormatObj],
                             NULL, &_textureContext);

I then assign the visual context to the movie so it draws the frames into that context. I'm getting the actual frame with this bit:

// Passing NULL for the timestamp requests the frame for the current time
OSStatus status = QTVisualContextCopyImageForTime(_textureContext, kCFAllocatorDefault,
                                                  NULL, &_currentFrameTex);

where _currentFrameTex is a CVOpenGLTextureRef.
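
For reference, the "assign the visual context to the movie" step mentioned above goes through the QuickTime C API; a sketch, assuming _movie is the QTMovie in question:

// Attach the texture context to the movie so frames are rendered into it
OSStatus err = SetMovieVisualContext([_movie quickTimeMovie], _textureContext);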

All of that works fine for one view, but I'd like to draw the same frame in a second view, and I can't figure out how to do that.

What would be the easiest way to do that? It also needs to be performant, of course, since it is called at least 60 times per second.


Solution

I would suggest that you use context sharing when you create your NSOpenGLViews.

Unfortunately, I do not have much experience with the Cocoa (NeXTSTEP) OpenGL window-system interface (I use CGL/AGL), so I cannot tell you exactly how to do this. However, the API reference for NSOpenGLContext's initWithFormat:shareContext: should point you in the right direction.
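
For example, a minimal sketch of setting up the second view so its context shares resources with the first (firstView and secondView are placeholder names):

// Create a context that shares resources (including textures) with
// the first view's context, and hand it to the second view.
NSOpenGLContext *sharedContext =
    [[NSOpenGLContext alloc] initWithFormat:[firstView pixelFormat]
                               shareContext:[firstView openGLContext]];
[secondView setOpenGLContext:sharedContext];
[sharedContext setView:secondView];

Note that sharing only succeeds if the two contexts use compatible pixel formats, which is why this sketch reuses the first view's pixel format.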

The reason I suggest context sharing is that you can reuse the texture handle the CVOpenGLTextureRef gives you in both NSOpenGLView instances. Otherwise, you will probably have to get a CVOpenGLTextureRef per context (view).

The overhead of doing this per context may not be as high as you think: Core Video does not have to copy the video's contents through system memory to give you an OpenGL texture; it already has a surface on the GPU that it can copy or reference.
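
Either way, once a context can see the texture, drawing the frame is just a matter of binding the handle Core Video gives you. A rough sketch of the per-view draw (Core Video textures are typically rectangle textures with non-normalized coordinates):

GLenum target = CVOpenGLTextureGetTarget(_currentFrameTex);
GLuint name   = CVOpenGLTextureGetName(_currentFrameTex);
// The "clean" coordinates exclude any padding around the video frame
GLfloat ll[2], lr[2], ur[2], ul[2];
CVOpenGLTextureGetCleanTexCoords(_currentFrameTex, ll, lr, ur, ul);

glEnable(target);
glBindTexture(target, name);
glBegin(GL_QUADS);
glTexCoord2fv(ll); glVertex2f(-1.0f, -1.0f);
glTexCoord2fv(lr); glVertex2f( 1.0f, -1.0f);
glTexCoord2fv(ur); glVertex2f( 1.0f,  1.0f);
glTexCoord2fv(ul); glVertex2f(-1.0f,  1.0f);
glEnd();
glBindTexture(target, 0);
glDisable(target);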

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow