Question

I'm developing an OpenGL application for Windows XP. The target machine has two NVIDIA GeForce 9800GT video cards, which are needed because the application must output two streams of analog video.

The application itself has two OpenGL windows, one for each video card. Each video card is connected to one monitor. As for the code, it's based on a minimal OpenGL example.

How can I know if the application is utilizing both video cards for rendering?

At the moment, I don't care if the application only runs on Windows XP or only with NVIDIA video cards; I just need to know whether the rendering work is actually split across the two cards.


Solution

I think you need to read up on the WGL_NV_gpu_affinity extension. You enumerate the GPUs, create an affinity device context restricted to a specific GPU, and use it with wglMakeCurrent() so that rendering is directed to that GPU. Here are some pointers:

http://developer.download.nvidia.com/opengl/specs/WGL_nv_gpu_affinity.txt

PDF from NVIDIA.com
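As a rough sketch of what the extension's usage looks like (the `HGPUNV` handle type and the `wglEnumGpusNV` / `wglCreateAffinityDCNV` / `wglDeleteDCNV` entry points come from the spec linked above and must be loaded with `wglGetProcAddress`; the `createContextOnGpu` helper name is mine). Two caveats worth checking against the spec: NVIDIA documents this extension for Quadro cards, so verify it is actually exposed on GeForce hardware, and affinity contexts are restricted to off-screen rendering, so you would typically render to a pbuffer/FBO and copy the result to the window:

```cpp
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"   // defines HGPUNV and the PFNWGL...NVPROC typedefs

PFNWGLENUMGPUSNVPROC         wglEnumGpusNV;
PFNWGLCREATEAFFINITYDCNVPROC wglCreateAffinityDCNV;
PFNWGLDELETEDCNVPROC         wglDeleteDCNV;

void loadAffinityEntryPoints()  // call with any GL context current
{
    wglEnumGpusNV = (PFNWGLENUMGPUSNVPROC)
        wglGetProcAddress("wglEnumGpusNV");
    wglCreateAffinityDCNV = (PFNWGLCREATEAFFINITYDCNVPROC)
        wglGetProcAddress("wglCreateAffinityDCNV");
    wglDeleteDCNV = (PFNWGLDELETEDCNVPROC)
        wglGetProcAddress("wglDeleteDCNV");
}

// Create a GL context whose rendering is pinned to GPU number gpuIndex.
HGLRC createContextOnGpu(UINT gpuIndex)
{
    HGPUNV gpu = 0;
    if (!wglEnumGpusNV || !wglEnumGpusNV(gpuIndex, &gpu))
        return 0;                        // extension missing or no such GPU

    // An affinity DC restricts all rendering through it to the GPUs
    // in this null-terminated list.
    HGPUNV gpuList[2] = { gpu, 0 };
    HDC affinityDC = wglCreateAffinityDCNV(gpuList);
    if (!affinityDC)
        return 0;

    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1 };
    pfd.dwFlags    = PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    int format = ChoosePixelFormat(affinityDC, &pfd);
    SetPixelFormat(affinityDC, format, &pfd);

    HGLRC ctx = wglCreateContext(affinityDC);
    // The caller should release the DC with wglDeleteDCNV(affinityDC)
    // once the context is no longer needed.
    return ctx;
}
```

Calling `createContextOnGpu(0)` and `createContextOnGpu(1)` would then give you one context per card, each made current in its own window's rendering thread.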

Cheers!

OTHER TIPS

I believe you can get that information from gDEBugger for OpenGL-based applications.

If it turns out you're not using both cards, you can check out Equalizer for parallel rendering; it's a great project.
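A quick sanity check you can do without external tools is to query the renderer strings from each window's context while it is current (the `reportRenderer` helper name is mine). With two identical 9800GTs the strings won't distinguish the physical cards, but it will at least catch the common failure mode where one window silently falls back to Microsoft's "GDI Generic" software renderer instead of the NVIDIA driver:

```cpp
#include <cstdio>
#include <windows.h>
#include <GL/gl.h>

// Print which OpenGL implementation a context is using.
// Must be called while that window's context is current.
void reportRenderer(const char *windowName)
{
    printf("%s: vendor=%s renderer=%s\n",
           windowName,
           (const char *)glGetString(GL_VENDOR),
           (const char *)glGetString(GL_RENDERER));
}
```

If either window reports "Microsoft Corporation" / "GDI Generic" rather than an NVIDIA string, that card's driver isn't accelerating it at all.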

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow