Question

So, we've got a little graphical doohickey that needs to run in a server environment without a real video card. All it really needs is framebuffer objects and maybe some vector/font anti-aliasing. It will be slow, I know. It just needs to output single frames.

I see this post about how to force software rendering mode, but it seems to apply to machines that already have an OpenGL-enabled card (like NVIDIA).

So, since I'm wary of trying to install OpenGL on a machine three time zones away that hosts a bunch of live production sites: has anybody tried this, or does anybody know how to "emulate" an OpenGL environment? Unfortunately, our dev server HAS a video card, so I can't really show "what I've tried".

The relevant code is all in Cinder, but I think our actual OpenGL utilization is lightweight for this purpose.
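For scale, the core of what we need is roughly the following (a raw-GL sketch rather than our actual Cinder code, so the names are illustrative; it assumes a current OpenGL context and an extension loader like GLEW for the FBO entry points):

    // Render a single frame offscreen into an FBO and read it back;
    // no window or onscreen framebuffer is needed.
    // Assumes a current GL context with framebuffer_object support
    // and that glewInit() has already been called.
    #include <GL/glew.h>
    #include <cstdio>
    #include <vector>

    std::vector<unsigned char> renderSingleFrame( int width, int height )
    {
        GLuint fbo = 0, color = 0;

        // Color attachment: a plain RGBA8 renderbuffer is enough for readback.
        glGenRenderbuffers( 1, &color );
        glBindRenderbuffer( GL_RENDERBUFFER, color );
        glRenderbufferStorage( GL_RENDERBUFFER, GL_RGBA8, width, height );

        glGenFramebuffers( 1, &fbo );
        glBindFramebuffer( GL_FRAMEBUFFER, fbo );
        glFramebufferRenderbuffer( GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_RENDERBUFFER, color );

        if( glCheckFramebufferStatus( GL_FRAMEBUFFER ) != GL_FRAMEBUFFER_COMPLETE )
            std::fprintf( stderr, "FBO incomplete\n" );

        glViewport( 0, 0, width, height );
        glClearColor( 0.0f, 0.0f, 0.0f, 1.0f );
        glClear( GL_COLOR_BUFFER_BIT );
        // ... draw the scene here ...

        // Pull the finished frame back to system memory for encoding to disk.
        std::vector<unsigned char> pixels( width * height * 4 );
        glReadPixels( 0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data() );

        glBindFramebuffer( GL_FRAMEBUFFER, 0 );
        glDeleteFramebuffers( 1, &fbo );
        glDeleteRenderbuffers( 1, &color );
        return pixels;
    }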

This would run on Windows Server 2008 Standard.

I see MS has a software implementation of OpenGL 1.1, but I can't seem to find one for 2.0.


Solution

Build (or find prebuilt) Mesa DLLs. Mesa can be compiled for Windows as a drop-in opengl32.dll whose rasterizers (llvmpipe/softpipe) run entirely in software and expose OpenGL well beyond 2.0, framebuffer objects included. Drop the DLL into the same directory as your executable and the loader should pick it up ahead of the system opengl32.dll, so nothing has to be installed machine-wide on the production box.

It will be slow.
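A quick way to verify which implementation you actually ended up with is to log the GL strings right after context creation. A minimal sketch; the renderer strings mentioned in the comments are typical values, not guaranteed:

    // Log which OpenGL implementation the process actually loaded.
    // With a Mesa opengl32.dll in place, GL_RENDERER usually reports
    // "llvmpipe" or "softpipe"; "GDI Generic" means you fell through to
    // Microsoft's built-in OpenGL 1.1 software path.
    #include <GL/glew.h>   // any loader that pulls in the GL headers works
    #include <cstdio>

    void logGlImplementation()  // call with a current GL context bound
    {
        const char* vendor   = reinterpret_cast<const char*>( glGetString( GL_VENDOR ) );
        const char* renderer = reinterpret_cast<const char*>( glGetString( GL_RENDERER ) );
        const char* version  = reinterpret_cast<const char*>( glGetString( GL_VERSION ) );
        std::printf( "GL_VENDOR:   %s\n", vendor   ? vendor   : "(none)" );
        std::printf( "GL_RENDERER: %s\n", renderer ? renderer : "(none)" );
        std::printf( "GL_VERSION:  %s\n", version  ? version  : "(none)" );
    }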
