Question

I am developing an augmented reality app that renders a 3D model. So far so good. I am using Vuforia for AR and libgdx for graphics, everything runs on Android, and it works like a charm...

The problem is that I need to create a "window-like" effect. I literally need to make the model look like a window you can look through and see behind it. That means I have some kind of wall object with a hole in it (a window). Through this hole, you can see another 3D model behind the wall.

(Figures: side view and user view of the scene)

The problem is that I also need to render the video background, and this background is also behind the wall. I can't just turn off blending when rendering the wall, because that would corrupt the video image.

So I need to make the wall and everything directly behind it transparent, but not the video background.

Is such a marvel even possible using only OpenGL?

I have been thinking about some combination of front-to-back and back-to-front rendering: render the background first, then render the wall, but blend it only into the alpha channel (making the video visible only on pixels that are not covered by the wall); then render the actual content, but blend it only into the visible pixels (those not behind the wall); and then "render" the wall once more, this time making everything behind it visible. Would such a thing work?


Solution

I can't just turn off blending when rendering the wall

What makes you think that? OpenGL is not a scene graph. It's a drawing API, and everything happens in the order in which, and exactly as, you call it.

So the order of operations would be:

  1. Draw the video background with blending turned off.

  2. Draw the objects between the video and the wall (turning blending on or off as needed).

  3. Draw the wall with blending or alpha testing enabled, so that you can create the window.
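As a rough sketch in C-style OpenGL calls, those three passes could look like this (the `draw_*` helpers are hypothetical placeholders for your own rendering code, not part of any API):

```c
/* 1. Video background: no blending, no depth writes,
      so the background can never occlude the 3D content. */
glDisable(GL_BLEND);
glDepthMask(GL_FALSE);
draw_video_background();    /* hypothetical helper */
glDepthMask(GL_TRUE);

/* 2. Objects between the video and the wall. */
glEnable(GL_DEPTH_TEST);
draw_scene_objects();       /* hypothetical helper */

/* 3. The wall, with blending so the window area stays see-through. */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
draw_wall();                /* hypothetical helper */
```

Because OpenGL simply draws in call order, each later pass paints over (or tests against) whatever the earlier passes left in the framebuffer.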

Is such a marvel even possible using only OpenGL?

The key to understanding OpenGL is that you don't think of it as a tool for setting up a 3D world scene, but instead use it to draw a 2D picture of a 3D world (because that's what OpenGL actually does). In the end, OpenGL is just a slightly smarter brush for painting onto a flat canvas. Think about how you'd paint a picture on paper and how you'd mask off different parts. Then do that with OpenGL.

Update

Okay, now I see what you want to achieve. The wall is not really visible, but acts as a depth-dependent mask. Easy enough to achieve: use alpha testing instead of blending to punch the window into the depth buffer. Or, instead of alpha testing, you could just draw 4 quads which form a window between them.

The trick is that you draw the wall into the depth buffer only, not into the color buffer:

glDepthMask(GL_TRUE);                                 /* enable depth writes */
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);  /* disable color writes */
draw_wall();
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);      /* restore color writes */

Blending will not work in this case, since even fully transparent fragments still end up in the depth buffer. Hence the alpha test. In fixed-function OpenGL that's glEnable(GL_ALPHA_TEST) and glAlphaFunc(…). On OpenGL ES 2, however, you have to implement it in a shader.
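For the desktop fixed-function path, the alpha test setup might look like this (the 0.5 cutoff is an arbitrary example value, not from the original):

```c
glEnable(GL_ALPHA_TEST);       /* fragments failing the test are discarded   */
glAlphaFunc(GL_GEQUAL, 0.5f);  /* keep only fragments with alpha >= 0.5     */
```

Discarded fragments never reach the depth buffer, which is exactly what punches the window hole into the mask.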

Say you've got a single-channel texture; in the fragment shader do

float opacity = texture2D(sampler, uv).r;
if (opacity < threshold) discard;
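Embedded in a complete OpenGL ES 2 fragment shader, a minimal sketch could look like the following (the uniform and varying names, and the 0.5 cutoff, are assumptions for illustration):

```glsl
precision mediump float;

uniform sampler2D u_mask;     // single-channel window mask (assumed name)
varying vec2 v_uv;            // texture coordinates from the vertex shader

const float threshold = 0.5;  // arbitrary example cutoff

void main() {
    float opacity = texture2D(u_mask, v_uv).r;
    if (opacity < threshold)
        discard;              // punches the window into the depth buffer
    gl_FragColor = vec4(0.0); // irrelevant: color writes are masked off
}
```

Since this pass runs with glColorMask disabled, only the depth buffer is affected; the discarded fragments leave the window region open for the content behind the wall.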
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow