Question

I'm trying to perform a rather specific blending operation, but one that seems like it ought to be relatively simple. I'm displaying a luminance texture on top of a background with an alpha value of one.

For example, this could be a simple checkerboard on top of another image:

[image: a checkerboard texture overlaid on a background image]

I want the texture to linearly increment/decrement the luminance of the background according to:

(R_b + R_t * w, G_b + G_t * w, B_b + B_t * w, 1.0)

where *_b are the background pixel values (between 0 and 1), *_t are signed pixel values for the overlaid texture (between -1 and 1), and w is a weighting parameter that can also vary between -1 and 1. The goal is that by varying the sign and magnitude of w I can smoothly vary the magnitude and polarity of the modulation of the background by the checkerboard texture.
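In NumPy terms, the desired result is roughly the following (a toy sketch just to pin down the math, not how I intend to implement it; the function name is mine):

```python
import numpy as np

def desired_blend(background, texture, w):
    """Per-pixel result I'm after: background + texture * w.

    background: floats in [0, 1], shape (H, W, 3)
    texture:    signed floats in [-1, 1], shape (H, W, 3)
    w:          scalar weight in [-1, 1]
    """
    # Clip only because the display can't show values outside [0, 1].
    return np.clip(background + texture * w, 0.0, 1.0)
```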

To give you a graphical example:

[image: the overlay's effect on the background for different signs and magnitudes of w]

Using glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA) and just varying the alpha value of my overlay won't quite do what I want, because I can't use negative alpha values to invert the polarity of my overlay.

Is there a way to do this type of linear additive/subtractive blending using the fixed pipeline, or failing that, would it be possible to implement this using a shader?

Update

For further clarification I've added a crude PyOpenGL example of what I'm trying to do here.

Update 2

To be totally clear: if w > 0, I want to increase the luminance of every pixel within a white square on my checkerboard, and decrease the luminance of every pixel within a black square. If w < 0, I want to do the opposite.

derhass's answer comes very close to what I'm looking for, but only affects the pixels in the white squares and not the black squares:

[image: result of derhass's answer; only the white squares are modulated]

Humans are terrible at judging absolute luminance, but the difference is clearer if I set the color of the overlay to red (glColor4f(1., 0., 0., 0.25)):

[image: with a red overlay, red is added/subtracted in the white squares only]

As you can see, this results in adding/subtracting red from the white squares, but does not change the black squares.


Solution

I think you can use GL's blending as long as your w parameter does not vary per pixel but is global to the whole draw call of the overlay texture (as your images also suggest). You just need to set up the blend equation depending on the sign of w:

- Set the blend factors with glBlendFunc(GL_SRC_ALPHA, GL_ONE), and
- set the blend equation with glBlendEquation(GL_FUNC_ADD) when w >= 0, or glBlendEquation(GL_FUNC_REVERSE_SUBTRACT) when w < 0, drawing the overlay with an alpha value of |w|.

Per channel this computes R_b + R_t * |w| or R_b - R_t * |w|, which together cover R_b + R_t * w for either sign of w.
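A minimal PyOpenGL sketch of that setup (the function name and the use of glColor4f to supply |w| are my choices, not something GL mandates):

```python
from OpenGL.GL import (
    glEnable, glBlendEquation, glBlendFunc, glColor4f,
    GL_BLEND, GL_FUNC_ADD, GL_FUNC_REVERSE_SUBTRACT,
    GL_SRC_ALPHA, GL_ONE,
)

def set_overlay_blend(w):
    """Blend the overlay as dst + src * |w| for w >= 0,
    and as dst - src * |w| for w < 0."""
    glEnable(GL_BLEND)
    glBlendEquation(GL_FUNC_ADD if w >= 0 else GL_FUNC_REVERSE_SUBTRACT)
    glBlendFunc(GL_SRC_ALPHA, GL_ONE)
    # With the default GL_MODULATE texture environment, this puts |w|
    # into the source alpha used by the blend factors above.
    glColor4f(1.0, 1.0, 1.0, abs(w))
```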

As described so far, this will also modify the alpha component of the output, but you want that to stay 1.0. As you said, your framebuffer already has alpha = 1 after rendering the background. So you can just use glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE, GL_ZERO, GL_ONE), which in conjunction with the above equations will set the resulting alpha to that of the destination, A_b in your notation.
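In the sketch above that amounts to swapping out the glBlendFunc line, roughly:

```python
from OpenGL.GL import glBlendFuncSeparate, GL_SRC_ALPHA, GL_ONE, GL_ZERO

# RGB:   dst * 1  (+/-)  src * alpha   -- as before
# Alpha: dst * 1  (+/-)  src * 0       -- leaves destination alpha (A_b) intact
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE, GL_ZERO, GL_ONE)
```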

UPDATE

It turned out that the formula in the question does not describe the desired behavior, and the assumption that a single blend equation (GL_FUNC_ADD or GL_FUNC_REVERSE_SUBTRACT) per draw call suffices does not hold.

However, it is possible to use negative alpha values if a floating-point color buffer is used (ARB_color_buffer_float). The GL specification describes the clamping behavior:

If the color buffer is fixed-point, the components of the source and destination values and blend factors are each clamped to [0; 1] or [-1; 1] respectively for an unsigned normalized or signed normalized color buffer prior to evaluating the blend equation. If the color buffer is floating-point, no clamping occurs. The resulting four values are sent to the next operation.

Note that this extension introduces the glClampColor() function and defines the targets GL_CLAMP_READ_COLOR, GL_CLAMP_VERTEX_COLOR and GL_CLAMP_FRAGMENT_COLOR. The latter two are deprecated in modern GL and relevant only for the deprecated fixed function pipeline.
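For completeness, a sketch of one way to obtain an unclamped floating-point color buffer via a framebuffer object (the size and the GL_RGBA32F format are placeholder choices, GL 3.0 or equivalent extension support is assumed, and error checking is omitted):

```python
from OpenGL.GL import (
    glGenTextures, glBindTexture, glTexImage2D, glTexParameteri,
    glGenFramebuffers, glBindFramebuffer, glFramebufferTexture2D,
    glClampColor,
    GL_TEXTURE_2D, GL_RGBA32F, GL_RGBA, GL_FLOAT,
    GL_TEXTURE_MIN_FILTER, GL_TEXTURE_MAG_FILTER, GL_NEAREST,
    GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
    GL_CLAMP_READ_COLOR, GL_FALSE,
)

WIDTH, HEIGHT = 512, 512  # placeholder size

# Float color attachment: blending into it is not clamped to [0, 1].
tex = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, tex)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, WIDTH, HEIGHT, 0,
             GL_RGBA, GL_FLOAT, None)

fbo = glGenFramebuffers(1)
glBindFramebuffer(GL_FRAMEBUFFER, fbo)
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0)

# Keep glReadPixels from clamping when reading the result back.
glClampColor(GL_CLAMP_READ_COLOR, GL_FALSE)
```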

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow