Stupid mistake. I only needed to normalize the vertex color in the vertex shader, as I'm passing in unsigned bytes.
Blending issue porting from OpenGLES 1.0 to 2.0 (iOS)
02-06-2022
Question
I'm porting a very simple piece of code from OpenGLES 1.0 to OpenGLES 2.0.
In the original version, I have blending enabled with
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_ADD);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
I'm using the same code in my ES 2.0 implementation as I need to blend the newly rendered quads with what was in the render buffer (I'm retaining the render buffer, I can't re-render the scene).
I'm using a texture that serves as an alpha mask: it contains only white pixels, with alpha values forming a radial gradient that goes from 1 at the center to 0 at the outside. I give my vertices the same color, say red, with an alpha of 100/255. My background is transparent black. Below that, I have a plain white surface (a UIView). I render 4 quads.
Result with OpenGLES 1.0 (desired result)
My observations tell me that the fragment shader should simply be:
gl_FragColor = DestinationColor * texture2D(Texture, TexCoordOut);
(I got to that conclusion by trying different values for my vertices and the texture. That's also what I've read on some tutorials.)
I'm trying to write OpenGLES 2.0 code (including vertex and fragment shaders) that gives me exactly the same result as in OpenGLES 1.0, nothing more, nothing less. I don't need/want to do any kind of blending in the fragment shader except applying the vertex color to the texture. Using that simple shader, here's the result I got:
I tried pretty much every combination of *, +, and mix I could think of, but I couldn't reproduce the same result. This is the closest I've gotten so far, but it's definitely not the right one (and it doesn't make any sense either):
varying lowp vec4 DestinationColor;
varying lowp vec2 TexCoordOut;
uniform sampler2D Texture;
void main(void) {
lowp vec4 texture0Color = texture2D(Texture, TexCoordOut);
gl_FragColor.rgb = mix(texture0Color.rgb, DestinationColor.rgb, texture0Color.a);
gl_FragColor.a = texture0Color.a * DestinationColor.a;
}
This shader gives me the following:
Solution 3
Other suggestions
By reading this and this, one can construct the blending function. Since you're using glBlendFunc and not glBlendFuncSeparate, all four channels are blended the same way. GL_FUNC_ADD sets the output O to O = s*S + d*D, where s and d are the blending factors, and S and D are the source and destination colors.
The s and d factors are set by glBlendFunc: GL_ONE sets s to (1, 1, 1, 1), and GL_ONE_MINUS_SRC_ALPHA sets d to (1-As, 1-As, 1-As, 1-As), where As is the alpha value of the source. Therefore, your blend is doing this (in vector form):
O = S + (1-As) * D
In GLSL that is literally S + (1.0 - S.a) * D. (Note that this is not the same as mix(D, S, As), which would additionally scale S by As.) With your variable names, where S is the sampled texture color:
gl_FragColor = texture0Color + (1.0 - texture0Color.a) * DestinationColor;
If the result doesn't look similar, then please verify that you don't still have hardware blending enabled (glEnable(GL_BLEND)) or any other OpenGL state that may change the appearance of your final result. If that doesn't help, please post a screenshot of the different outputs.
This can't be done easily with shaders, since blending has to read the current framebuffer state. You can achieve it by rendering to a texture and passing that texture to the shader; if you can get the color that would be in the framebuffer, you are OK.
Your equation is:
gl_FragColor = texture0Color + (1.0 - texture0Color.a) * wouldBeFramebufferColor;
But why do you want to do it in shaders? AFAIK, blending was not removed in OpenGL ES 2.0.