Question

Ok so I am trying to use WebGL's readPixels method to get the data rendered to a framebuffer from my shaders. The technique is simple enough:

// Render the RTT scene into the render target texture
renderer.render(sceneRTT, cameraRTT, rtTexture, true);

// Allocate a buffer for the RGBA bytes (4 per pixel)
var pixels = new Uint8Array(width * height * 4);

// Bind the render target's framebuffer and read the pixels back
var gl = renderer.context;
gl.bindFramebuffer(gl.FRAMEBUFFER, rtTexture.__webglFramebuffer);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

I am testing this by setting my shader's gl_FragColor to a constant, e.g. gl_FragColor = vec4(255, 120, 35, 100), though any values are valid of course.

Here's the problem: pixels always comes back with incorrect values, even when I output a constant value like the one above! What could be causing this?

Note: this problem shows up with every shader I tested, including other people's working fiddles.


Solution

Ok so after playing around with some test values I figured out what the problem was.

The values returned in the RGB channels are premultiplied by alpha.

This is because the premultipliedAlpha flag of the WebGLRenderer is set to true by default.
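To illustrate what that means for the bytes you read back, here is a minimal sketch; the helper function is hypothetical and only models the "RGB scaled by alpha" behaviour described above, with exact values possibly off by one due to rounding:

// Hypothetical helper: what readPixels would return for one pixel
// when the drawing buffer stores premultiplied alpha.
// r, g, b, a are the shader's output in the 0..1 range.
function expectedPremultiplied(r, g, b, a) {
    return [
        Math.round(r * a * 255), // R is scaled by alpha
        Math.round(g * a * 255), // G is scaled by alpha
        Math.round(b * a * 255), // B is scaled by alpha
        Math.round(a * 255)      // alpha itself is stored unscaled
    ];
}

// e.g. gl_FragColor = vec4(1.0, 0.5, 0.25, 0.5) reads back as roughly [128, 64, 32, 128]
console.log(expectedPremultiplied(1.0, 0.5, 0.25, 0.5));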

To fix this behavior, set:

var renderer = new THREE.WebGLRenderer( { premultipliedAlpha : false } ); 

during construction of the renderer being used.

Also, make sure that the material you render into the framebuffer has its blending set to none (THREE.NoBlending):

material.blending = 0;
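Putting the two fixes together, a minimal sketch of the setup, using the same old-style three.js API as the question (THREE.NoBlending is the named constant behind the value 0):

// Create the renderer without premultiplied alpha
var renderer = new THREE.WebGLRenderer( { premultipliedAlpha: false } );

// The material rendered into the render target should not blend at all
material.blending = THREE.NoBlending; // same as material.blending = 0

// Render to the target and read the raw bytes back, as in the question
renderer.render(sceneRTT, cameraRTT, rtTexture, true);

var pixels = new Uint8Array(width * height * 4);
var gl = renderer.context;
gl.bindFramebuffer(gl.FRAMEBUFFER, rtTexture.__webglFramebuffer);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);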

OTHER TIPS

Maybe this helps too: gl = canvas.getContext( "webgl", { alpha: false, antialias: false } )
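If you are creating the WebGL context yourself rather than letting three.js do it, a minimal sketch of requesting those attributes (these are the standard WebGL context-creation attributes; premultipliedAlpha can be disabled here as well):

var canvas = document.createElement('canvas');

// Request a drawing buffer with no alpha channel and no antialiasing,
// and without premultiplied alpha.
var gl = canvas.getContext('webgl', {
    alpha: false,
    antialias: false,
    premultipliedAlpha: false
});

// Check which attributes the browser actually granted
console.log(gl.getContextAttributes());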
