I presume you're talking about pixel processing with fragment shaders?
With the OpenGL ES 2.0 core API, you can't get pixels from the destination framebuffer into the fragment shader without reading them back from the GPU.
But if you're on a device/platform that supports a shader framebuffer fetch extension (`EXT_shader_framebuffer_fetch` on at least iOS, `NV_shader_framebuffer_fetch` in some other places), you're in luck. With that extension, a fragment shader can read the fragment data from the destination framebuffer for the fragment it's rendering to (and only that fragment). This is great for programmable blending or pixel post-processing effects because you don't have to incur the performance penalty of a `glReadPixels` operation.
Declare that you're using the extension with `#extension GL_EXT_shader_framebuffer_fetch : require`, then read fragment data from the `gl_LastFragData[0]` builtin. (The subscript is the render target index, but you don't have multiple render targets unless you're using OpenGL ES 3.0, so it's always zero.) Process it however you like and write to `gl_FragColor` or `gl_FragData` as usual.
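
For example, here's a minimal sketch of a fragment shader that does a programmable multiply blend against whatever is already in the framebuffer (the `blendColor` uniform name is just for illustration; your app would supply it):

```glsl
#extension GL_EXT_shader_framebuffer_fetch : require

precision mediump float;

// Hypothetical uniform set by the application.
uniform lowp vec4 blendColor;

void main() {
    // Read the color already in the framebuffer for this fragment.
    lowp vec4 dst = gl_LastFragData[0];
    // Combine it with our own color -- a simple multiply blend here,
    // but you can use any arithmetic the fixed-function blender can't do.
    gl_FragColor = dst * blendColor;
}
```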