Question

I have a large 32-bit integer texture (R32I), and I need to perform bilinear filtering on it. I naively thought I'd simply have to enable filtering on my texture, but it turns out not to be that simple. The OpenGL ES 3.0 specification has a list of texture formats on pages 129-131, and the R32I format is not marked as texture-filterable. All integer formats appear to be unfilterable, as do floating-point formats larger than 16 bits. I could use either an integer or a floating-point format, but 16 bits simply don't provide enough precision for my data.

Now I can of course perform the filtering manually in a shader, but I'm wondering whether there are GPUs that can actually filter these formats, and if so, how I could detect that the GPU is able to do this?

Are there severe performance drawbacks to performing the bilinear filtering manually in a shader? Or is automatic filtering merely a convenience? If it's the latter, there would be no reason for me to try to make the automatic filtering work for my texture.


The solution

Check whether your GPU supports the OES_texture_float_linear extension: https://www.khronos.org/registry/gles/extensions/OES/OES_texture_float_linear.txt

It should do the trick. I think the Motorola Nexus 6, with its Adreno 420 GPU, should support it.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow