Question

Conceptual question, just out of curiosity:

Which is less taxing on the graphics processor: anti-aliasing (2x? 4x? Higher?) on a typical desktop display (around 120-150 dpi), or driving a high-density (>300 dpi) screen without anti-aliasing? This question could apply to both desktop systems and embedded devices (smartphones). I'm interested to see the responses!


Solution

Usually neither, since font rendering and its anti-aliasing are done by the CPU (though GPU features can be used for blurring). Beyond that, it depends on the font rasterizer and how well it is implemented, and on how the anti-aliasing itself is done: whether a convolution (matrix) blur is applied, an FFT-based filter is used, or the glyph is simply rendered larger and downsampled with a bicubic filter. Only runtime tests can show which is cheaper.
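
To make the last of those options concrete, here is a minimal sketch of the render-bigger-and-downsample approach. It is not any library's actual rasterizer: `render_circle` is a hypothetical stand-in for a renderer that produces hard-edged coverage values, and a plain box filter is used for the downsample where a real implementation might use a bicubic kernel.

```python
import numpy as np

def supersample_aa(render_fn, width, height, factor=4):
    """Toy render-bigger-and-downsample anti-aliasing.

    render_fn(w, h) is assumed to return an (h, w) array of grayscale
    coverage values in [0, 1]; it stands in for a real rasterizer.
    """
    big = render_fn(width * factor, height * factor)
    # Average each factor x factor block (a box filter; a production
    # rasterizer might use a bicubic or other kernel instead).
    return big.reshape(height, factor, width, factor).mean(axis=(1, 3))

def render_circle(w, h):
    """Hypothetical renderer: a hard-edged circle, no anti-aliasing."""
    y, x = np.mgrid[0:h, 0:w]
    cx, cy, r = w / 2, h / 2, min(w, h) / 3
    return ((x - cx) ** 2 + (y - cy) ** 2 <= r ** 2).astype(float)

if __name__ == "__main__":
    aliased = render_circle(64, 64)                          # jagged edges
    smoothed = supersample_aa(render_circle, 64, 64, factor=4)
    print(aliased.max(), smoothed.max())                     # both in [0, 1]
```

The cost trade-off the question asks about shows up directly here: the supersampled path rasterizes `factor**2` times as many pixels and then filters them, whereas a native high-DPI render draws more pixels once but skips the filtering pass. Which one wins depends on the renderer and the hardware, which is why only measurement settles it.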
