Question

I understand the gamma topic, but maybe not 100%, and I want to ask somebody to answer and clarify my doubts. As I understand it:

  1. there is a natural linear color space in which a color with value 100 is exactly four times brighter than a color with value 25, and so on (it would be good to hold and process color images in that format/space)
  2. when you copy such a linear-light image onto a device you get wrong colors, because the middle values generally appear too dark, so you need to raise those middle values with something like x <- power(x, 1/2.2) (or thereabouts); a small sketch of this encode/decode pair follows this list
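
For concreteness, this is how I picture the encode/decode pair, as a rough Python sketch (the gamma value 2.2 is just the usual guess, and the function names are mine):

    def gamma_encode(linear, gamma=2.2):
        # linear light -> gamma-encoded value (raises the middle values)
        return linear ** (1.0 / gamma)

    def gamma_decode(encoded, gamma=2.2):
        # gamma-encoded value -> linear light (the inverse operation)
        return encoded ** gamma

    # Values are normalized to [0, 1]; the round trip is the identity:
    x = 0.25
    assert abs(gamma_decode(gamma_encode(x)) - x) < 1e-9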

This much is reasonably clear, but now:

  1. Are normal everyday bitmap or JPEG images gamma-precomputed? If they are, when I do things like blending, should I apply an inverse gamma to both of them to convert them to linear, then add them, and then gamma-correct the result? (See the sketch after this list.)

  2. When out-of-range values appear, is it better to clip the colors, rescale them linearly into the valid range, or do something else?
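
For question 1, here is the workflow I have in mind as a rough Python sketch (the gamma value, function name, and blend weight are all just my assumptions):

    import numpy as np

    GAMMA = 2.2  # assumed encoding gamma of both input images

    def blend_linear(a_encoded, b_encoded, weight=0.5):
        # a_encoded, b_encoded: float RGB arrays in [0, 1], gamma-encoded
        a_lin = a_encoded ** GAMMA                       # inverse gamma -> linear light
        b_lin = b_encoded ** GAMMA
        mixed = weight * a_lin + (1.0 - weight) * b_lin  # blend in linear space
        mixed = np.clip(mixed, 0.0, 1.0)                 # clip out-of-range values (question 2)
        return mixed ** (1.0 / GAMMA)                    # re-encode for display/storage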

Thanks for any answers.


Solution

There's nothing "natural" about linear color values. Your own eyes respond to brightness roughly logarithmically, and the phosphors in televisions and monitors respond to voltage according to a power law (output ≈ input^2.2).

GIF and JPEG files do not specify a gamma value (JPEG can, via extra EXIF data), so it's anybody's guess what their color values really mean. PNG does, but most people ignore it or get it wrong, so it can't be trusted either. If you're displaying images from a file, you'll just have to guess or experiment. A reasonable guess is to use 2.2, unless you know the file came from an older Apple device, in which case use 1.8 (classic Mac OS encoded with a gamma of 1.8; modern systems use 2.2).
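
As an illustration of that guessing strategy, here is a minimal sketch (assuming Pillow and NumPy are available; Pillow exposes a PNG gAMA chunk, when present, as info["gamma"], and for GIF/JPEG there is usually nothing to read, so we fall back to the 2.2 guess):

    import numpy as np
    from PIL import Image  # Pillow

    def load_as_linear(path, fallback_decode_gamma=2.2):
        im = Image.open(path).convert("RGB")
        encoded = np.asarray(im, dtype=np.float64) / 255.0
        # A PNG gAMA chunk stores the *encoding* gamma (~0.45455 for a
        # standard 2.2-encoded file); invert it to get the decoding exponent.
        file_gamma = im.info.get("gamma")
        decode_gamma = (1.0 / file_gamma) if file_gamma else fallback_decode_gamma
        return encoded ** decode_gamma  # linear-light floats in [0, 1]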

If you need to store images accurately, use JPEG with good EXIF data embedded (as a camera produces) for photos, and perhaps something like TIFF for uncompressed images. And use high-quality software that understands these things. Unfortunately, such software is pretty rare and/or expensive.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow