Question

Right now I'm working on an image processing flow backed by GraphicsMagick (an independent fork of ImageMagick), and I want to add an optional blur step to it.

But the people who'll work with it aren't familiar with Gaussian blur or its radius and sigma parameters, so they asked me to accept a blur percentage instead.

I searched for the heuristics behind a blur percentage, but all I found is that it's the "amount of the image taken into consideration for determining each pixel's value".

So, the best solution I found is to calculate sigma based on the following rule:

6\,\sigma = \frac{\text{percentage}}{100\%} \cdot \sqrt{\text{width}^2 + \text{height}^2}
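For concreteness, here's that rule as code (a minimal sketch; the function name is my own):

```python
import math

def sigma_from_percentage(percentage, width, height):
    """Sigma such that 6*sigma spans `percentage` % of the image diagonal."""
    diagonal = math.hypot(width, height)  # sqrt(width**2 + height**2)
    return (percentage / 100.0) * diagonal / 6.0
```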

Does anyone know a better way to determine the blur parameters from a percentage?


Solution

Your solution looks reasonable. To get the sigma recommended by *Magick, which is 0.5 to 0.7, users would have to enter percentages around 0.1 when working with images that measure 3000 or 4000 pixels diagonally. The radius is supposed to be an integer, 2 or 3 times the sigma, and sigma defaults to 1 in either *Magick if not given.
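Here is a minimal sketch of how that advice could be wired to the GraphicsMagick CLI, assuming the asker's formula; the helper name and file names are placeholders, and the radius is rounded from 3x sigma per the recommendation above:

```python
import math
import subprocess

def blur_arg(percentage, width, height):
    """Build a *Magick -blur argument ("{radius}x{sigma}") from a percentage.

    Sigma follows the asker's rule (6*sigma = percentage% of the diagonal);
    radius is an integer roughly 3x sigma, as recommended above.
    """
    sigma = (percentage / 100.0) * math.hypot(width, height) / 6.0
    radius = max(1, round(3 * sigma))
    return f"{radius}x{sigma:.4f}"

# Example: blur a 3000x2000 image by 0.1% (in.png / out.png are placeholders).
subprocess.run(
    ["gm", "convert", "in.png", "-blur", blur_arg(0.1, 3000, 2000), "out.png"],
    check=True,
)
```

Clamping the radius to at least 1 keeps the argument valid even for very small sigmas.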

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow