I am trying to reproduce the iOS 7 style glass effect in my app by applying image effects to a screenshot of an MKMapView. This UIImage category, provided by Apple, is what I am using as a baseline. The method desaturates the source image, applies a tint color, and blurs heavily using these input values:

[image applyBlurWithRadius:10.0
                 tintColor:[UIColor colorWithRed:229/255.0f green:246/255.0f blue:255/255.0f alpha:0.33] 
     saturationDeltaFactor:0.66
                 maskImage:nil];

This produces the effect I am looking for, but takes way too long: between 0.3 and 0.5 seconds to render on an iPhone 4.

[Screenshot: the blurred map view produced by Apple's category]

I would like to use the excellent GPUImage framework instead, as my preliminary attempts have been about 5-10 times faster, but I just can't seem to get the output right.

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];

GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
saturationFilter.saturation = 0.33; // 1.0 - 0.66;
[stillImageSource addTarget:saturationFilter];

GPUImageMonochromeFilter *monochromeFilter = [[GPUImageMonochromeFilter alloc] init];
[monochromeFilter setColor:(GPUVector4){229/255.0f, 246/255.0f, 1.0f, 0.33f}];
[monochromeFilter setIntensity:0.2];
[saturationFilter addTarget:monochromeFilter];

GPUImageFastBlurFilter *blurFilter = [[GPUImageFastBlurFilter alloc] init];
blurFilter.blurSize = 2;
blurFilter.blurPasses = 3;
[monochromeFilter addTarget:blurFilter];

[blurFilter prepareForImageCapture];

[stillImageSource processImage];
image = [blurFilter imageFromCurrentlyProcessedOutput];

This produces an image which is close, but not quite there:

[Screenshot: the GPUImage result, close but with visible grid artifacts]

The blur doesn't seem to be deep enough, but when I try to increase the blurSize above, it becomes grid-like, almost like a kaleidoscope; you can actually see the grid by zooming in on the second image. The tint color I am trying to mimic seems to just wash out the image instead of overlaying and blending, which is what I think the Apple sample is doing.

I have tried to set up the filters according to comments made by @BradLarson in another SO question. Am I using the wrong GPUImage filters to reproduce this effect, or am I just setting them up wrong?


Solution

OK, I've been working on something here for a little while, and I finally have it functional. I just rolled a number of changes to GPUImage's blur filters into the framework, and as a result I believe I have a reasonable replica of the blur effect Apple uses for things like the control center view.

Previously, the blurs that I had in the framework used a single precalculated radius, and the only way to affect their intensity was to tweak the spacing at which they sampled pixels from the input image. With a limited number of samples per pixel, changing the multiple for the spacing between sampled pixels much above 1.5 started introducing serious blocking artifacts as pixels were skipped.

The new Gaussian blur implementation that I've built combines the performance benefits of precalculated Gaussian weights with the ability to use an arbitrary radius (sigma) for the Gaussian blur. It does this by generating shaders on the fly as they are needed for various radii. It also reduces the number of texture samples required for a given blur radius by using hardware interpolation to read two texels at a time for each sample point.
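
The two-texels-per-sample trick boils down to a small weight calculation. Here is a minimal sketch in C (`merge_taps` is an illustrative name, not part of GPUImage's API): because hardware linear filtering returns a mix of two adjacent texels when sampled at a fractional offset between them, two adjacent Gaussian taps can be collapsed into one texture read.

```c
/* Collapse two adjacent Gaussian taps (weight w1 at texel offset o,
 * weight w2 at offset o + 1) into a single bilinear sample.
 * Sampling at the fractional offset o + w2 / (w1 + w2) with total
 * weight w1 + w2 makes the hardware's linear interpolation return
 * exactly w1*texel[o] + w2*texel[o+1], halving the texture reads. */
void merge_taps(float w1, float w2, float o,
                float *combined_weight, float *combined_offset)
{
    *combined_weight = w1 + w2;
    *combined_offset = o + w2 / (w1 + w2);
}
```

With w1 = w2 = 1, for example, this yields a single sample of weight 2 positioned exactly halfway between the two texels.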

The new GPUImageiOSBlurFilter combines this tuned arbitrary-radius Gaussian blur filter with a color-correction filter that appears to replicate the adjustment Apple performs to the colors after they've been blurred. The comparison below shows Apple's built-in blurring from the control center view on the left, and my new GPUImage blur filter on the right:

[Comparison screenshots: Apple's blur (left) vs. GPUImage's blur (right)]

As a way of improving performance (Apple's blur appears to occur with a sigma of 48, which requires quite a large area to be sampled for each pixel), I use a 4X downsampling before the Gaussian blur, then a 4X upsampling afterward. This reduces the number of pixels that need to be blurred by 16X, and also reduces the blur sigma from 48 to 12. An iPhone 4S can blur the entire screen in roughly 30 ms using this filter.
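
Driving the new filter from code looks roughly like the following sketch. It reuses the still-image capture pattern from the question; `blurRadiusInPixels` is the filter's radius property, and the value of 12.0 simply mirrors the post-downsampling sigma described above, so treat the exact numbers as illustrative.

```objectivec
// Sketch: blur a UIImage snapshot with the new GPUImageiOSBlurFilter.
// The radius is illustrative; it mirrors the post-downsampling sigma of 12.
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];

GPUImageiOSBlurFilter *blurFilter = [[GPUImageiOSBlurFilter alloc] init];
blurFilter.blurRadiusInPixels = 12.0;
[stillImageSource addTarget:blurFilter];

[blurFilter prepareForImageCapture];
[stillImageSource processImage];
UIImage *blurredImage = [blurFilter imageFromCurrentlyProcessedOutput];
```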

Getting the blur right is one thing. Apple still does not provide a fast way of getting the image content behind your views, so that will most likely be your bottleneck for rapidly changing content.

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow