I'm experimenting with Core Image (on OS X 10.7.3) for the first time and am running into a brick wall. I'm certain it's something silly I'm doing, and I just need someone more familiar with the framework to point it out.

Consider the following code (let's stipulate that imageURL is a valid file URL pointing to a JPG on disk):

CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL];
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage" keysAndValues:
                                                  kCIInputImageKey, inputImage,
                                                  kCIInputExtentKey, [inputImage valueForKey:@"extent"],
                                                  nil];
CIImage *outputImage = (CIImage *)[filter valueForKey:@"outputImage"];

When this code runs, the last line throws an exception with the following backtrace:

0   CoreFoundation                      0x00007fff96c2efc6 __exceptionPreprocess + 198
1   libobjc.A.dylib                     0x00007fff9153cd5e objc_exception_throw + 43
2   CoreFoundation                      0x00007fff96cbb2ae -[NSObject doesNotRecognizeSelector:] + 190
3   CoreFoundation                      0x00007fff96c1be73 ___forwarding___ + 371
4   CoreFoundation                      0x00007fff96c1bc88 _CF_forwarding_prep_0 + 232
5   CoreImage                           0x00007fff8f03c38d -[CIAreaAverage outputImage] + 52
6   Foundation                          0x00007fff991d8384 _NSGetUsingKeyValueGetter + 62
7   Foundation                          0x00007fff991d8339 -[NSObject(NSKeyValueCoding) valueForKey:] + 392

Now, the Core Image Filter Reference clearly states that CIAreaAverage "Returns a single-pixel image that contains the average color for the region of interest." Even more baffling, the filter's attributes look perfectly reasonable when I examine them in the debugger (before attempting the valueForKey: call):

(lldb) po [filter attributes]
(id) $3 = 0x00007fb3e3ef0e00 {
    CIAttributeDescription = "Calculates the average color for the specified area in an image, returning the result in a pixel.";
    CIAttributeFilterCategories =     (
        CICategoryReduction,
        CICategoryVideo,
        CICategoryStillImage,
        CICategoryBuiltIn
    );
    CIAttributeFilterDisplayName = "Area Average";
    CIAttributeFilterName = CIAreaAverage;
    CIAttributeReferenceDocumentation = "http://developer.apple.com/cgi-bin/apple_ref.cgi?apple_ref=//apple_ref/doc/filter/ci/CIAreaAverage";
    inputExtent =     {
        CIAttributeClass = CIVector;
        CIAttributeDefault = "[0 0 640 80]";
        CIAttributeDescription = "A rectangle that specifies the subregion of the image that you want to process.";
        CIAttributeDisplayName = Extent;
        CIAttributeType = CIAttributeTypeRectangle;
        CIUIParameterSet = CIUISetBasic;
    };
    inputImage =     {
        CIAttributeClass = CIImage;
        CIAttributeDescription = "The image to process.";
        CIAttributeDisplayName = Image;
        CIUIParameterSet = CIUISetBasic;
    };
    outputImage =     {
        CIAttributeClass = CIImage;
    };
}

There's outputImage right there, listed with class CIImage!

So, what am I doing wrong? All the docs and tutorials I have seen indicate that -valueForKey: is the correct way to access attributes, including outputImage.


Solution

I believe the extent parameter is the culprit (however strange that seems). When I pass the extent as a CIVector * instead, it works:

NSURL *imageURL = [NSURL fileURLWithPath:@"/Users/david/Desktop/video.png"];
CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL];
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"];
[filter setValue:inputImage forKey:kCIInputImageKey];
CGRect inputExtent = [inputImage extent];
CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x
                                       Y:inputExtent.origin.y
                                       Z:inputExtent.size.width
                                       W:inputExtent.size.height];
[filter setValue:extent forKey:kCIInputExtentKey];
CIImage *outputImage = [filter valueForKey:@"outputImage"];

[inputImage extent] returns a CGRect, but the filter wants its inputExtent as a CIVector. Fetching the extent through key-value coding, as the question does, hands the filter the rectangle boxed in an NSValue instead, which is presumably what trips the unrecognized-selector exception inside -[CIAreaAverage outputImage].
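
For what it's worth, newer SDKs also offer a +[CIVector vectorWithCGRect:] convenience that collapses the conversion to one line (on 10.7, vectorWithX:Y:Z:W: remains the safe choice). Below is a minimal sketch of the whole round trip, including reading the averaged pixel back out on OS X; the bitmap-context construction is just one convenient way to obtain a CIContext here, so treat the details as illustrative rather than canonical:

CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL];
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:[CIVector vectorWithCGRect:[inputImage extent]]
          forKey:kCIInputExtentKey];
CIImage *outputImage = [filter valueForKey:kCIOutputImageKey];

// CIAreaAverage yields a 1x1 image; render it into a 4-byte RGBA buffer.
uint8_t pixel[4] = {0};
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef cg = CGBitmapContextCreate(NULL, 1, 1, 8, 4, colorSpace,
                                        (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
CIContext *context = [CIContext contextWithCGContext:cg options:nil];
[context render:outputImage
       toBitmap:pixel
       rowBytes:4
         bounds:[outputImage extent] // [0 0 1 1] for CIAreaAverage
         format:kCIFormatRGBA8
     colorSpace:colorSpace];
NSLog(@"average RGBA = %d %d %d %d", pixel[0], pixel[1], pixel[2], pixel[3]);
CGContextRelease(cg);
CGColorSpaceRelease(colorSpace);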

Other tips

Here's how I made CIAreaAverage work in an iOS app:

CGRect inputExtent = [self.inputImage extent];
CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x
                                       Y:inputExtent.origin.y
                                       Z:inputExtent.size.width
                                       W:inputExtent.size.height];
CIImage* inputAverage = [CIFilter filterWithName:@"CIAreaAverage" keysAndValues:@"inputImage", self.inputImage, @"inputExtent", extent, nil].outputImage;

//CIImage* inputAverage = [self.inputImage imageByApplyingFilter:@"CIAreaAverage" withInputParameters:@{@"inputExtent" : extent}];
EAGLContext *myEAGLContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] };
CIContext *myContext = [CIContext contextWithEAGLContext:myEAGLContext options:options];

size_t rowBytes = 32; // one RGBA pixel only needs 4 bytes; 32 leaves alignment headroom
uint8_t byteBuffer[rowBytes]; // buffer to render the 1x1 result into

[myContext render:inputAverage toBitmap:byteBuffer rowBytes:rowBytes bounds:[inputAverage extent] format:kCIFormatRGBA8 colorSpace:nil];

const uint8_t* pixel = &byteBuffer[0];
float red   = pixel[0] / 255.0;
float green = pixel[1] / 255.0;
float blue  = pixel[2] / 255.0;
NSLog(@"%f, %f, %f\n", red, green, blue);


return inputAverage;

The output will look something like this:

2015-05-23 15:58:20.935 CIFunHouse[2400:489913] 0.752941, 0.858824, 0.890196
2015-05-23 15:58:20.981 CIFunHouse[2400:489913] 0.752941, 0.858824, 0.890196
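
Incidentally, the commented-out line above hints at a shorter path: on iOS 8 / OS X 10.10 and later, -[CIImage imageByApplyingFilter:withInputParameters:] lets you skip the explicit CIFilter object entirely. A minimal sketch (note that the receiver is used as the input image automatically, so only the extent needs to be supplied):

CIVector *extent = [CIVector vectorWithCGRect:[self.inputImage extent]];
CIImage *inputAverage =
    [self.inputImage imageByApplyingFilter:@"CIAreaAverage"
                       withInputParameters:@{kCIInputExtentKey : extent}];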