I am trying to add face detection to my app, and the code I added gives me back a CGRect that has nothing to do with the face.

Here's the code:

CIImage *cIImage = [CIImage imageWithCGImage:self.imageView.image.CGImage];
CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh
                                                                                  forKey:CIDetectorAccuracy]];
NSArray *features = [faceDetector featuresInImage:cIImage];
for (CIFaceFeature *faceObject in features)
{
    FaceLocation.x = faceObject.bounds.origin.x;
    FaceLocation.y = faceObject.bounds.origin.y;
} // Here the face location is far off from the actual face

But this code gives me a location far away from the actual face. What am I doing wrong here?


Solution

The problem comes from the mismatch between the UIImage's orientation and CIDetectorImageOrientation. From the iOS docs:

CIDetectorImageOrientation

A key used to specify the display orientation of the image whose features you want to detect. This key is an NSNumber object with the same value as defined by the TIFF and EXIF specifications; values can range from 1 through 8. The value specifies where the origin (0,0) of the image is located. If not present, the default value is 1, which means the origin of the image is top, left. For details on the image origin specified by each value, see kCGImagePropertyOrientation.

Available in iOS 5.0 and later.

Declared in CIDetector.h.

You have to specify the CIDetectorImageOrientation. Here is what I did:

int exifOrientation = 1; // default: origin at top-left (UIImageOrientationUp)
switch (self.image.imageOrientation) {
    case UIImageOrientationUp:
        exifOrientation = 1;
        break;
    case UIImageOrientationDown:
        exifOrientation = 3;
        break;
    case UIImageOrientationLeft:
        exifOrientation = 8;
        break;
    case UIImageOrientationRight:
        exifOrientation = 6;
        break;
    case UIImageOrientationUpMirrored:
        exifOrientation = 2;
        break;
    case UIImageOrientationDownMirrored:
        exifOrientation = 4;
        break;
    case UIImageOrientationLeftMirrored:
        exifOrientation = 5;
        break;
    case UIImageOrientationRightMirrored:
        exifOrientation = 7;
        break;
    default:
        break;
}

NSDictionary *detectorOptions = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];

NSArray *features = [faceDetector featuresInImage:[CIImage imageWithCGImage:self.image.CGImage]
                                          options:@{CIDetectorImageOrientation:[NSNumber numberWithInt:exifOrientation]}];

After a feature is detected, you also need to map its coordinates into the UIImageView's coordinate system; use my gist here: https://gist.github.com/laoyang/5747004 to convert the coordinate system.
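
For reference, here is a minimal sketch of that conversion (the helper name is mine; it assumes the image orientation is UIImageOrientationUp and that the image view draws the image to exactly fill its bounds, while the gist covers the general case):

- (CGRect)viewRectForFaceBounds:(CGRect)faceBounds
                      imageSize:(CGSize)imageSize
                       viewSize:(CGSize)viewSize
{
    // Core Image measures from the bottom-left corner, UIKit from the top-left,
    // so flip the Y axis first.
    CGRect flipped = faceBounds;
    flipped.origin.y = imageSize.height - faceBounds.origin.y - faceBounds.size.height;

    // Then scale from image pixels to view points.
    CGFloat scaleX = viewSize.width  / imageSize.width;
    CGFloat scaleY = viewSize.height / imageSize.height;
    return CGRectMake(flipped.origin.x * scaleX,
                      flipped.origin.y * scaleY,
                      flipped.size.width  * scaleX,
                      flipped.size.height * scaleY);
}

You would call it per feature with something like [self viewRectForFaceBounds:faceObject.bounds imageSize:ciImage.extent.size viewSize:self.imageView.bounds.size].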

Other tips

iOS 10 and Swift 3

If you are not interested in face features, you can check Apple's sample code instead; it detects faces and works well with orientation.

You can select the face metadata type to make the camera track faces and show a yellow box over each face (a rough sketch of that setup follows below).

It has better performance than the approach in this example.
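
For context, here is a minimal Objective-C sketch of that metadata-based tracking (the tip mentions Swift 3, but the same AVFoundation API is available from Objective-C). It assumes the surrounding view controller already owns a configured AVCaptureSession (self.session) and an AVCaptureVideoPreviewLayer (self.previewLayer), and adopts AVCaptureMetadataOutputObjectsDelegate:

#import <AVFoundation/AVFoundation.h>

- (void)startFaceMetadataDetection
{
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    if (![self.session canAddOutput:output]) {
        return;
    }
    [self.session addOutput:output];

    // Face metadata only becomes available after the output joins the session.
    if ([output.availableMetadataObjectTypes containsObject:AVMetadataObjectTypeFace]) {
        output.metadataObjectTypes = @[ AVMetadataObjectTypeFace ];
        [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    }
}

// Called whenever the camera detects (or loses) faces in the live feed.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *object in metadataObjects) {
        if (![object.type isEqualToString:AVMetadataObjectTypeFace]) {
            continue;
        }
        // Convert from the capture device's coordinate space into the
        // preview layer's coordinate space, then draw the yellow box there.
        AVMetadataObject *face =
            [self.previewLayer transformedMetadataObjectForMetadataObject:object];
        CGRect faceRectInPreview = face.bounds;
        // ... position a yellow-bordered CALayer/UIView at faceRectInPreview ...
    }
}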
