The problem comes from the mismatch between UIImage's imageOrientation and the EXIF-style orientation that CIDetector expects via CIDetectorImageOrientation. From the iOS docs:
CIDetectorImageOrientation
A key used to specify the display orientation of the image whose features you want to detect. This key is an NSNumber object with the same value as defined by the TIFF and EXIF specifications; values can range from 1 through 8. The value specifies where the origin (0,0) of the image is located. If not present, the default value is 1, which means the origin of the image is top, left. For details on the image origin specified by each value, see kCGImagePropertyOrientation.
Available in iOS 5.0 and later.
Declared in CIDetector.h.
You have to specify the CIDetectorImageOrientation option explicitly. Here is what I did:
// UIImageOrientation does not use the same values as EXIF/TIFF,
// so map it to the 1-8 EXIF orientation that CIDetector expects.
int exifOrientation;
switch (self.image.imageOrientation) {
    case UIImageOrientationUp:
        exifOrientation = 1;
        break;
    case UIImageOrientationDown:
        exifOrientation = 3;
        break;
    case UIImageOrientationLeft:
        exifOrientation = 8;
        break;
    case UIImageOrientationRight:
        exifOrientation = 6;
        break;
    case UIImageOrientationUpMirrored:
        exifOrientation = 2;
        break;
    case UIImageOrientationDownMirrored:
        exifOrientation = 4;
        break;
    case UIImageOrientationLeftMirrored:
        exifOrientation = 5;
        break;
    case UIImageOrientationRightMirrored:
        exifOrientation = 7;
        break;
    default:
        exifOrientation = 1; // treat any unknown orientation as "up"
        break;
}
NSDictionary *detectorOptions = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:detectorOptions];
NSArray *features = [faceDetector featuresInImage:[CIImage imageWithCGImage:self.image.CGImage]
                                          options:@{ CIDetectorImageOrientation : @(exifOrientation) }];
After the features are detected, you also need to map their coordinates back into the UIImageView's coordinate system. You can use my gist here: https://gist.github.com/laoyang/5747004 to convert between the two coordinate systems.
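For reference, the core of that conversion is a vertical flip, because Core Image uses a bottom-left origin while UIKit uses a top-left one. This is only a minimal sketch, not the gist itself: it assumes `features` and `self.image` from the code above, and that the underlying CGImage is already in display orientation (UIImageOrientationUp) and drawn at its natural size; rotated or scaled images need the extra handling the gist is for.

// Flip detected face rects from Core Image (bottom-left origin)
// into UIKit (top-left origin) coordinates.
CGSize imageSize = self.image.size;
CGAffineTransform flip = CGAffineTransformMakeScale(1.0, -1.0);
flip = CGAffineTransformTranslate(flip, 0, -imageSize.height);

for (CIFaceFeature *face in features) {
    // face.bounds is in the CIImage's coordinate space; apply the flip
    // to get a rect you can use directly in a UIKit view of the same size.
    CGRect faceRectInView = CGRectApplyAffineTransform(face.bounds, flip);
    NSLog(@"Face bounds in UIKit coordinates: %@", NSStringFromCGRect(faceRectInView));
}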