Question

I am having a very hard time finding any documentation online that clearly explains how to implement Core Image's CIPerspectiveTransform filter properly. In particular, when setting CIVector values for inputTopLeft, inputTopRight, inputBottomRight, and inputBottomLeft, what are these vectors doing to the image? (I.e., what is the math behind how these vectors warp my image?)

Currently this is the code I am using. It doesn't crash, but it doesn't show an image:

CIImage *myCIImage = [[CIImage alloc] initWithImage:self.image];
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CIPerspectiveTransform"
                             keysAndValues:@"inputImage", myCIImage,
                                           @"inputTopLeft", [CIVector vectorWithX:118 Y:484],
                                           @"inputTopRight", [CIVector vectorWithX:646 Y:507],
                                           @"inputBottomRight", [CIVector vectorWithX:548 Y:140],
                                           @"inputBottomLeft", [CIVector vectorWithX:155 Y:153],
                                           nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *transformedImage = [UIImage imageWithCGImage:cgimg];
[self setImage:transformedImage];
CGImageRelease(cgimg);

Other things to note that might be important:

  • My UIImageView (75pts x 115 pts) is already initialized via awakeFromNib and already has an image (151px x 235px) associated with it.

  • The above code is being implemented in the UIImageView's - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event function. The hope is that I'll be able to adjust the perspective of the image based on screen coordinates so it looks like the image is moving in 3D space.

  • This code is for an iPhone app.

Again, the question I think I am asking is what the various parameter vectors do, but I may be asking the wrong question.

The following post is very similar but is asking why his image disappears rather than how to use CIVectors for CIPerspectiveTransform. It also has received very little traction, perhaps because it is too general: How I can use CIPerspectiveTransform filter


Solution

As I commented on the linked question, CIPerspectiveTransform is not available in the iOS implementation of Core Image as of iOS 5.1. That's why neither you nor the other asker saw any image as a result: your CIFilter was most likely nil, so the rest of the pipeline silently produced nothing.

If you just want to implement a form of perspective on an image, there are two different fast ways of doing this on iOS, as I describe in this answer. One is to simply use the right kind of CATransform3D on the layer of a UIImageView, but this is only useful for display, not for image adjustment.
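As a rough sketch of that first approach (names here are illustrative, and the `m34` value is an assumption you should tune for your layout):

```swift
import UIKit

// Apply a 3-D perspective tilt to an image view's layer.
// This only changes how the view is drawn on screen; the
// underlying image data is untouched.
func applyPerspectiveTilt(to imageView: UIImageView, angle: CGFloat) {
    var transform = CATransform3DIdentity
    // m34 introduces the perspective divide; -1/500 is a commonly
    // used starting value - smaller magnitudes flatten the effect.
    transform.m34 = -1.0 / 500.0
    // Rotate around the y-axis so the image appears tilted in 3-D space.
    transform = CATransform3DRotate(transform, angle, 0, 1, 0)
    imageView.layer.transform = transform
}
```

Since you are already in `touchesMoved:withEvent:`, you could derive `angle` from the touch's horizontal offset to get the "moving in 3D space" feel.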

The second way is to manipulate the image using an appropriate 3-D transformation matrix in OpenGL ES. As I indicate in the above-linked answer, I have an open source framework that wraps all this, and the FilterShowcase sample there has an example of applying a perspective to incoming video. You can easily swap out the video input with your image, and grab an image from that after the perspective effect is applied.

OTHER TIPS

Ooooold post but I bet someone will still look for this kind of filter:

The following function applies the filter to an image and returns a new image warped into a perspective that mimics a reflection. (The image below is just an illustration, not the result of an actual test.)

[Image: example of the mirror effect]

This is only an example, so you will need to make some adjustments. The goal is to create an image that looks like a reflection of the "main" one.

So I added two image views to my view as in the image. Added constraints, etc.

In the function below, the coordinates are hardcoded so that the returned image exactly matches its container, the bottom UIImageView.
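For context on what those four vectors actually do (my summary, not part of the original answer): each one is the destination point, in Core Image's bottom-left-origin working space, of the corresponding corner of the input image's extent. The filter then computes the projective (homography) transform that maps the rectangular extent onto that quadrilateral. A quick sanity check, assuming a 100x100 image:

```swift
import CoreImage

// Passing the extent's own corners back in should leave the image unchanged.
let image = CIImage(color: CIColor(red: 1, green: 0, blue: 0))
    .cropped(to: CGRect(x: 0, y: 0, width: 100, height: 100))

let filter = CIFilter(name: "CIPerspectiveTransform")!
filter.setValue(image, forKey: kCIInputImageKey)
filter.setValue(CIVector(x: 0,   y: 100), forKey: "inputTopLeft")
filter.setValue(CIVector(x: 100, y: 100), forKey: "inputTopRight")
filter.setValue(CIVector(x: 100, y: 0),   forKey: "inputBottomRight")
filter.setValue(CIVector(x: 0,   y: 0),   forKey: "inputBottomLeft")
// Moving any one corner inward instead would foreshorten that edge,
// as if the image plane were rotated away from the viewer.
```

This is also why the answer below swaps top and bottom: sending each "top" corner to a "bottom" destination flips the image vertically, which is exactly what a reflection needs.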


func applyPerspectiveTransform(imagen: UIImage) -> UIImage {
    let context = CIContext(options: nil)

    // Destination corners in view coordinates. Core Image uses a
    // bottom-left origin, so "top" here is the visually lower edge.
    let leftTop     = CGPoint(x: self.ImgView.frame.origin.x, y: self.ImgReflection.frame.origin.y + self.ImgReflection.frame.size.height)
    let leftBottom  = CGPoint(x: self.ImgReflection.frame.origin.x, y: self.ImgView.frame.origin.y + self.ImgView.frame.size.height)
    let rightTop    = CGPoint(x: self.ImgView.frame.origin.x + self.ImgView.frame.size.width, y: self.ImgReflection.frame.origin.y + self.ImgReflection.frame.size.height)
    let rightBottom = CGPoint(x: self.ImgReflection.frame.origin.x + self.ImgReflection.frame.size.width, y: self.ImgView.frame.origin.y + self.ImgView.frame.size.height)

    let lT = CIVector(x: leftTop.x, y: leftTop.y)
    let lB = CIVector(x: leftBottom.x, y: leftBottom.y)
    let rT = CIVector(x: rightTop.x, y: rightTop.y)
    let rB = CIVector(x: rightBottom.x, y: rightBottom.y)

    let aCIImage = CIImage(image: imagen)!
    let perspectiveTransform = CIFilter(name: "CIPerspectiveTransform")!

    // Top and bottom are intentionally swapped so the output is
    // flipped vertically, producing the reflection.
    perspectiveTransform.setValue(lB, forKey: "inputTopLeft")
    perspectiveTransform.setValue(rB, forKey: "inputTopRight")
    perspectiveTransform.setValue(rT, forKey: "inputBottomRight")
    perspectiveTransform.setValue(lT, forKey: "inputBottomLeft")
    perspectiveTransform.setValue(aCIImage, forKey: kCIInputImageKey)

    let outputImage = perspectiveTransform.outputImage!
    let cgimg = context.createCGImage(outputImage, from: outputImage.extent)!
    return UIImage(cgImage: cgimg)
}

For this particular effect (mirror) you might want to add some more effects, like a gradient, a change of alpha, etc.

Happy coding! Hope it helps.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow