Question

I am working on a view controller which holds two UIImageViews. The bottom image view holds a photo (taken or selected by the user). The second image view is positioned on top of this, taking up the whole screen too.

The second image view contains a pointer (the green dot in the example below). It is used as a movable view which can be positioned on the screen. Its use is to mark a position/point on the photo behind it.

In portrait this works fine; the bottom image view's content mode is set to Aspect Fill, so the taken photo takes up the whole screen.

In landscape orientation this doesn't work as well. I can explain a little better with the badly drawn image below.

[image: sketch of the portrait and landscape layouts, with the green dot marking the selected point]

As an example, if the user selects a point on the portrait view (shown by the green dot) the approximate location is 220,380.

However, in landscape that same screen position would not line up with the same position on the photo behind; the point would translate to more like 280,300.

So the question is... when the VC is in landscape orientation, how do I determine the height and width of the image (as displayed within the bottom image view) so I can work out the same point?

OR is there another method/approach for accomplishing this?

------- EDIT -------

I have created a test app with just this functionality in it, with the same setup as detailed above. I added logs to the view and changed the orientation to landscape. Here are the logs I am using:

NSLog(@"bottom image view width: %f",self.photoImageView.frame.size.width);
NSLog(@"bottom image view height: %f",self.photoImageView.frame.size.height);
NSLog(@"bottom image view frame bounds X: %f",self.photoImageView.frame.origin.x);
NSLog(@"bottom image view frame bounds Y: %f",self.photoImageView.frame.origin.y);
NSLog(@"bottom image view bounds X: %f",self.photoImageView.bounds.origin.x);
NSLog(@"bottom image view bounds Y: %f",self.photoImageView.bounds.origin.y);
NSLog(@"---------BOTTOM IMAGE VIEW IMAGE DETAILS---------");
NSLog(@"bottom image view image width: %f",self.photoImageView.image.size.width);
NSLog(@"bottom image view image height: %f",self.photoImageView.image.size.height);
NSLog(@"bottom image view image frame bounds X: %f",self.photoImageView.image.accessibilityFrame.origin.x);
NSLog(@"bottom image view image frame bounds Y: %f",self.photoImageView.image.accessibilityFrame.origin.y);

These are the results:

2013-07-30 14:58:23.013 SelectPhotoViewTest[3414:c07] ---------VIEW DETAILS---------
2013-07-30 14:58:23.014 SelectPhotoViewTest[3414:c07] ---------BOTTOM IMAGE VIEW DETAILS---------
2013-07-30 14:58:23.015 SelectPhotoViewTest[3414:c07] bottom image view width: 480.000000
2013-07-30 14:58:23.016 SelectPhotoViewTest[3414:c07] bottom image view height: 268.000000
2013-07-30 14:58:23.016 SelectPhotoViewTest[3414:c07] bottom image view frame bounds X: 0.000000
2013-07-30 14:58:23.017 SelectPhotoViewTest[3414:c07] bottom image view frame bounds Y: 0.000000
2013-07-30 14:58:23.018 SelectPhotoViewTest[3414:c07] bottom image view bounds X: 0.000000
2013-07-30 14:58:23.018 SelectPhotoViewTest[3414:c07] bottom image view bounds Y: 0.000000
2013-07-30 14:58:23.019 SelectPhotoViewTest[3414:c07] ---------BOTTOM IMAGE VIEW IMAGE DETAILS---------
2013-07-30 14:58:23.019 SelectPhotoViewTest[3414:c07] bottom image view image width: 258.000000
2013-07-30 14:58:23.019 SelectPhotoViewTest[3414:c07] bottom image view image height: 480.000000
2013-07-30 14:58:23.020 SelectPhotoViewTest[3414:c07] bottom image view image frame bounds X: 0.000000
2013-07-30 14:58:23.021 SelectPhotoViewTest[3414:c07] bottom image view image frame bounds Y: 0.000000

How do I get from these results to determine where the center point is in relation to the bottom image coordinates?


Solution

You actually have to translate the point into the image's coordinate system in both orientations. In the example you gave, it works in portrait only because the image has the same aspect ratio as the screen. If you had a square image, the point would have no 1:1 relation to the image and could just as easily land outside it.

I can think of two methods to accomplish your goal off the top of my head:

1) Instead of having the bottom image view take up the whole screen with Aspect Fill, resize it manually to take up either the whole width or height of the screen, depending on the image. I.e., if you have a 1000x1000 image, you want the image view to be 320x320 in portrait (iPhone 3) or 300x300 in landscape. Then make the second image view a subview of the first; its location will now be relative to the bottom image view's coordinate system.

2) Use some simple math to translate the point on screen into the image's coordinate system. You know the size and aspect ratio of the bottom image view, and you know the same about the image.

Let's say the image is 640x920. In landscape, the bottom image view will be 480x300.

1) Scale the image to the size it will be in the image view (209x300)

2) Since the image will be centered, it will start around 136 points from the left

3) A point of (280, 300) on the screen would therefore translate to (144, 280) relative to the smaller image (subtracting 20 points for the status bar), which translates to roughly (441, 859) in the full image's coordinate system.

I hope this points you in the right direction.

EDIT:

OK, working off your example logs:

[By the way, you can print a CGRect more easily like this: NSLog(@"%@", NSStringFromCGRect(self.photoImageView.frame));]

1) Scale the image dimensions so the image fits into the image view:

CGFloat ratio  = photoImageView.frame.size.height / photoImageView.image.size.height;
CGFloat height = photoImageView.image.size.height * ratio;
CGFloat width  = photoImageView.image.size.width * ratio;

2) Calculate the frame of the image inside the image view:

CGFloat x = (photoImageView.frame.size.width - width) / 2;
CGFloat y = 0;
CGRect frame = CGRectMake(x, y, width, height);

3) Assuming you use [touch locationInView:photoImageView] to get the location point, you can check whether the touch is within the image frame with:

CGRectContainsPoint(frame, location)

EDIT: here is the actual code I used. It is slightly different from the above, as it handles the orientation of the image view as well.

if (self.photoImageView.frame.size.height < self.photoImageView.frame.size.width) {
    // The device and image view are landscape
    if (self.pushedPhoto.size.height > self.pushedPhoto.size.width) {
        // The pushed photo is portrait
        self.photoImageViewRatio = self.photoImageView.frame.size.height / self.pushedPhoto.size.height;
    } else {
        // The pushed photo is landscape
        self.photoImageViewRatio = self.photoImageView.frame.size.width / self.pushedPhoto.size.width;
    }
} else {
    // The device and image view are portrait
    if (self.pushedPhoto.size.height > self.pushedPhoto.size.width) {
        // The pushed photo is portrait
        self.photoImageViewRatio = self.photoImageView.frame.size.width / self.pushedPhoto.size.width;
    } else {
        // The pushed photo is landscape
        self.photoImageViewRatio = self.photoImageView.frame.size.height / self.pushedPhoto.size.height;
    }
}

self.valueHeight = self.pushedPhoto.size.height * self.photoImageViewRatio;
self.valueWidth  = self.pushedPhoto.size.width  * self.photoImageViewRatio;

float x = (self.photoImageView.frame.size.width  - self.valueWidth)  / 2;
float y = (self.photoImageView.frame.size.height - self.valueHeight) / 2;
License: CC-BY-SA with attribution. Not affiliated with Stack Overflow.