Question

I am making a "paint"-like app that allows the user to do freehand drawings on an image. It's based on the SimpleDrawing app by Nathanial Woolls.

The user should also be able to pan and zoom. So I have a button that turns zoom on/off. When zoom is on, double touch on the screen should zoom and single touch drag should pan. When it's off, the active drawing tool acts on the image.

This is my view hierarchy:

- ViewController.view
 |- top toolbar
 |- bottom toolbar
 |- scroll view
   |- drawingView
     |- N other views depending on elements user adds
     |- ...

The scroll view handles pan and zoom like this:

-(void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(CGFloat)scale
{
     [drawingView setNeedsDisplay];
     [scrollView setNeedsDisplay];
}

-(UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
     return drawingView;
}

To turn the zoom on/off, I set its min/max zoom scale properties (both to 1 to turn off, and to 0.5/3 to turn on) along with the userInteractionEnabled property.
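For reference, the toggle looks roughly like this (a minimal sketch; the toggleZoom: action and zoomEnabled flag are illustrative names, not the actual project code):

// Rough sketch of the zoom on/off toggle; toggleZoom: and zoomEnabled are illustrative.
- (IBAction)toggleZoom:(id)sender
{
    self.zoomEnabled = !self.zoomEnabled;

    if (self.zoomEnabled) {
        self.scrollView.minimumZoomScale = 0.5;
        self.scrollView.maximumZoomScale = 3.0;
        self.scrollView.userInteractionEnabled = YES;
    } else {
        self.scrollView.minimumZoomScale = 1.0;
        self.scrollView.maximumZoomScale = 1.0;
        self.scrollView.userInteractionEnabled = NO; // toggling this is where the trouble starts
    }
}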

The responder responsible for handling touch events on the drawing view and activating the drawing tools is the view controller.
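Roughly, the view controller's touch handling looks like this (a simplified sketch; applyCurrentToolAtPoint: is a placeholder for whatever the active tool does):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Touches that bubble up the responder chain to the view controller
    // drive the active drawing tool.
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:drawingView];
    [self applyCurrentToolAtPoint:point]; // placeholder for the real tool logic
    [drawingView setNeedsDisplay];
}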

Here's my problem: when zoom is off, drag gestures should pass through to the "leaf node" views. The drawingView has subviews such as UITextFields, which the user can add to the drawing to type text, and UIImageViews used for inserting preset symbols; the user should be able to drag these elements around while zoom is off. If I set the scrollView's userInteractionEnabled property to NO, all drag gestures are handled by the view controller (and thus activate the current drawing tool) instead of reaching those subviews. If I leave it set to YES while zoom is off, the scroll view blocks the view controller from doing anything, and any drag on the screen just pans the view.
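For context, these draggable elements move themselves in touchesMoved:withEvent:, along these lines (a simplified sketch, not the actual project code):

// Inside a draggable subview (text field or symbol image view):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    self.center = CGPointMake(self.center.x + (current.x - previous.x),
                              self.center.y + (current.y - previous.y));
}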

My Understanding of Hit Tests and Responder Chain

I've been reading Apple's documentation on event handling to understand the process. My understanding from the docs is that when a touch occurs, the system "walks down" the view hierarchy looking for the "leaf node" view that was touched (the hit-test process); then, starting from that view, it "walks up" the responder chain to find a responder that can handle the event. My understanding is also that if, at some point during hit testing, the system finds a view with userInteractionEnabled set to NO, it stops there and starts walking up the responder chain from that view. Is this correct? If so, why is it that when the scrollView's userInteractionEnabled is YES, the scrollView handles drag events rather than its subviews (given that all the subviews of drawingView implement touchesMoved:withEvent:)?
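One way to inspect this is to override hitTest:withEvent: on drawingView and log which view the system resolves to (a debugging sketch, not part of the app's logic):

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Log which view wins the hit test for a given touch point.
    UIView *hit = [super hitTest:point withEvent:event];
    NSLog(@"hit test resolved to: %@", hit);
    return hit;
}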


Solution

When you disable zooming you should also disable scrolling; that is what stops the panning. There is no need to set userInteractionEnabled to NO. Instead, set scrollView.scrollEnabled = NO. With scrolling disabled the scroll view no longer intercepts drags with its pan gesture recognizer, but it still participates in hit testing, so touches keep reaching the drawing view and its subviews.
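A minimal sketch of the toggle with that change (method and property names are illustrative, not from the question's code):

- (IBAction)toggleZoom:(id)sender
{
    self.zoomEnabled = !self.zoomEnabled;

    // Leave userInteractionEnabled alone so touches still reach the
    // drawing view's subviews; only scrolling and zooming are toggled.
    self.scrollView.scrollEnabled = self.zoomEnabled;
    self.scrollView.minimumZoomScale = self.zoomEnabled ? 0.5 : 1.0;
    self.scrollView.maximumZoomScale = self.zoomEnabled ? 3.0 : 1.0;
}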
