Question

As an exercise with accessibility and a personal challenge to myself I decided that I'd like to write a relatively simple app.

The app shows an MKMapView of the United States; when you tap anywhere on it, it uses an MKReverseGeocoder to show the locality, state, and country where you tapped. This works, although I have to hijack the touch events by adding a WildcardGestureRecognizer to the MKMapView. With VoiceOver turned off, everything works great.
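For context, here is a minimal sketch of the approach, loosely based on the linked answer. The class name `WildcardGestureRecognizer` and its `touchesBeganCallback` block come from that answer; the reverse-geocoding call is only a placeholder:

```objc
// Attach a catch-all recognizer so every touch on the map is observed
// without cancelling MKMapView's own gesture handling.
WildcardGestureRecognizer *tapInterceptor = [[WildcardGestureRecognizer alloc] init];
tapInterceptor.touchesBeganCallback = ^(NSSet *touches, UIEvent *event) {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:mapView];
    // Convert the screen point to a map coordinate.
    CLLocationCoordinate2D coord = [mapView convertPoint:point
                                    toCoordinateFromView:mapView];
    // ...kick off the MKReverseGeocoder for `coord` here...
};
[mapView addGestureRecognizer:tapInterceptor];
```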

When I turn VoiceOver on and tap on the map, it says "map". If I double-tap, it makes a little clicky noise indicating that the element has been activated. To be honest, I'm at a loss for how to intercept these events. I know the general solution is to put a transparent view above the whole screen and pass touches down, but will that work with VoiceOver?

For the record, the WildcardGestureRecognizer I'm using is found here: How to intercept touches events on a MKMapView or UIWebView objects?


Solution

The problem is that when you turn on VoiceOver, touch events are blocked by the system. To prove it, put a trace in your touchesBegan function. It should fire fine until you turn on VoiceOver.
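A quick way to verify this (a hypothetical trace inside the recognizer subclass, not code from the question):

```objc
// Log each incoming touch; with VoiceOver on, this should stop firing,
// confirming the system is intercepting touches before they reach the view.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touchesBegan with %lu touch(es)", (unsigned long)[touches count]);
    [super touchesBegan:touches withEvent:event];
}
```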

The little clicky sound you hear when you double-tap is VoiceOver's override gesture. VoiceOver has its own set of gestures, but you can override them and pass touches through with a double-tap-and-hold.

For example, with VoiceOver on, swiping down does not scroll a page. But if you double-tap-and-hold, wait for the clicky sound, and then swipe down, the page will scroll.

OTHER TIPS

iOS 5.0 added this capability - just set the UIAccessibilityTraitAllowsDirectInteraction trait on the view:

mapView.accessibilityTraits |= UIAccessibilityTraitAllowsDirectInteraction;

(If this does not work, you may instead need to subclass the map view and override its accessibilityTraits getter.)
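A sketch of that fallback subclass (the class name `DirectInteractionMapView` is made up for illustration):

```objc
// Subclass MKMapView so the trait is reported even if setting the
// property directly on the instance is ignored.
@interface DirectInteractionMapView : MKMapView
@end

@implementation DirectInteractionMapView
- (UIAccessibilityTraits)accessibilityTraits {
    return [super accessibilityTraits] | UIAccessibilityTraitAllowsDirectInteraction;
}
@end
```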

The first touch on that element will still make the "click" sound and announce the name of the view etc., but all subsequent touches will be direct interactions - i.e. the touch events will be passed to the map view.

An example of this in one of Apple's own iOS apps is GarageBand: the piano view has this trait set so that a blind user can play the keys on the piano keyboard by touching them directly.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow