Question

I am writing a game for the iPhone/iPod Touch (using Cocos2d), and I have noticed something strange...

I need to detect the parts of the screen which the player touches in order to perform certain actions.

For example:

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchLocation = [touch locationInView:[touch view]];
    // DEBUG
    CCLOG(@"Button pressed: X Location: %f", touchLocation.x);
    CCLOG(@"Button pressed: Y Location: %f", touchLocation.y);

... And when touchLocation.x and touchLocation.y are both within a certain range, I do something.
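For example, the check is something like this (the rectangle values here are just placeholders; the real ones depend on the button):

    // Placeholder values: the actual rectangle varies per button
    CGRect buttonRect = CGRectMake(50.0f, 50.0f, 100.0f, 40.0f);
    if (CGRectContainsPoint(buttonRect, touchLocation)) {
        // perform the action for this button
    }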

It worked fine on the simulator. When I compiled it for my test device using the armv7 architecture (optimized), it also worked. But when I compiled it with both the armv6 and armv7 architectures enabled (which is what the App Store requires), the code no longer worked. The CCLOGs showed me that the X,Y coordinate system used when I compiled for armv6/armv7 was DIFFERENT from the one used when I compiled only for armv7 (optimized).

How can I get around this problem? Is the only way to hard-code the coordinate system that the armv6/armv7 build produces on the device (which then doesn't work on the simulator)? There is no way to submit an iPhone app to the App Store compiled only for armv7 (optimized). :(


Solution

For finding touches on the screen in cocos2d, try this code. It's working correctly in my game app:

- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *myTouch = [touches anyObject];
    // Location in UIKit view coordinates (origin at the top-left)
    CGPoint location = [myTouch locationInView:[myTouch view]];
    // Convert to cocos2d's OpenGL coordinates (origin at the bottom-left)
    location = [[CCDirector sharedDirector] convertToGL:location];
}
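The convertToGL: call translates UIKit's view coordinates (origin at the top-left, affected by device orientation) into cocos2d's OpenGL coordinates (origin at the bottom-left), so the converted point is consistent no matter how the build behaves. With the converted location you can hit-test against a node. A minimal sketch, assuming a hypothetical CCSprite ivar named button whose parent sits at the screen origin (so its boundingBox is in screen coordinates), inside a CCLayer subclass with touches enabled:

    - (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        UITouch *myTouch = [touches anyObject];
        CGPoint location = [myTouch locationInView:[myTouch view]];
        location = [[CCDirector sharedDirector] convertToGL:location];

        // Hypothetical: 'button' is a CCSprite ivar added to this layer
        if (CGRectContainsPoint([button boundingBox], location)) {
            // The touch landed on the button; trigger its action here.
        }
    }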
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow