Question

I've got a Core Motion manager in my iOS app:

motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
if ([motionManager isDeviceMotionAvailable]) {
    [motionManager startDeviceMotionUpdates];
}

and in the update method (I'm using cocos3d, but that doesn't matter) I have this:

-(void) updateBeforeTransform:(CC3NodeUpdatingVisitor *)visitor
{
    if (motionManager.deviceMotionActive)
    {
        CMDeviceMotion *deviceMotion = motionManager.deviceMotion;
        CMAttitude *attitude = deviceMotion.attitude;

        NSLog(@"%f, %f, %f", attitude.yaw, attitude.pitch, attitude.roll);

    }
}

I put the device on a table and started watching the yaw, pitch, and roll values, and the yaw is constantly changing! It drifts by about 10 degrees over a couple of minutes, which is absolutely unacceptable in my app. What is the reason for this drift, and how can I avoid it? I started to think it was caused by the Earth's rotation, but the speed is far too high :)

Thanks in advance!


Solution

Let me take a shot in the dark. Since iOS 5, magnetometer data can be part of the Core Motion sensor fusion algorithm. For many applications, such as games, there is no need for it; worse, it can be harmful because of increased energy consumption and the possibility that the compass needs calibration, forcing the user to perform the figure-eight motion.

So I speculate that compass data is considered in sensor fusion only if you explicitly request it by using CMMotionManager's startDeviceMotionUpdatesUsingReferenceFrame: instead of startDeviceMotionUpdates. Try CMAttitudeReferenceFrameXMagneticNorthZVertical and check whether the drift decreases.
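A minimal sketch of that change, assuming iOS 5 or later (the availability check via CMMotionManager's availableAttitudeReferenceFrames class method is our addition, to avoid requesting a frame the hardware can't provide):

```objc
#import <CoreMotion/CoreMotion.h>

motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;

// Only request the magnetic-north frame if the hardware supports it;
// otherwise fall back to the plain (gyro-only) device motion updates.
if ([CMMotionManager availableAttitudeReferenceFrames] &
    CMAttitudeReferenceFrameXMagneticNorthZVertical) {
    [motionManager startDeviceMotionUpdatesUsingReferenceFrame:
        CMAttitudeReferenceFrameXMagneticNorthZVertical];
} else if ([motionManager isDeviceMotionAvailable]) {
    [motionManager startDeviceMotionUpdates];
}
```

Note that the magnetic-north frame may prompt the user to calibrate the compass, which is exactly the trade-off described above.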

OTHER TIPS

What you are experiencing is called drift, and you can't do much about it.

Basically, the gyroscope is very good at measuring rotation rates, but it cannot measure instantaneous orientation. Therefore, to calculate the current orientation of the device, a sensor algorithm must integrate the sensed rates into an orientation. As it does so, small errors accumulate over time and the computed orientation drifts, even if the device remains mostly still.

If a device happens to have a sensor that can measure instantaneous orientation, such as a magnetometer, then a sensor fusion algorithm can correct the drift by comparing and combining the sensor inputs; hence Apple's reference frame option CMAttitudeReferenceFrameXArbitraryCorrectedZVertical.

But Apple's implementation isn't perfect, which is why you see sudden jumps back and forth as the accumulated error is corrected when CMAttitudeReferenceFrameXArbitraryCorrectedZVertical is enabled. A better algorithm would at least smooth the error correction out over time.
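One simple way to do that yourself, sketched below (this is not Apple's algorithm; the filter constant `alpha` and the angle-wrap handling are our assumptions): exponentially smooth the reported yaw so that a correction spreads over many frames instead of appearing as a single jump.

```objc
// Hypothetical low-pass filter on the fused yaw, run once per frame.
// alpha trades responsiveness against smoothness (a value we chose).
static double smoothedYaw = 0.0;
static const double alpha = 0.05;

- (void)updateBeforeTransform:(CC3NodeUpdatingVisitor *)visitor {
    CMDeviceMotion *deviceMotion = motionManager.deviceMotion;
    if (!deviceMotion) return;

    double yaw = deviceMotion.attitude.yaw;

    // Wrap the difference into (-pi, pi] so smoothing behaves
    // correctly when the yaw crosses the +/-pi boundary.
    double delta = yaw - smoothedYaw;
    while (delta >  M_PI) delta -= 2.0 * M_PI;
    while (delta < -M_PI) delta += 2.0 * M_PI;

    smoothedYaw += alpha * delta;  // exponential low-pass filter
    // Use smoothedYaw, rather than yaw, to orient the scene.
}
```

The cost is added latency: with a small alpha, real rotations also take several frames to show up, so tune it to taste.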

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow