Problem

What is the correct way to use CMAttitude:multiplyByInverseOfAttitude?

Assuming an iOS 5 device lying flat on a table, after starting CMMotionManager with:

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:
    CMAttitudeReferenceFrameXTrueNorthZVertical];

Later, CMDeviceMotion objects are obtained:

CMDeviceMotion *deviceMotion = [motionManager deviceMotion];

I expect that [deviceMotion attitude] reflects the rotation of the device from True North.

By observation, [deviceMotion userAcceleration] reports acceleration in the device reference frame. That is, moving the device side to side (keeping it flat on the table) registers acceleration in the x-axis. Turning the device 90° (still flat) and moving the device side to side still reports x acceleration.

What is the correct way to transform [deviceMotion userAcceleration] to obtain North-South/East-West acceleration rather than left-right/forward-backward?

CMAttitude's multiplyByInverseOfAttitude: seems unnecessary, since a reference frame has already been specified, and it is unclear from the documentation how to apply the attitude to a CMAcceleration.
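For reference, multiplyByInverseOfAttitude: appears intended for rebasing an attitude onto one captured at runtime, roughly as in this Swift sketch (the motion manager setup and capture timing are illustrative only), which is why it seems redundant once a reference frame is specified at startup:

import CoreMotion

// Sketch: typical use of multiply(byInverseOf:), i.e. multiplyByInverseOfAttitude:.
// The setup and timing below are illustrative, not part of the original question.
let motionManager = CMMotionManager()
motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical)

// At some later point, capture a reference attitude (e.g. when the user taps "calibrate").
var referenceAttitude: CMAttitude?
if let attitude = motionManager.deviceMotion?.attitude {
    referenceAttitude = attitude.copy() as? CMAttitude
}

// Still later, rebase the current attitude onto that captured reference.
if let current = motionManager.deviceMotion?.attitude, let reference = referenceAttitude {
    current.multiply(byInverseOf: reference)
    // current now describes the rotation since the reference was captured.
}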


Solution

The question would not have arisen if CMDeviceMotion had an accessor for the userAcceleration in coordinates of the reference frame. So, I used a category to add the required method:

In CMDeviceMotion+TransformToReferenceFrame.h:

#import <CoreMotion/CoreMotion.h>

@interface CMDeviceMotion (TransformToReferenceFrame)
-(CMAcceleration)userAccelerationInReferenceFrame;
@end

and in CMDeviceMotion+TransformToReferenceFrame.m:

#import "CMDeviceMotion+TransformToReferenceFrame.h"

@implementation CMDeviceMotion (TransformToReferenceFrame)

-(CMAcceleration)userAccelerationInReferenceFrame
{
    CMAcceleration acc = [self userAcceleration];
    CMRotationMatrix rot = [self attitude].rotationMatrix;

    CMAcceleration accRef;
    accRef.x = acc.x*rot.m11 + acc.y*rot.m12 + acc.z*rot.m13;
    accRef.y = acc.x*rot.m21 + acc.y*rot.m22 + acc.z*rot.m23;
    accRef.z = acc.x*rot.m31 + acc.y*rot.m32 + acc.z*rot.m33;

    return accRef;
}

@end

and in Swift 3:

extension CMDeviceMotion {

    var userAccelerationInReferenceFrame: CMAcceleration {
        let acc = self.userAcceleration
        let rot = self.attitude.rotationMatrix

        var accRef = CMAcceleration()
        accRef.x = acc.x*rot.m11 + acc.y*rot.m12 + acc.z*rot.m13;
        accRef.y = acc.x*rot.m21 + acc.y*rot.m22 + acc.z*rot.m23;
        accRef.z = acc.x*rot.m31 + acc.y*rot.m32 + acc.z*rot.m33;

        return accRef;
    }
}

Now, code that previously used [deviceMotion userAcceleration] can use [deviceMotion userAccelerationInReferenceFrame] instead.
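For example, here's a minimal Swift usage sketch of the extension above (the update interval and queue are assumptions for illustration):

import CoreMotion

// Minimal usage sketch; the interval and queue below are illustrative.
let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical, to: .main) { motion, _ in
    guard let motion = motion else { return }
    // Acceleration in the true-north reference frame rather than the device frame.
    let acc = motion.userAccelerationInReferenceFrame
    print("N/S–E/W acceleration: x=\(acc.x), y=\(acc.y), z=\(acc.z)")
}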

Other tips

I tried to implement a solution after reading the paper linked above.

The steps are as follows:

  • take the attitude rotation matrix at every update;
  • compute the inverse matrix;
  • multiply the inverse matrix by the userAcceleration vector.

The resulting vector is the projection of the acceleration onto the reference-frame axes (see the sketch after this answer):

-x north, +x south

-y east, +y west

My code isn't perfect yet; I'm working on it.
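A minimal sketch of reading that projection as compass components, assuming the sign convention stated above (the convention is taken from this answer, not from Apple's documentation):

import CoreMotion

// Map a reference-frame acceleration onto compass components, assuming the
// convention above: -x ≈ north, +x ≈ south, -y ≈ east, +y ≈ west.
func compassComponents(of acc: CMAcceleration) -> (north: Double, east: Double) {
    let north = -acc.x   // negative x points toward true north
    let east  = -acc.y   // negative y points toward east
    return (north, east)
}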

According to Apple's documentation, CMAttitude is the orientation of the device relative to a given frame of reference, while userAcceleration and gravity are reported in the device's own frame. So, to express those values in the reference frame, we should do as @Batti said:

  1. take the attitude rotation matrix at every update;
  2. compute the inverse matrix;
  3. multiply the inverse matrix by the userAcceleration vector.

Here's the Swift version:

import CoreMotion
import GLKit

extension CMDeviceMotion {

    func userAccelerationInReferenceFrame() -> CMAcceleration {

        let origin = userAcceleration
        let rotation = attitude.rotationMatrix
        let matrix = rotation.inverse()

        var result = CMAcceleration()
        result.x = origin.x * matrix.m11 + origin.y * matrix.m12 + origin.z * matrix.m13;
        result.y = origin.x * matrix.m21 + origin.y * matrix.m22 + origin.z * matrix.m23;
        result.z = origin.x * matrix.m31 + origin.y * matrix.m32 + origin.z * matrix.m33;

        return result
    }

    func gravityInReferenceFrame() -> CMAcceleration {

        let origin = self.gravity
        let rotation = attitude.rotationMatrix
        let matrix = rotation.inverse()

        var result = CMAcceleration()
        result.x = origin.x * matrix.m11 + origin.y * matrix.m12 + origin.z * matrix.m13;
        result.y = origin.x * matrix.m21 + origin.y * matrix.m22 + origin.z * matrix.m23;
        result.z = origin.x * matrix.m31 + origin.y * matrix.m32 + origin.z * matrix.m33;

        return result
    }
}

extension CMRotationMatrix {

    func inverse() -> CMRotationMatrix {

        let matrix = GLKMatrix3Make(Float(m11), Float(m12), Float(m13), Float(m21), Float(m22), Float(m23), Float(m31), Float(m32), Float(m33))
        let invert = GLKMatrix3Invert(matrix, nil)

        return CMRotationMatrix(m11: Double(invert.m00), m12: Double(invert.m01), m13: Double(invert.m02),
                            m21: Double(invert.m10), m22: Double(invert.m11), m23: Double(invert.m12),
                            m31: Double(invert.m20), m32: Double(invert.m21), m33: Double(invert.m22))

    }

}

Hope it helps a little bit.
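As a side note on the inverse() helper: an attitude rotation matrix is orthogonal, so its inverse is simply its transpose. A GLKit-free variant could therefore look like this sketch (transposed() is just an illustrative name):

import CoreMotion

// Sketch: for a pure rotation matrix, inverse == transpose, so GLKit isn't strictly needed.
extension CMRotationMatrix {

    func transposed() -> CMRotationMatrix {
        return CMRotationMatrix(m11: m11, m12: m21, m13: m31,
                                m21: m12, m22: m22, m23: m32,
                                m31: m13, m32: m23, m33: m33)
    }
}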

The reference frame is related to the attitude value; look at the yaw angle of the attitude. If you don't use a reference frame, the yaw value is always zero when you start your app. If instead you use the reference frame CMAttitudeReferenceFrameXTrueNorthZVertical, the yaw value indicates the angle between the x-axis and true north. With this information you can identify the attitude of the phone in Earth coordinates, and therefore the orientation of the accelerometer axes with respect to the cardinal points.
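A small Swift sketch of that, assuming updates have already started with the true-north reference frame (the setup below is illustrative):

import CoreMotion

// Sketch: under CMAttitudeReferenceFrameXTrueNorthZVertical, yaw is the rotation
// of the device's x-axis away from true north, in radians. Setup is illustrative.
let motionManager = CMMotionManager()
motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical)

// At some later point, once device-motion updates are flowing:
if let attitude = motionManager.deviceMotion?.attitude {
    let yawDegrees = attitude.yaw * 180.0 / .pi
    print("x-axis is rotated \(yawDegrees)° from true north")
}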
