Question

I'm trying to find out how I could start to implement sensor fusion on the iPhone. I've started with this talk by David Sachs:

Sensor Fusion on Android Devices

Although David's talk is very illustrative, it doesn't show any code (which makes sense). I've looked at both the GLGravity example (to extract the gravity vector) and the AccelerometerGraph example, but I need some help, or at least guidance, on how to combine the accelerometer, gyroscope, and compass inputs so that the result is similar to what David shows.

Thanks

Solution

UPDATE: As of May 19, 2015, there is no point in implementing sensor fusion yourself on mobile devices: both Android (SensorManager's Sensor.TYPE_ROTATION_VECTOR) and the iPhone (Core Motion's CMAttitude) provide it for you.
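For reference, here is a minimal sketch of reading the fused orientation on iOS through Core Motion (this uses the modern Swift API, which the original question predates; the reference frame and update rate are arbitrary choices):

```swift
import CoreMotion

// Keep a strong reference to the manager for as long as updates are needed.
let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0

    // Core Motion performs the accelerometer/gyro/magnetometer fusion internally.
    // The .xMagneticNorthZVertical reference frame folds in the compass so yaw
    // is referenced to magnetic north.
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                           to: .main) { motion, error in
        guard let motion = motion else { return }

        // CMAttitude is the fused orientation estimate.
        let attitude = motion.attitude
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
}
```

CMDeviceMotion also exposes gravity, userAcceleration, and rotationRate, which roughly correspond to the quantities the GLGravity and AccelerometerGraph samples derive separately.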



(The original answer from May 5, 2011)

I have implemented sensor fusion for Shimmer 2 devices based on this manuscript. I highly recommend it.

Sensor fusion is often achieved with a Kalman filter.

However, there is no such thing as a "Kalman filter for programmers". The Kalman filter is difficult to understand, and you won't be able to implement and use it correctly if you do not understand it. Just use the above manuscript.
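To give a feel for the structure only, here is a hypothetical scalar (1-D) sketch of the predict/update cycle a Kalman filter runs; this is not the manuscript's algorithm, and a real orientation filter uses a quaternion state with full covariance matrices rather than single numbers:

```swift
// Minimal scalar Kalman filter: one state variable, one measurement.
struct ScalarKalman {
    var x: Double   // state estimate (e.g. an angle)
    var p: Double   // estimate variance
    let q: Double   // process noise variance
    let r: Double   // measurement noise variance

    // Predict: integrate a rate input (gyro-like) and grow the uncertainty.
    mutating func predict(rate u: Double, dt: Double) {
        x += u * dt
        p += q * dt
    }

    // Update: blend in a direct measurement (accelerometer/compass-like).
    mutating func update(measurement z: Double) {
        let k = p / (p + r)   // Kalman gain: how much to trust the measurement
        x += k * (z - x)
        p *= (1 - k)
    }
}

// Example usage with made-up noise parameters and inputs:
var filter = ScalarKalman(x: 0, p: 1, q: 0.01, r: 0.1)
filter.predict(rate: 0.5, dt: 0.01)   // gyroscope rate sample
filter.update(measurement: 0.004)     // angle derived from the accelerometer
```

In a full attitude filter the predict step is driven by the gyroscope and the update step corrects drift using the accelerometer and magnetometer, which is exactly the combination the question asks about.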

Licensed under: CC-BY-SA with attribution