How to control an OpenGL projection matrix with iPhone accelerometer/compass values?
26-09-2019
Question
Is there a ready-made class or formula anywhere that I can use to control my viewpoint with the accelerometer's/compass's XYZ values?
I want to achieve the same view control that acrossair uses.
I have the parts (an OpenGL space, filtered accelerometer and compass values, and a cubic panoramic view mapped to a cube around my origin).
Can somebody at least suggest where to start?
Solution
I've since worked through the problem, so the posts describing the steps of the solution can be followed here:
xCode - augmented reality at gotoandplay.freeblog.hu
A brief sketch of the whole process: how to get the transformation matrix from the raw iPhone sensor (accelerometer, magnetometer) data http://gotoandplay.freeblog.hu/files/2010/06/iPhoneDeviceOrientationTransformationMatrix-thumb.jpg
OTHER TIPS
If all you are looking for is a means of rotating the model view matrix for your scene, you could look at the source code to my Molecules application or an even simpler cube example that I wrote for my iPhone development class. Both contain code to incrementally rotate the model view matrix in response to touch input, so you would just need to replace the touch input with accelerometer values.
Additionally, Apple's GLGravity sample application does something very similar to what you want.
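Whichever sample you start from, raw accelerometer samples are noisy, so they are typically smoothed with a simple exponential low-pass filter before driving the matrix (GLGravity does this). A sketch in C; the filtering factor of 0.1 is a typical choice, not a value mandated by any of the sources above:

```c
/* Exponential low-pass filter for accelerometer samples.
   filtered[] holds the running smoothed value; raw[] is the
   latest sample. A smaller factor means heavier smoothing
   but more lag. */
#define kFilteringFactor 0.1f

void lowPassFilter(float filtered[3], const float raw[3]) {
    for (int i = 0; i < 3; ++i) {
        filtered[i] = raw[i] * kFilteringFactor
                    + filtered[i] * (1.0f - kFilteringFactor);
    }
}
```

Feed each new sample through this before handing the result to the rotation code; the filtered vector converges toward the true gravity direction while suppressing jitter.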