Question

I read in many places, like One Screen Turn Deserves Another, that: "The sensor coordinate system used by the API for the natural orientation of the device does not change as the device moves, and is the same as the OpenGL coordinate system."

Now, I get the same readings as in this image: [image: sensor axes diagram]

What I don't understand is: if the coordinate system doesn't change when I rotate the phone (always with the screen facing the user), then the gravity force should always be applied on the Y axis. It should only move to another axis if I put the phone in a position where the screen is no longer facing the user, like resting flat on a table, where gravity should be applied on the Z axis.
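For concreteness, here is roughly the kind of listener I mean, as a minimal sketch using the standard SensorManager API (the class name GravityLogger is just illustrative). Under my reasoning, values[1] would always read about +9.81 m/s² as long as the screen faces the user upright:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Minimal sketch: log raw accelerometer values in the sensor (device) coordinate system.
public class GravityLogger implements SensorEventListener {

    public void register(SensorManager sensorManager) {
        Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0]; // device X axis (short side in portrait)
        float y = event.values[1]; // device Y axis (long side in portrait)
        float z = event.values[2]; // device Z axis (out of the screen)
        android.util.Log.d("GravityLogger", "x=" + x + " y=" + y + " z=" + z);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // not needed for this sketch
    }
}
```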

What is wrong with my understanding?

Thanks! Guillermo.


Solution

The axes are swapped when the device's screen orientation changes. Per the article you cited:

However, the Android sensor APIs define the sensor coordinate space to be relative to the top and side of the device — not the short and long sides. When the system reorients the screen in response to holding the phone sideways, the sensor coordinate system no longer lines up with the screen’s coordinate system, and you get unexpected rotations in the display of your app.

If you'd like to access the un-swapped values, use indices 3, 4 and 5 in values[]; otherwise, some of the suggestions mentioned in that same article work quite nicely!
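For illustration, one of the suggestions in that article is to remap the rotation matrix to the current screen rotation with SensorManager.remapCoordinateSystem(). The sketch below is a rough, untested outline of that idea; the helper name and the exact axis pair chosen for each rotation are assumptions you should double-check against the article:

```java
import android.hardware.SensorManager;
import android.view.Display;
import android.view.Surface;

// Sketch: build a rotation matrix from raw sensor data, then remap it so that
// sensor coordinates and the current screen coordinates line up again.
public final class ScreenAlignedRotation {

    public static boolean getScreenAlignedRotationMatrix(Display display,
                                                         float[] gravity,
                                                         float[] geomagnetic,
                                                         float[] outR) {
        float[] inR = new float[9];
        if (!SensorManager.getRotationMatrix(inR, null, gravity, geomagnetic)) {
            return false;
        }
        int x = SensorManager.AXIS_X;
        int y = SensorManager.AXIS_Y;
        switch (display.getRotation()) {
            case Surface.ROTATION_90:
                x = SensorManager.AXIS_Y;
                y = SensorManager.AXIS_MINUS_X;
                break;
            case Surface.ROTATION_180:
                x = SensorManager.AXIS_MINUS_X;
                y = SensorManager.AXIS_MINUS_Y;
                break;
            case Surface.ROTATION_270:
                x = SensorManager.AXIS_MINUS_Y;
                y = SensorManager.AXIS_X;
                break;
            default: // Surface.ROTATION_0: no remapping needed
                break;
        }
        return SensorManager.remapCoordinateSystem(inR, x, y, outR);
    }
}
```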

OTHER TIPS

Quite an old question, but I find it still relevant. As of today, the Sensor Overview page, in its Sensor Coordinate System section, still says:

The most important point to understand about this coordinate system is that the axes are not swapped when the device's screen orientation changes—that is, the sensor's coordinate system never changes as the device moves. This behavior is the same as the behavior of the OpenGL coordinate system.

To me this wording is still confusing; of course, that might be because I'm not a native English speaker.

My understanding is that in Android (as in iOS) the coordinate system assumed by the sensors is fixed to the device. That is, the coordinate system moves with the device and its axes rotate along with it.

So, for a phone whose natural orientation is portrait, the Y-axis points upward when the phone is held vertically in portrait in front of the user. See the image below, from the same Android guide:

[image: device axes from the Android sensor guide]

Then, when the user rotates the phone to landscape-left orientation (so with the home button on the right side), the Y-axis points to the left. See the image below, from a MATLAB tutorial (although the screen is no longer really facing the user):

[image: device axes in landscape, from a MATLAB tutorial]
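To make this concrete, here is a rough sketch (the class name and the numbers are illustrative; 9.81 m/s² is standard gravity) of which device-fixed axis gravity shows up on in each of the poses discussed, if the axes really do rotate with the device:

```java
// Illustrative only: approximate raw accelerometer values at rest, in the device-fixed frame.
//   portrait, upright:                        values[1] ≈ +9.81 (device Y points up)
//   landscape, top of the phone to the left:  values[0] ≈ +9.81 (device X points up, Y points left)
//   flat on a table, screen up:               values[2] ≈ +9.81 (device Z points up)
public final class GravityAxis {

    // Returns the device axis that gravity is (mostly) acting along.
    public static String dominantAxis(float[] values) {
        float ax = Math.abs(values[0]);
        float ay = Math.abs(values[1]);
        float az = Math.abs(values[2]);
        if (ax >= ay && ax >= az) return "X";
        if (ay >= ax && ay >= az) return "Y";
        return "Z";
    }
}
```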

Then there's the frequently cited post from the Android dev blog, One Screen Turn Deserves Another, which says:

The sensor coordinate system used by the API for the natural orientation of the device does not change as the device moves, and is the same as the OpenGL coordinate system

which to me sounds like exactly the opposite of my previous reasoning. But then again, in its following "So What's the Problem?" section, you do see that when the phone is rotated to landscape left, the Y-axis points to the left, as in my previous reasoning.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow