I am working on an app that requires the user to aim their iPhone at the sun in order to trigger a special event.

I can retrieve the device's 3D orientation quaternion from the gyroscope via the CoreMotion framework, and from it I can get the yaw, roll, and pitch angles. I can also compute the sun's azimuth and zenith angle from the current date and time (GMT) and the latitude and longitude. What I am trying to figure out next is how to compare these two sets of values (phone orientation and sun position) to accurately detect when the device is aligned with the sun.
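For context, this is roughly how I get the attitude at the moment (a minimal Swift sketch; the true-north reference frame and the function name are my own choices, not anything prescribed):

    import CoreMotion

    // Device-motion updates referenced to true north, so the attitude can later
    // be compared with the sun's azimuth/zenith computed for the current location.
    let motionManager = CMMotionManager()

    func startTrackingAttitude() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical,
                                               to: .main) { motion, _ in
            guard let attitude = motion?.attitude else { return }
            // Orientation as a quaternion plus Euler angles (radians).
            let q = attitude.quaternion
            print("yaw \(attitude.yaw), pitch \(attitude.pitch), roll \(attitude.roll), quaternion \(q)")
        }
    }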

Any ideas on how to achieve that?


Solution

I finally managed to solve the problem.

To achieve this I used an augmented-reality sample project available in Apple's developer resources: pARk.

The idea was first to convert the sun's spherical coordinates to Cartesian coordinates in order to get its position in the sky as a simple vector: {x, y, z}. The formula is available on Wikipedia: Spherical coordinates. As the distance to the sun (the radius in spherical coordinates) doesn't really matter here, I used 1.
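For reference, the conversion is just the standard spherical-to-Cartesian formula. A minimal Swift sketch (the function name and angle conventions are mine; pARk itself is written in Objective-C/C) could look like this:

    import Foundation

    /// Standard spherical-to-Cartesian conversion for the sun's direction.
    /// `zenith` and `azimuth` are in radians; the radius is fixed at 1 because
    /// only the direction matters, not the distance to the sun.
    func sunDirection(zenith: Double, azimuth: Double) -> SIMD3<Double> {
        SIMD3(sin(zenith) * cos(azimuth),
              sin(zenith) * sin(azimuth),
              cos(zenith))
    }

    // Example: sun 40° above the horizon (zenith 50°), azimuth 135°.
    let sunVector = sunDirection(zenith: 50 * .pi / 180, azimuth: 135 * .pi / 180)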

Using the device gyroscope and the CoreMotion framework, I was then able to get the iPhone's rotation matrix. Using the geometry functions from the pARk code sample, I could then compute the camera projection matrix. I multiply the rotation matrix by the projection matrix, and then multiply the resulting matrix by the sun's coordinate vector.
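As a rough illustration of that pipeline (not the actual pARk geometry helpers, which are plain C functions; the field of view, aspect ratio and sign conventions below are placeholder assumptions), the same multiplication can be written with simd:

    import CoreMotion
    import UIKit
    import simd

    /// Projects a world-space sun direction into normalized screen coordinates
    /// using the device attitude and a simple pinhole projection.
    func projectSun(_ sun: SIMD3<Double>, attitude: CMAttitude,
                    fovY: Double = 60 * .pi / 180, aspect: Double = 9.0 / 16.0) -> CGPoint? {
        // Device rotation matrix from CoreMotion (may need transposing depending
        // on the reference-frame convention you settle on).
        let m = attitude.rotationMatrix
        let rotation = simd_double3x3(rows: [
            SIMD3(m.m11, m.m12, m.m13),
            SIMD3(m.m21, m.m22, m.m23),
            SIMD3(m.m31, m.m32, m.m33)
        ])

        // Bring the sun vector into camera space.
        let cam = rotation * sun

        // The sun is behind the camera: nothing to draw.
        guard cam.z < 0 else { return nil }

        // Perspective projection to normalized device coordinates (~[-1, 1] when on screen).
        let f = 1.0 / tan(fovY / 2)
        return CGPoint(x: (f / aspect) * cam.x / -cam.z,
                       y: f * cam.y / -cam.z)
    }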

That gives a simple vector holding the sun's screen coordinates. By displaying a UIView at these x and y values, I can finally see the sun move around as I move the device.
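Mapping those normalized values into the containing view is then only a couple of lines; `projectSun` and `sunView` refer to my sketch above, not to the repository code:

    import UIKit

    // Map the normalized coordinates (roughly -1...1) into the container view
    // and move the sun marker there on every motion update.
    func updateSunMarker(_ sunView: UIView, in container: UIView, ndc: CGPoint) {
        let x = (ndc.x + 1) / 2 * container.bounds.width
        let y = (1 - (ndc.y + 1) / 2) * container.bounds.height  // UIKit's y axis points down
        sunView.center = CGPoint(x: x, y: y)
    }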

The code for this feature is available on my GitHub; feel free to use and share it!
