Question

I am trying out the CoreMotion teapot sample code from WWDC 2010. Right now, the code takes the device's motion into account, but not the distance between the device and the rendered 3D object. How can I add this?

Example: If I pull the device away from the rendered 3D object, the object should become smaller. If I move the device closer to it, the object should become bigger. If I pan to the left, the object should move to the right (possibly offscreen). If I pan to the right, the object should move to the left (possibly offscreen).

I don't have any idea where to start looking. Can this be computed from the device's sensors?


Answer

The internal sensors (gyroscope, accelerometer, compass) will struggle with this task because they have no fixed reference point in the physical world. There are questions and answers about the problems of calculating relative position from accelerometer data here and here.
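To illustrate why this fails in practice, here is a minimal Swift sketch (not from the WWDC sample; the class and variable names are my own) that naively double-integrates CMDeviceMotion's user acceleration to estimate position. Any small bias or noise in the accelerometer grows quadratically with time, so the position estimate drifts badly within seconds:

```swift
import CoreMotion

// Naive dead reckoning: double-integrate userAcceleration to estimate position.
// This only demonstrates the drift problem described above; it is not a usable
// position tracker.
final class DeadReckoningDemo {
    private let motionManager = CMMotionManager()
    private var velocity = (x: 0.0, y: 0.0, z: 0.0)   // metres / second
    private var position = (x: 0.0, y: 0.0, z: 0.0)   // metres

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            let dt = self.motionManager.deviceMotionUpdateInterval
            let g = 9.81
            // userAcceleration is gravity-compensated and expressed in units of g.
            let a = motion.userAcceleration
            // First integration: acceleration -> velocity.
            self.velocity.x += a.x * g * dt
            self.velocity.y += a.y * g * dt
            self.velocity.z += a.z * g * dt
            // Second integration: velocity -> position.
            self.position.x += self.velocity.x * dt
            self.position.y += self.velocity.y * dt
            self.position.z += self.velocity.z * dt
            // A constant sensor bias in `a` accumulates here as 0.5 * bias * t^2,
            // which is why the estimate becomes unusable very quickly.
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```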

Augmented reality applications often solve this problem by using the camera to locate a reference point (e.g. a QR code placed on a table, or some other tracked object) and calculating changes in the size and orientation of that reference point to redraw the augmented object.
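As a rough illustration of the size-to-distance relationship such approaches rely on, here is a small sketch using the pinhole-camera model. The function name and the numbers are made up for the example; on a real device the focal length would come from camera calibration and the marker size from measuring the printed marker:

```swift
import CoreGraphics

// Pinhole-camera relationship:
//     distance = realWidth * focalLengthInPixels / apparentWidthInPixels
func estimateDistance(markerRealWidth: CGFloat,   // physical width of the marker, in metres
                      markerPixelWidth: CGFloat,  // width of the detected marker in the image, in pixels
                      focalLengthPixels: CGFloat) -> CGFloat {
    return markerRealWidth * focalLengthPixels / markerPixelWidth
}

// Example: a 0.10 m QR code that appears 250 px wide with a 1000 px focal length
// is roughly 0.4 m from the camera. As the device moves away, the marker shrinks
// in the image, the estimated distance grows, and the virtual object can be
// scaled down (and translated) accordingly.
let distance = estimateDistance(markerRealWidth: 0.10,
                                markerPixelWidth: 250,
                                focalLengthPixels: 1000)
```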

There is a basic starting point for looking at algorithms and approaches here.
