Problem

I'm using iPhone ARToolkit and I'm wondering how it works.

I want to know how, given a destination location, the user's location and a compass reading, this toolkit can tell whether the user is looking towards that destination.

How can I learn the maths behind these calculations?


Solution

The maths that AR ToolKit uses is basic trigonometry. It doesn't use the technique that Thomas describes, which I think would be a better approach (apart from step 5; see below).

Overview of the steps involved.

The iPhone's GPS supplies the device's location and you already have the coordinates of the location you want to look at.

First it calculates the difference between the latitude and the longitude values of the two points. These two differences let you construct a right-angled triangle and work out at what angle, from your current position, the other position lies. This is the relevant code:

- (float)angleFromCoordinate:(CLLocationCoordinate2D)first toCoordinate:(CLLocationCoordinate2D)second {
    float longitudinalDifference = second.longitude - first.longitude;
    float latitudinalDifference = second.latitude - first.latitude;

    // atan gives the angle measured from the east axis; subtracting it
    // from π/2 turns it into a bearing measured clockwise from north.
    float possibleAzimuth = (M_PI * .5f) - atan(latitudinalDifference / longitudinalDifference);

    // atan only covers half the circle, so pick the correct quadrant.
    if (longitudinalDifference > 0) return possibleAzimuth;             // target is to the east
    else if (longitudinalDifference < 0) return possibleAzimuth + M_PI; // target is to the west
    else if (latitudinalDifference < 0) return M_PI;                    // due south

    return 0.0f;                                                        // due north
}
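
As an aside, the same quadrant handling can be written more compactly with atan2. This is my own sketch, not the toolkit's code; like the original, it treats the latitude/longitude differences as flat x/y distances and ignores the cos(latitude) scaling of the longitude difference:

// Hypothetical alternative (not from AR ToolKit): atan2 picks the correct
// quadrant itself, so the if/else chain above disappears.
- (float)azimuthFromCoordinate:(CLLocationCoordinate2D)first
                  toCoordinate:(CLLocationCoordinate2D)second {
    float longitudinalDifference = second.longitude - first.longitude;
    float latitudinalDifference = second.latitude - first.latitude;
    // atan2(east, north) is the bearing clockwise from north.
    float azimuth = atan2f(longitudinalDifference, latitudinalDifference);
    if (azimuth < 0.0f) azimuth += 2.0f * M_PI;  // normalise to [0, 2π)
    return azimuth;
}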

At this point you can read the compass value from the phone and determine the specific compass angle (azimuth) the device is pointing at. The compass reading corresponds to the angle at the centre of the camera's view. AR ToolKit then calculates the full range of angles currently displayed on screen, since the iPhone's field of view is known.

In particular, it does this by calculating what angle the leftmost edge of the view is showing:

double leftAzimuth = centerAzimuth - VIEWPORT_WIDTH_RADIANS / 2.0;

if (leftAzimuth < 0.0) {
    leftAzimuth = 2 * M_PI + leftAzimuth;
}

And then calculates the rightmost:

double rightAzimuth = centerAzimuth + VIEWPORT_WIDTH_RADIANS / 2.0;

if (rightAzimuth > 2 * M_PI) {
    rightAzimuth = rightAzimuth - 2 * M_PI;
}

We now have:

  1. The angle relative to our current position of something we want to display
  2. A range of angles which are currently visible on the screen

This is enough to plot a marker on the screen in the correct position (kind of... see the problems section below).
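
To make that concrete, here is a rough sketch of how such a mapping could look. These helper functions are my own illustration, not AR ToolKit's actual code; they assume the leftAzimuth/rightAzimuth values computed above and the same VIEWPORT_WIDTH_RADIANS constant:

// Is the point's azimuth inside the visible range? The range can straddle
// north (e.g. left = 350°, right = 10°), so check the wrapped case too.
BOOL azimuthIsVisible(double leftAzimuth, double rightAzimuth, double pointAzimuth) {
    if (leftAzimuth > rightAzimuth)
        return pointAzimuth >= leftAzimuth || pointAzimuth <= rightAzimuth;
    return pointAzimuth >= leftAzimuth && pointAzimuth <= rightAzimuth;
}

// Map a visible azimuth to a horizontal pixel position.
double xPositionForAzimuth(double leftAzimuth, double pointAzimuth, double screenWidth) {
    double offset = pointAzimuth - leftAzimuth;
    if (offset < 0.0) offset += 2.0 * M_PI;   // unwrap across north
    return (offset / VIEWPORT_WIDTH_RADIANS) * screenWidth;
}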

It also does similar calculations for the device's inclination, so if you look at the sky you hopefully won't see a city marker up there, and if you point it at your feet you should in theory see cities on the opposite side of the planet. There are problems with these calculations in this toolkit, however.
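
For reference, the vertical analogue would look something like this sketch of mine (not the toolkit's code): the target's elevation angle is compared against the range derived from the device's pitch and the vertical field of view.

// Elevation angle of a target above (or below) the horizon, using a local
// flat-earth approximation: 0 = horizon, π/2 = straight up.
double elevationAngle(double altitudeDifference, double groundDistance) {
    return atan2(altitudeDifference, groundDistance);
}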

The problems...

Device orientation is not perfect

The calculation I've just explained assumes you're holding the device in an exact position relative to the earth, i.e. perfectly landscape or portrait. Your user probably won't always do that. If you tilt the device slightly, the horizon line will no longer be horizontal on screen.

The earth is actually 3D!

The earth is 3-dimensional. Few of the calculations in the toolkit account for that. The calculations it performs are only really accurate when you're pointing the device towards the horizon.

For example, if you try to plot a point on the opposite side of the globe (directly under your feet) this toolkit behaves very strangely. The approach used to calculate the azimuth range on screen is only valid when looking at the horizon. If you point your camera at the floor you can actually see every single compass point. The toolkit, however, thinks you're still only looking at compass reading ± (width of view / 2). If you rotate on the spot you'll see your marker move to the edge of the screen, disappear and then reappear on the other side. What you would expect to see is the marker stay on screen as you rotate.

The solution

I've recently implemented an app with AR for which I initially hoped AR Toolkit would do the heavy lifting. I came across the problems just described, which aren't acceptable for my app, so I had to roll my own.

Thomas' approach is a good method up to point 5, which, as I explained above, only works when pointing towards the horizon. If you need to plot anything outside of that, it breaks down. In my case I have to plot objects that are overhead, so it's completely unsuitable.

I addressed this by using OpenGL ES to plot my markers where they actually are in 3D space, and by moving the OpenGL viewport around according to readings from the gyroscope while continuously re-calibrating against the compass. The 3D engine handles all the hard work of determining what's on screen.
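
To sketch that idea (this is an illustration of the approach, not my actual code): Core Motion can deliver the device's attitude referenced to true north, and that rotation can be used directly as the 3D camera's orientation. The exact axis conventions depend on your world coordinate system, so treat this as a starting point:

#import <CoreMotion/CoreMotion.h>
#import <GLKit/GLKit.h>

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
// Gyro-integrated attitude, continuously corrected against the compass.
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:
                   CMAttitudeReferenceFrameXTrueNorth];

// Each frame, turn the attitude into a view matrix for the 3D engine
// (transposed, since the camera rotation is the inverse of the device's).
CMRotationMatrix r = motionManager.deviceMotion.attitude.rotationMatrix;
GLKMatrix4 camera = GLKMatrix4Make(r.m11, r.m21, r.m31, 0.0f,
                                   r.m12, r.m22, r.m32, 0.0f,
                                   r.m13, r.m23, r.m33, 0.0f,
                                   0.0f,  0.0f,  0.0f,  1.0f);
// Markers live at their real positions relative to the user (e.g. ENU
// coordinates in metres); the engine's projection then decides what is
// on screen, including points overhead or underfoot.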

Hope that's enough to get you started. I wish I could provide more detail, but short of posting a lot of hacky code I can't. This approach did, however, address both problems described above. I hope to open-source that part of my code at some point, but it's very rough and coupled to my problem domain at the moment.

Other tips

  • That is all the information needed: with the iPhone's location and the destination's location you can calculate the destination angle (with respect to true north).
  • The only missing piece is knowing where the iPhone is currently looking, which is returned by the compass (magnetic north + current location -> true north); see the sketch below.
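
For illustration, on the iPhone the magnetic-to-true-north correction is done by Core Location itself once it has a location fix. A minimal sketch, assuming a class that adopts CLLocationManagerDelegate (error handling omitted):

#import <CoreLocation/CoreLocation.h>

// Setup, e.g. in -viewDidLoad:
CLLocationManager *manager = [[CLLocationManager alloc] init];
manager.delegate = self;
[manager startUpdatingLocation];   // a fix is needed for the declination correction
[manager startUpdatingHeading];

// Delegate callback:
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
    // trueHeading is negative until the correction is possible.
    CLLocationDirection heading = (newHeading.trueHeading >= 0.0)
                                ? newHeading.trueHeading
                                : newHeading.magneticHeading;
    // 'heading' is in degrees clockwise from true north - the compass_angle in step 5 below.
}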

edit: Calculations (this is just an idea; there may be a better solution without a lot of coordinate transformations). A sketch of steps 1-3 and 5 follows the list.

  1. Convert the current and destination locations to ECEF coordinates.
  2. Transform the destination's ECEF coordinate into the ENU (east, north, up) local coordinate system, with the current location as the reference location. You can also use this.
  3. Ignore the height value and use the ENU coordinate to get the direction: atan2(deast, dnorth).
  4. The compass already returns the angle the iPhone is looking at.
  5. Display the destination on the screen if
    dest_angle - 10° <= compass_angle <= dest_angle + 10°
    with respect to the cyclic angle space. The constant of 10° is just a guessed value; you should either try some values to find a useful one, or analyse some properties of the iPhone's camera.
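
Here is a sketch of steps 1-3 and 5, assuming the spherical-earth simplification mentioned below (all names are mine, for illustration):

#include <math.h>
#include <stdbool.h>

typedef struct { double x, y, z; } Vec3;

static const double kEarthRadius = 6371000.0;  // metres, spherical model

// Step 1: geodetic coordinates (degrees) -> earth-centred Cartesian.
static Vec3 toCartesian(double latDeg, double lonDeg) {
    double lat = latDeg * M_PI / 180.0, lon = lonDeg * M_PI / 180.0;
    Vec3 v = { kEarthRadius * cos(lat) * cos(lon),
               kEarthRadius * cos(lat) * sin(lon),
               kEarthRadius * sin(lat) };
    return v;
}

// Steps 2-3: project the offset to the destination onto the user's local
// east/north axes, ignore "up", and take the bearing.
static double bearingTo(double userLat, double userLon,
                        double destLat, double destLon) {
    Vec3 u = toCartesian(userLat, userLon);
    Vec3 d = toCartesian(destLat, destLon);
    Vec3 delta = { d.x - u.x, d.y - u.y, d.z - u.z };

    double lat = userLat * M_PI / 180.0, lon = userLon * M_PI / 180.0;
    Vec3 east  = { -sin(lon), cos(lon), 0.0 };                             // local east
    Vec3 north = { -sin(lat) * cos(lon), -sin(lat) * sin(lon), cos(lat) }; // local north

    double dEast  = delta.x * east.x  + delta.y * east.y  + delta.z * east.z;
    double dNorth = delta.x * north.x + delta.y * north.y + delta.z * north.z;

    double bearing = atan2(dEast, dNorth);
    return (bearing < 0.0) ? bearing + 2.0 * M_PI : bearing;  // [0, 2π)
}

// Step 5: compare angles in cyclic space; 10° is the guessed half-width.
static bool shouldDisplay(double destAngle, double compassAngle) {
    double diff = fabs(fmod(destAngle - compassAngle + 3.0 * M_PI, 2.0 * M_PI) - M_PI);
    return diff <= 10.0 * M_PI / 180.0;
}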

The coordinate-transformation equations become much simpler if you assume that the earth is a sphere rather than an ellipsoid. Most links I have posted assume a WGS-84 ellipsoid, because GPS does too, AFAIK.
