Question

I have a table with a couple of million records in it that relate to locations around the world.

I want to be able to search the table based on the distance from my current location, so I've created a function that takes my latitude and longitude and the latitude and longitude from the database record, and returns a distance. I can then use this figure to perform a distance search.

I assume, though, that for the query to work it will have to run this calculation against every lat/long record in the database to check whether it falls within my search distance.

I'm using SQL Server 2005.

My question is: is there a better way to do this, so that the calculation doesn't have to run against every record?


Solution

When I need to do this, I limit the pool of rows to check by using a "box" around my current data point. For example, in pseudo-SQL:

SELECT DataID,
       lat,
       long,
       distance = fn_getdistance(@ThisLat, lat, @ThisLong, long)
FROM   MyTable
WHERE  lat  BETWEEN (@ThisLat  - @LatDistance)  AND (@ThisLat  + @LatDistance)
  AND  long BETWEEN (@ThisLong - @LongDistance) AND (@ThisLong + @LongDistance)

Variables:
@ThisLat, @ThisLong: latitude & longitude of your current location
@LatDistance, @LongDistance: the dimensions of the "box" around your current position (play around with different numbers depending on your application and needs; one way to derive them from a search radius is sketched below)
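
If you want the box to correspond roughly to a search radius, one rule of thumb is that a degree of latitude is about 69 miles (111 km) everywhere, while a degree of longitude gets narrower as you move away from the equator. A minimal sketch, assuming a hypothetical @RadiusMiles parameter that is not part of the original query:

-- assumes @ThisLat is already declared, as in the query above
DECLARE @RadiusMiles FLOAT, @LatDistance FLOAT, @LongDistance FLOAT
SET @RadiusMiles = 10

-- roughly 69 miles per degree of latitude
SET @LatDistance = @RadiusMiles / 69.0

-- a degree of longitude spans roughly 69 * COS(latitude) miles,
-- so the box has to get wider as @ThisLat moves away from the equator
SET @LongDistance = @RadiusMiles / (69.0 * COS(RADIANS(@ThisLat)))

Note that this approximation breaks down near the poles and across the 180° meridian, where the box logic needs special handling.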

(Obviously, you'll want an index on the lat & long fields.)
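
The answer doesn't show fn_getdistance itself. A minimal sketch of one possible implementation, assuming you want the great-circle (haversine) distance in miles and the same parameter order used in the query above; the function name and signature here are only illustrative:

CREATE FUNCTION dbo.fn_getdistance
(
    @Lat1  FLOAT,  -- latitude of the first point (e.g. @ThisLat)
    @Lat2  FLOAT,  -- latitude of the second point (the table row)
    @Long1 FLOAT,  -- longitude of the first point
    @Long2 FLOAT   -- longitude of the second point
)
RETURNS FLOAT
AS
BEGIN
    -- haversine great-circle distance, Earth radius ~3959 miles
    DECLARE @R FLOAT, @dLat FLOAT, @dLong FLOAT, @a FLOAT
    SET @R = 3959.0
    SET @dLat  = RADIANS(@Lat2 - @Lat1)
    SET @dLong = RADIANS(@Long2 - @Long1)

    SET @a = SIN(@dLat / 2) * SIN(@dLat / 2)
           + COS(RADIANS(@Lat1)) * COS(RADIANS(@Lat2))
           * SIN(@dLong / 2) * SIN(@dLong / 2)

    RETURN @R * 2 * ATN2(SQRT(@a), SQRT(1 - @a))
END

The box then acts as a coarse, index-friendly filter that keeps the (slow, per-row) function call off most of the table; you can wrap the query in a derived table to apply the exact radius afterwards:

-- an index on both columns lets the box predicate seek rather than scan, e.g.:
-- CREATE INDEX IX_MyTable_Lat_Long ON MyTable (lat, long)

SELECT *
FROM (
    SELECT DataID, lat, long,
           distance = dbo.fn_getdistance(@ThisLat, lat, @ThisLong, long)
    FROM   MyTable
    WHERE  lat  BETWEEN @ThisLat  - @LatDistance  AND @ThisLat  + @LatDistance
      AND  long BETWEEN @ThisLong - @LongDistance AND @ThisLong + @LongDistance
) AS boxed
WHERE boxed.distance <= @RadiusMiles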
