Something like this should do the trick:
SELECT
*
FROM
YOUR_TABLE
WHERE
SQRT((input_lat - db_lat) * (input_lat - db_lat) + (input_long - db_long) * (input_long - db_long)) <= input_radius
I used this: Distance between two points
There is only one missing thing: translate the radius into the same units as the coordinates.
Content of the link (in case it goes down):
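For example, if the coordinates are decimal degrees and the radius is given in kilometres, a rough rule of thumb (my assumption, not part of the original answer) is that one degree of latitude is about 111 km; a degree of longitude shrinks by cos(latitude), so this is only a ballpark for small areas:

```python
def km_to_degrees(radius_km):
    # ~111 km per degree of latitude; longitude degrees are shorter
    # by cos(latitude), so treat this as a rough approximation only
    return radius_km / 111.0

# a 5 km radius is roughly 0.045 degrees
radius_deg = km_to_degrees(5.0)
```

The result can then be used as input_radius in the query above.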
This small operation calculates the distance between two points. The routine works in any number of dimensions, so you could apply it to 2D or 3D.
In 2D, define your two points: Point 1 at (x1, y1) and Point 2 at (x2, y2).
xd = x2-x1
yd = y2-y1
Distance = SquareRoot(xd*xd + yd*yd)
In 3D, define your two points: Point 1 at (x1, y1, z1) and Point 2 at (x2, y2, z2).
xd = x2-x1
yd = y2-y1
zd = z2-z1
Distance = SquareRoot(xd*xd + yd*yd + zd*zd)
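The steps above can be sketched as a single routine that handles any number of dimensions (a Python sketch; the function name is mine):

```python
import math

def distance(p1, p2):
    # Euclidean distance: sum the squared differences per axis,
    # then take the square root -- works for 2D, 3D, or higher
    return math.sqrt(sum((b - a) ** 2 for a, b in zip(p1, p2)))

print(distance((0, 0), (3, 4)))        # 2D: 5.0
print(distance((0, 0, 0), (1, 2, 2)))  # 3D: 3.0
```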
As you can see, this requires that you perform a square root. Square roots should be avoided like the plague if you want to write fast code. Only perform a Square Root if you really need to.
Ways to avoid Square Roots: If you don't need a very accurate distance, you can use a lookup table to calculate it.
If, for example, you are performing collision detection between spheres, and all you want to know is whether or not two have collided, then you do not need to use a square root. Simply change the piece of code
from: if SquareRoot(xd*xd + yd*yd) < Diameter
to: if (xd*xd + yd*yd) < (Diameter*Diameter)
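In Python the same squared-distance test looks like this (assuming equal-sized spheres, as the pseudocode above does):

```python
def spheres_collide(p1, p2, diameter):
    # Compare squared distance against squared diameter -- no sqrt needed
    xd = p2[0] - p1[0]
    yd = p2[1] - p1[1]
    return (xd * xd + yd * yd) < (diameter * diameter)

print(spheres_collide((0, 0), (3, 0), 4))  # True: distance 3 < 4
print(spheres_collide((0, 0), (5, 0), 4))  # False: distance 5 >= 4
```

Since squaring is monotonic for non-negative numbers, both comparisons give exactly the same answer.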