I am using geopy to calculate the first point where a straight line intersects a circle on a map. This code works as expected, but I believe it is quite inefficient, because it has to compute the lat/lng of every point along the line until it comes within the circle's radius.
Is there a more efficient way to do this with geopy or other similar libraries?
In the example below, the line starts at 51.137933, -0.267017 and follows a constant bearing of 257 degrees until it gets within 10.5 nautical miles of the circle center 51.053953, -0.625003.
import math
from geopy.distance import distance as dist

# Starting position in degrees
lat1 = 51.137933
lon1 = -0.267017

# Circle center in degrees
target_lat = 51.053953
target_lon = -0.625003

# Bearing in radians
direction = math.radians(257)

# Earth's radius in miles
R = 3963.1676

# Distance to move per step, in miles
distance = 0.1

d = dist((lat1, lon1), (target_lat, target_lon)).nautical
last_d = d
while d > 10.5:
    # Convert the current position to radians
    rlat1 = math.radians(lat1)
    rlon1 = math.radians(lon1)
    # Spherical "destination point" formula: step `distance` miles along the bearing
    lat2 = math.asin(math.sin(rlat1) * math.cos(distance / R)
                     + math.cos(rlat1) * math.sin(distance / R) * math.cos(direction))
    lon2 = rlon1 + math.atan2(math.sin(direction) * math.sin(distance / R) * math.cos(rlat1),
                              math.cos(distance / R) - math.sin(rlat1) * math.sin(lat2))
    # Convert the new position back to degrees
    lat1 = math.degrees(lat2)
    lon1 = math.degrees(lon2)
    # Recompute the distance to the circle center
    d = dist((lat1, lon1), (target_lat, target_lon)).nautical
    if d > last_d:
        # Moving away from the center: the line never reaches the circle
        print("NOT FOUND")
        break
    last_d = d  # update the previous distance so the divergence check compares against the latest step

print(lat1, lon1)
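
For what it's worth, the trigonometry inside the loop is the standard spherical destination-point formula, and geopy can compute the same step directly via the destination method on a distance object. A minimal sketch of one step (replacing the manual math above; variable names are mine):

from geopy.distance import distance as dist

step = dist(miles=0.1)
# destination() returns a geopy Point moved 0.1 miles from (lat1, lon1) along the given bearing in degrees
next_point = step.destination((lat1, lon1), bearing=257)
lat1, lon1 = next_point.latitude, next_point.longitude

Note this uses a geodesic on the WGS-84 ellipsoid rather than a sphere of radius R, so the results will differ slightly from the hand-rolled formula.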
I'm not sure I fully understand your problem, but in a Cartesian coordinate system, where you have two points a and b, you obtain a direction vector from point a to point b as c = (b - a). When you scale/normalize c to length 1 (that is, divide it by its length), you should obtain the intersection with a circle of radius d around a as a + d * c. This should work for polar coordinates too, I guess.
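
To illustrate that idea, here is a minimal flat-plane sketch in plain Python (treating lat/lon as ordinary x/y coordinates; the helper name is mine):

import math

def ray_circle_point(a, b, d):
    # Direction vector from a to b
    cx, cy = b[0] - a[0], b[1] - a[1]
    # Normalize it to unit length
    length = math.hypot(cx, cy)
    # Walk distance d from a along the unit vector:
    # the intersection with a circle of radius d around a
    return (a[0] + d * cx / length, a[1] + d * cy / length)

Keep in mind this treats the coordinates as a flat plane, which is only a rough approximation for lat/lon over longer distances.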