From 5233363911e1df729ded007a3f38b6353955dc62 Mon Sep 17 00:00:00 2001 From: Maciej Lisiewski Date: Wed, 31 Mar 2021 19:41:23 -0400 Subject: [PATCH 1/3] Use same kind of line breaks (\n) in all files --- README.md | 192 ++++++++++---------- main.py | 516 +++++++++++++++++++++++++++--------------------------- 2 files changed, 354 insertions(+), 354 deletions(-) diff --git a/README.md b/README.md index 242881b..dbf5b68 100644 --- a/README.md +++ b/README.md @@ -1,96 +1,96 @@ -# Goals: - -1. % of day covered -2. Average coverage - factoring the number of satellites covering a location which would be more useful for approximating average bandwith available for an area - -## Original Implementation - -1. TLE to simulate satellites -2. Altitude,x,y to conical section on sphere - 1. Adjustable minimum elevation - 2. Swap out for more accurate model of earth (WGS-84 ellipsoid) -3. Map conical section to equi-rectangular area -4. Translate to pixel coords with resolution of about 100sq mi -5. Normalize? Overlay - -For the core there is this naive implementation: - -``` -for each time step: - for each pixel: - convert to lat/long - for each sat: - if visible - increase counter for pixel -``` - -This seems like a pretty gross way to do it. We touch every pixel \* numSats, when we could reduce to -numSats \* visible area. However, the math to calculate visible area for a satellite doesn't seem to -be a part of any library I've found. This is even more relevant with VLEO satellites since they cover -so much less area. I'm not sure how efficient it is to go from sphere to lat/long -on an equi-rectangular projection. - -## Resolution - -2 options for spatial resolution in my head originally: -- 2048 x 1024 = 2,097,152 -- 4096 x 2048 = 8,388,608‬ - -Instead of trying to use a 2d array to represent the footprints and dealing with the spherical nature -of the Earth, I am instead going to try and use S2 level 9 cells to track coverage. According to the -statistics page of the S2 library, this is about 1537000 cells, so I would need about 12MB of data to -track the whole Earth. Each cell covers 324.29km^2 on average. - -For temporal resolution: -These satellites move across a location quite quickly, using a website like https://findstarlink.com -we can find that a whole "train" of Starlink satellites crosses a locations view in about 4 or 5 minutes -so our temporal resolution needs to be a minute if not faster if we want accurate results. - -## S2 based approach -The original implementation has lots of gross properties and requires dealing with many projections. -So I tried using a new approach that used ~ equal area tesselated grids. The one I knew about was S2 so I tried that first. -``` -for each time step: - for each sat: - calculate footprint - get cells to cover footprint - for each cell: - increment counter -``` - -This means we only have to iterate over ~2000 (+/-600?) cells per satellite instead of ~2 million -points each. However, this approach seems too slow. Comptuing all the covering cells of each satellite -takes about 90s on a single core in Python. While this could be parallelized, I think maybe there -are even smarter approaches to consider. Could we maybe just store S2Cap objects and test points -against them to count the amount of coverage? What I really like from S2 is that it handles the -issues of latitude and longitude meaning different distances at the equator vs the poles. 
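A rough sketch of that cap-testing idea, in plain spherical trig rather than any particular library (the sub-satellite points and cap angles are assumed to come from the existing subpoints and calcCapAngle code in main.py):

```
import math

def unit_vector(lat_deg, lng_deg):
    lat, lng = math.radians(lat_deg), math.radians(lng_deg)
    return (math.cos(lat) * math.cos(lng),
            math.cos(lat) * math.sin(lng),
            math.sin(lat))

def covered_count(lat, lng, sat_subpoints, cap_angles):
    """Count how many satellites cover one ground point.

    sat_subpoints: list of (lat, lng) in degrees; cap_angles: radians,
    e.g. as returned by calcCapAngle."""
    p = unit_vector(lat, lng)
    count = 0
    for (slat, slng), theta in zip(sat_subpoints, cap_angles):
        axis = unit_vector(slat, slng)
        # the point is inside the cap iff its angular distance from the
        # sub-satellite point is at most the cap angle
        if sum(a * b for a, b in zip(axis, p)) >= math.cos(theta):
            count += 1
    return count
```

With a fixed set of sample points (cell centers, say), coverage counting becomes one dot-product test per satellite per point instead of a covering computation per satellite.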
-
-In the end for plotting, if we used the cell vertices as plotting locations we would cover the whole
-globe and not oversample at the poles or undersample at the equator.
-
-## S2, more like 2Slow... An H3 approach
-So S2 was just being way to slow to return cells for a given covering. (90s to simulate one timestep)
-Now I have to make some guesses about projections for H3 and but it is ~4.5 times faster to simulate
-a timestep than S2 (now takes 20 seconds on my machine). I'm gonna proceed with the H3 approach for
-now.
-
-## Used constants
-
-- Earth Mean Equitorial Radius = 6,378.1km
-- Minimum angle for user terminals = 35deg
-
-(It has recently come to my attention that there may be newer data from SpaceX that the minimum user terminal angle would be 25 degrees, so my data may be more conservative. The data I used is from the FCC document linked below.)
-
-## Starlink details
-
-From the second reference below I found that Starlink is targeting user terminal minimum angles of 35deg
-This means that each satellite covers around 600,000km^2 at an altitude of about 340km. However, most of
-the starlink satellites already in space are elevating or are at 550 km
-
-Stralink TLE's
-https://celestrak.com/NORAD/elements/starlink.txt
-
-### References
-
-- https://arxiv.org/pdf/1906.12318.pdf
-- https://licensing.fcc.gov/myibfs/download.do?attachment_key=1190019
+# Goals:
+
+1. % of day covered
+2. Average coverage - factoring in the number of satellites covering a location, which would be more useful for approximating the average bandwidth available for an area
+
+## Original Implementation
+
+1. TLE to simulate satellites
+2. Altitude, x, y to conical section on sphere
+   1. Adjustable minimum elevation
+   2. Swap out for more accurate model of Earth (WGS-84 ellipsoid)
+3. Map conical section to equi-rectangular area
+4. Translate to pixel coords with resolution of about 100 sq mi
+5. Normalize? Overlay
+
+For the core there is this naive implementation:
+
+```
+for each time step:
+    for each pixel:
+        convert to lat/long
+        for each sat:
+            if visible
+                increase counter for pixel
+```
+
+This seems like a pretty gross way to do it. We touch every pixel \* numSats, when we could reduce to
+numSats \* visible area. However, the math to calculate visible area for a satellite doesn't seem to
+be a part of any library I've found. This is even more relevant with VLEO satellites since they cover
+so much less area. I'm not sure how efficient it is to go from sphere to lat/long
+on an equi-rectangular projection.
+
+## Resolution
+
+2 options for spatial resolution in my head originally:
+- 2048 x 1024 = 2,097,152
+- 4096 x 2048 = 8,388,608
+
+Instead of trying to use a 2d array to represent the footprints and dealing with the spherical nature
+of the Earth, I am instead going to try and use S2 level 9 cells to track coverage. According to the
+statistics page of the S2 library, this is about 1,537,000 cells, so I would need about 12MB of data to
+track the whole Earth. Each cell covers 324.29km^2 on average.
+
+For temporal resolution:
+These satellites move across a location quite quickly. Using a website like https://findstarlink.com
+we can find that a whole "train" of Starlink satellites crosses a location's view in about 4 or 5 minutes,
+so our temporal resolution needs to be a minute, if not faster, if we want accurate results.
+
+## S2 based approach
+The original implementation has lots of gross properties and requires dealing with many projections.
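For scale, at the lower resolution that is 2,097,152 pixels checked against every satellite at every time step, or roughly 3 billion visibility tests per satellite over a day of 1-minute steps (2,097,152 x 1,440), before counting the per-pixel lat/long conversions.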
+So I tried using a new approach that used ~ equal area tessellated grids. The one I knew about was S2, so I tried that first.
+```
+for each time step:
+    for each sat:
+        calculate footprint
+        get cells to cover footprint
+        for each cell:
+            increment counter
+```
+
+This means we only have to iterate over ~2000 (+/-600?) cells per satellite instead of ~2 million
+points each. However, this approach seems too slow. Computing all the covering cells of each satellite
+takes about 90s on a single core in Python. While this could be parallelized, I think maybe there
+are even smarter approaches to consider. Could we maybe just store S2Cap objects and test points
+against them to count the amount of coverage? What I really like from S2 is that it handles the
+issues of latitude and longitude meaning different distances at the equator vs the poles.
+
+In the end for plotting, if we used the cell vertices as plotting locations we would cover the whole
+globe and not oversample at the poles or undersample at the equator.
+
+## S2, more like 2Slow... An H3 approach
+So S2 was just being way too slow to return cells for a given covering (90s to simulate one timestep).
+Now I have to make some guesses about projections for H3, but it is ~4.5 times faster to simulate
+a timestep than S2 (now takes 20 seconds on my machine). I'm gonna proceed with the H3 approach for
+now.
+
+## Used constants
+
+- Earth Mean Equatorial Radius = 6,378.1km
+- Minimum angle for user terminals = 35deg
+
+(It has recently come to my attention that there may be newer data from SpaceX that the minimum user terminal angle would be 25 degrees, so my data may be more conservative. The data I used is from the FCC document linked below.)
+
+## Starlink details
+
+From the second reference below I found that Starlink is targeting user terminal minimum angles of 35deg.
+This means that each satellite covers around 600,000km^2 at an altitude of about 340km.
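As a rough sanity check, that figure falls out of the same spherical-cap formula that calcAreaSpherical in main.py uses; a small standalone computation, assuming the constants above (R = 6,378.1km, 340km altitude, 35deg minimum terminal angle):

```
import math

R = 6378.1              # km, Earth mean equatorial radius
h = 340.0               # km, satellite altitude
eps = math.radians(35)  # minimum terminal (elevation) angle

eta = math.asin(math.sin(eps + math.pi / 2) * R / (R + h))  # nadir angle at the satellite
lam = 2 * (math.pi - (eps + math.pi / 2 + eta))             # Earth central angle of the footprint
area = 2 * math.pi * R ** 2 * (1 - math.cos(lam / 2))       # spherical cap area
print(round(area))      # ~607,000 km^2
```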
However, most of +the starlink satellites already in space are elevating or are at 550 km + +Stralink TLE's +https://celestrak.com/NORAD/elements/starlink.txt + +### References + +- https://arxiv.org/pdf/1906.12318.pdf +- https://licensing.fcc.gov/myibfs/download.do?attachment_key=1190019 diff --git a/main.py b/main.py index f74d3b9..1398291 100644 --- a/main.py +++ b/main.py @@ -1,258 +1,258 @@ -#!/usr/bin/env python3 - -from skyfield import api -from skyfield.api import Loader - -import sys -import math -import json -import itertools -from collections import defaultdict - -from typing import List, Dict, DefaultDict, Set, Tuple - -import s2sphere - -import numpy as np - -import shapely.geometry -from shapely.geometry import Polygon -import geog -import h3 - -import requests -import csv -from datetime import datetime - -# optional for debugging -import debug_plot - -TLE_URL = 'https://celestrak.com/NORAD/elements/starlink.txt' -R_MEAN = 6378.1 # km -H3_RESOLUTION_LEVEL = 4 -MIN_TERMINAL_ANGLE_DEG = 35 - - -def to_deg(radians: float) -> float: - return radians * (180 / math.pi) - - -def to_rads(degrees: float) -> float: - return degrees * (math.pi / 180) - - -RIGHT_ANGLE = to_rads(90) - - -def load_sats() -> List: - load = Loader('./tle_cache') - sats = load.tle_file(url=TLE_URL) - return sats - - -def filter_sats(sats: List) -> List: - """From https://www.space-track.org/documentation#faq filter to our best - understanding of what the operational satellites are""" - op_sats: List = [] - failures_url = "https://docs.google.com/spreadsheets/d/1mTPX5JSkeaoViGT_1wigrjwjzIVkpzI3xhFpEm909oM/gviz/tq?gid=71799984&tqx=out:csv" - r = requests.get(failures_url) - nonoperational = set() - for row in csv.DictReader(r.iter_lines(decode_unicode=True)): - name = row['NAME'].strip() - event_date = datetime.strptime(row['DATE'], "%m/%d/%Y").date() - event = row['EVENT'].strip() - nonoperational.add(name) - for sat in sats: - n = (sat.model.no_kozai / (2 * math.pi)) * 1440 - e = sat.model.ecco - period = 1440/n - mu = 398600.4418 # earth grav constant - a = (mu/(n*2*math.pi/(24*3600)) ** 2) ** (1./3.) 
# semi-major axis - # Using semi-major axis "a", eccentricity "e", and the Earth's radius in km, - perigee = (a * (1 - e)) - 6378.135 - if perigee > 540 and sat.name not in nonoperational: - op_sats.append(sat) - - return op_sats - - -def calcAreaSpherical(altitude: float, term_angle: float) -> float: - """Calculates the area of a Starlink satellite using the - spherical earth model, a satellite (for altitude), and - minimum terminal angle (elevation angle) - """ - epsilon = to_rads(term_angle) - - eta_FOV = math.asin((math.sin(epsilon + RIGHT_ANGLE) - * R_MEAN) / (R_MEAN + altitude)) - - lambda_FOV = 2 * (math.pi - (epsilon + RIGHT_ANGLE + eta_FOV)) - - area = 2 * math.pi * (R_MEAN ** 2) * (1 - math.cos(lambda_FOV / 2)) - - return area - - -def calcCapAngle(altitude: float, term_angle: float) -> float: - """Returns the cap angle (lambda_FOV/2) in radians""" - epsilon = to_rads(term_angle) - - eta_FOV = math.asin((math.sin(epsilon + RIGHT_ANGLE) - * R_MEAN) / (R_MEAN + altitude)) - - lambda_FOV = 2 * (math.pi - (epsilon + RIGHT_ANGLE + eta_FOV)) - - return (lambda_FOV / 2) - - -def get_cell_ids(lat, lng, angle): - """angle in degrees, is theta (the cap opening angle)""" - region = s2sphere.Cap.from_axis_angle(s2sphere.LatLng.from_degrees( - lat, lng).to_point(), s2sphere.Angle.from_radians(angle)) - coverer = s2sphere.RegionCoverer() - coverer.min_level = 9 - coverer.max_level = 9 - cells = coverer.get_covering(region) - return cells - - -def split_antimeridian_polygon(polygon: List[List[float]]) -> Tuple[List, List]: - """Takes a GeoJSON formatted list of vertex coordinates List[[lon,lat]] - and checks if the longitude crosses over the antimeridian. It splits the polygon - in 2 at the antimeridian. See: https://github.com/uber/h3/issues/210 - """ - - # We split the polygon into 2. lon < 180 goes into poly1 - poly1, poly2 = [], [] - split_loc = 0.0 - has_split = False - for idx in range(len(polygon)-1): - first = polygon[idx] - second = polygon[idx+1] - if (abs(first[0]) < 180 and abs(second[0]) > 180) or (abs(first[0]) > 180 and abs(second[0]) < 180): - split_loc = math.copysign(180.0, first[0]) - # split - new_lat = np.interp([split_loc], np.array([first[0], second[0]]), - np.array([first[1], second[1]]))[0] - new_point = [split_loc, float(new_lat)] - poly1.append([split_loc, float(new_lat)]) - poly2.append([split_loc, float(new_lat)]) - elif abs(first[0]) < 180: - poly1.append(first) - else: - poly2.append(first) - # GeoJSON polygons must have the same point for the first and last vertex - poly1.append(poly1[0]) - poly2.append(poly2[0]) - # h3 doesn't like |longitude| > 180 so make them the negative or positive equivalent - poly2_wrapped: List[List[float]] = [] - for point in poly2: - if split_loc > 0: - poly2_wrapped.append([-180 + (point[0]-180), point[1]]) - else: - poly2_wrapped.append([180 + (point[0]+180), point[1]]) - mapping1 = shapely.geometry.mapping(shapely.geometry.Polygon(poly1)) - mapping2 = shapely.geometry.mapping( - shapely.geometry.polygon.orient(shapely.geometry.Polygon(poly2_wrapped), 1.0)) - - return (mapping1, mapping2) - - -def get_cell_ids_h3(lat: float, lng: float, angle: float) -> Set: - p = shapely.geometry.Point([lng, lat]) - # so to more accurately match projections maybe arc length of a sphere would be best? 
- arc_length = R_MEAN * angle # in km - - n_points = 20 - # arc_length should be in kilometers so convert to meters - d = arc_length * 1000 # meters - angles = np.linspace(0, 360, n_points) - polygon = geog.propagate(p, angles, d) - try: - mapping = shapely.geometry.mapping(shapely.geometry.Polygon(polygon)) - except ValueError as e: - print(f"lat:{lat}, lng:{lng}") - print(polygon) - - cells = set() - needs_split = False - - for point in polygon: - if point[0] > 180 or point[0] < -180: - needs_split = True - break - - if needs_split: - try: - (first, second) = split_antimeridian_polygon(polygon) - cells.update(h3.polyfill(first, H3_RESOLUTION_LEVEL, True)) - cells.update(h3.polyfill(second, H3_RESOLUTION_LEVEL, True)) - except: - print(f"lat:{lat}, lng:{lng}") - else: - cells = h3.polyfill(mapping, H3_RESOLUTION_LEVEL, True) - - return cells - - -sats = load_sats() -print(f"Loaded {len(sats)} satellites") - -op_sats = filter_sats(sats) -print(f"Filtered to {len(op_sats)} operational satellites") -ts = api.load.timescale() -now = ts.now() - -subpoints = {sat.name: sat.at(now).subpoint() for sat in op_sats} - -sat1 = subpoints['STARLINK-1284'] -angle = calcCapAngle(sat1.elevation.km, 35) - -coverage: DefaultDict[str, int] = defaultdict(int) - - -def readTokens(): - with open('cell_ids.txt', 'r') as fd: - lines = fd.readlines() - for line in lines: - tok = line.strip() - coverage[tok] = 0 - - -def readH3Indices() -> List[str]: - with open('h3_5_index.txt', 'r') as fd: - lines = [line.strip() for line in fd.readlines()] - return lines - - -if __name__ == "__main__": - if len(sys.argv) > 1: - process: int = int(sys.argv[1]) - else: - process = 0 - if len(sys.argv) > 2: - MIN_TERMINAL_ANGLE_DEG = int(sys.argv[2]) - - TIME_PER_PROCESS = 1440 // 4 # 360 minutes, a quarter of a day - START_TIME = process * TIME_PER_PROCESS - # Consider swapping the loop to be sats then time for cache reasons? 
- for i in range(TIME_PER_PROCESS): - time = ts.utc(2020, 7, 25, 0, START_TIME+i, 0) - if i % 30 == 0: - print(time.utc_iso()) - subpoints = {sat.name: sat.at(time).subpoint() for sat in op_sats} - coverage_set: Set[str] = set() - for sat_name, sat in subpoints.items(): - angle = calcCapAngle(sat.elevation.km, MIN_TERMINAL_ANGLE_DEG) - cells = get_cell_ids_h3(sat.latitude.degrees, - sat.longitude.degrees, angle) - if len(cells) == 0: - Exception("empty region returned") - for cell in cells: - coverage_set.add(cell) - for cell in coverage_set: - coverage[cell] += 1 - - with open(f"h3_{H3_RESOLUTION_LEVEL}_cov_{process}.txt", "w") as fd: - for cell, cov in coverage.items(): - fd.write(f"{cell},{cov}\n") +#!/usr/bin/env python3 + +from skyfield import api +from skyfield.api import Loader + +import sys +import math +import json +import itertools +from collections import defaultdict + +from typing import List, Dict, DefaultDict, Set, Tuple + +import s2sphere + +import numpy as np + +import shapely.geometry +from shapely.geometry import Polygon +import geog +import h3 + +import requests +import csv +from datetime import datetime + +# optional for debugging +import debug_plot + +TLE_URL = 'https://celestrak.com/NORAD/elements/starlink.txt' +R_MEAN = 6378.1 # km +H3_RESOLUTION_LEVEL = 4 +MIN_TERMINAL_ANGLE_DEG = 35 + + +def to_deg(radians: float) -> float: + return radians * (180 / math.pi) + + +def to_rads(degrees: float) -> float: + return degrees * (math.pi / 180) + + +RIGHT_ANGLE = to_rads(90) + + +def load_sats() -> List: + load = Loader('./tle_cache') + sats = load.tle_file(url=TLE_URL) + return sats + + +def filter_sats(sats: List) -> List: + """From https://www.space-track.org/documentation#faq filter to our best + understanding of what the operational satellites are""" + op_sats: List = [] + failures_url = "https://docs.google.com/spreadsheets/d/1mTPX5JSkeaoViGT_1wigrjwjzIVkpzI3xhFpEm909oM/gviz/tq?gid=71799984&tqx=out:csv" + r = requests.get(failures_url) + nonoperational = set() + for row in csv.DictReader(r.iter_lines(decode_unicode=True)): + name = row['NAME'].strip() + event_date = datetime.strptime(row['DATE'], "%m/%d/%Y").date() + event = row['EVENT'].strip() + nonoperational.add(name) + for sat in sats: + n = (sat.model.no_kozai / (2 * math.pi)) * 1440 + e = sat.model.ecco + period = 1440/n + mu = 398600.4418 # earth grav constant + a = (mu/(n*2*math.pi/(24*3600)) ** 2) ** (1./3.) 
# semi-major axis + # Using semi-major axis "a", eccentricity "e", and the Earth's radius in km, + perigee = (a * (1 - e)) - 6378.135 + if perigee > 540 and sat.name not in nonoperational: + op_sats.append(sat) + + return op_sats + + +def calcAreaSpherical(altitude: float, term_angle: float) -> float: + """Calculates the area of a Starlink satellite using the + spherical earth model, a satellite (for altitude), and + minimum terminal angle (elevation angle) + """ + epsilon = to_rads(term_angle) + + eta_FOV = math.asin((math.sin(epsilon + RIGHT_ANGLE) + * R_MEAN) / (R_MEAN + altitude)) + + lambda_FOV = 2 * (math.pi - (epsilon + RIGHT_ANGLE + eta_FOV)) + + area = 2 * math.pi * (R_MEAN ** 2) * (1 - math.cos(lambda_FOV / 2)) + + return area + + +def calcCapAngle(altitude: float, term_angle: float) -> float: + """Returns the cap angle (lambda_FOV/2) in radians""" + epsilon = to_rads(term_angle) + + eta_FOV = math.asin((math.sin(epsilon + RIGHT_ANGLE) + * R_MEAN) / (R_MEAN + altitude)) + + lambda_FOV = 2 * (math.pi - (epsilon + RIGHT_ANGLE + eta_FOV)) + + return (lambda_FOV / 2) + + +def get_cell_ids(lat, lng, angle): + """angle in degrees, is theta (the cap opening angle)""" + region = s2sphere.Cap.from_axis_angle(s2sphere.LatLng.from_degrees( + lat, lng).to_point(), s2sphere.Angle.from_radians(angle)) + coverer = s2sphere.RegionCoverer() + coverer.min_level = 9 + coverer.max_level = 9 + cells = coverer.get_covering(region) + return cells + + +def split_antimeridian_polygon(polygon: List[List[float]]) -> Tuple[List, List]: + """Takes a GeoJSON formatted list of vertex coordinates List[[lon,lat]] + and checks if the longitude crosses over the antimeridian. It splits the polygon + in 2 at the antimeridian. See: https://github.com/uber/h3/issues/210 + """ + + # We split the polygon into 2. lon < 180 goes into poly1 + poly1, poly2 = [], [] + split_loc = 0.0 + has_split = False + for idx in range(len(polygon)-1): + first = polygon[idx] + second = polygon[idx+1] + if (abs(first[0]) < 180 and abs(second[0]) > 180) or (abs(first[0]) > 180 and abs(second[0]) < 180): + split_loc = math.copysign(180.0, first[0]) + # split + new_lat = np.interp([split_loc], np.array([first[0], second[0]]), + np.array([first[1], second[1]]))[0] + new_point = [split_loc, float(new_lat)] + poly1.append([split_loc, float(new_lat)]) + poly2.append([split_loc, float(new_lat)]) + elif abs(first[0]) < 180: + poly1.append(first) + else: + poly2.append(first) + # GeoJSON polygons must have the same point for the first and last vertex + poly1.append(poly1[0]) + poly2.append(poly2[0]) + # h3 doesn't like |longitude| > 180 so make them the negative or positive equivalent + poly2_wrapped: List[List[float]] = [] + for point in poly2: + if split_loc > 0: + poly2_wrapped.append([-180 + (point[0]-180), point[1]]) + else: + poly2_wrapped.append([180 + (point[0]+180), point[1]]) + mapping1 = shapely.geometry.mapping(shapely.geometry.Polygon(poly1)) + mapping2 = shapely.geometry.mapping( + shapely.geometry.polygon.orient(shapely.geometry.Polygon(poly2_wrapped), 1.0)) + + return (mapping1, mapping2) + + +def get_cell_ids_h3(lat: float, lng: float, angle: float) -> Set: + p = shapely.geometry.Point([lng, lat]) + # so to more accurately match projections maybe arc length of a sphere would be best? 
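    # (Note: the angle from calcCapAngle is the geocentric half-angle of the
    #  footprint, so R_MEAN * angle below is the great-circle radius of the
    #  footprint on a spherical Earth, in km.)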
+ arc_length = R_MEAN * angle # in km + + n_points = 20 + # arc_length should be in kilometers so convert to meters + d = arc_length * 1000 # meters + angles = np.linspace(0, 360, n_points) + polygon = geog.propagate(p, angles, d) + try: + mapping = shapely.geometry.mapping(shapely.geometry.Polygon(polygon)) + except ValueError as e: + print(f"lat:{lat}, lng:{lng}") + print(polygon) + + cells = set() + needs_split = False + + for point in polygon: + if point[0] > 180 or point[0] < -180: + needs_split = True + break + + if needs_split: + try: + (first, second) = split_antimeridian_polygon(polygon) + cells.update(h3.polyfill(first, H3_RESOLUTION_LEVEL, True)) + cells.update(h3.polyfill(second, H3_RESOLUTION_LEVEL, True)) + except: + print(f"lat:{lat}, lng:{lng}") + else: + cells = h3.polyfill(mapping, H3_RESOLUTION_LEVEL, True) + + return cells + + +sats = load_sats() +print(f"Loaded {len(sats)} satellites") + +op_sats = filter_sats(sats) +print(f"Filtered to {len(op_sats)} operational satellites") +ts = api.load.timescale() +now = ts.now() + +subpoints = {sat.name: sat.at(now).subpoint() for sat in op_sats} + +sat1 = subpoints['STARLINK-1284'] +angle = calcCapAngle(sat1.elevation.km, 35) + +coverage: DefaultDict[str, int] = defaultdict(int) + + +def readTokens(): + with open('cell_ids.txt', 'r') as fd: + lines = fd.readlines() + for line in lines: + tok = line.strip() + coverage[tok] = 0 + + +def readH3Indices() -> List[str]: + with open('h3_5_index.txt', 'r') as fd: + lines = [line.strip() for line in fd.readlines()] + return lines + + +if __name__ == "__main__": + if len(sys.argv) > 1: + process: int = int(sys.argv[1]) + else: + process = 0 + if len(sys.argv) > 2: + MIN_TERMINAL_ANGLE_DEG = int(sys.argv[2]) + + TIME_PER_PROCESS = 1440 // 4 # 360 minutes, a quarter of a day + START_TIME = process * TIME_PER_PROCESS + # Consider swapping the loop to be sats then time for cache reasons? 
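    # (Each process simulates one quarter of the day at 1-minute steps; the Makefile
    #  runs `seq 0 3 | parallel -j4 "./main.py {}"` and the four h3_4_cov_<process>.txt
    #  outputs are merged afterwards by merge_cover.py.)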
+ for i in range(TIME_PER_PROCESS): + time = ts.utc(2020, 7, 25, 0, START_TIME+i, 0) + if i % 30 == 0: + print(time.utc_iso()) + subpoints = {sat.name: sat.at(time).subpoint() for sat in op_sats} + coverage_set: Set[str] = set() + for sat_name, sat in subpoints.items(): + angle = calcCapAngle(sat.elevation.km, MIN_TERMINAL_ANGLE_DEG) + cells = get_cell_ids_h3(sat.latitude.degrees, + sat.longitude.degrees, angle) + if len(cells) == 0: + Exception("empty region returned") + for cell in cells: + coverage_set.add(cell) + for cell in coverage_set: + coverage[cell] += 1 + + with open(f"h3_{H3_RESOLUTION_LEVEL}_cov_{process}.txt", "w") as fd: + for cell, cov in coverage.items(): + fd.write(f"{cell},{cov}\n") From a4b80a766c1af1036c1cf6280f155c1a54986824 Mon Sep 17 00:00:00 2001 From: Maciej Lisiewski Date: Wed, 31 Mar 2021 19:46:47 -0400 Subject: [PATCH 2/3] Add requirements.txt file --- requirements.txt | 13 +++++++++++++ 1 file changed, 13 insertions(+) create mode 100644 requirements.txt diff --git a/requirements.txt b/requirements.txt new file mode 100644 index 0000000..daddf98 --- /dev/null +++ b/requirements.txt @@ -0,0 +1,13 @@ +skyfield==1.37 +numpy==1.20 +geog==0.0.2 +h3==3.7.2 +s2sphere==0.2.5 +shapely==1.7.1 +matplotlib==3.4.1 +geos==0.2.3 +pyproj==3.0.1 +requests==2.25 +# cartopy can't be installed in the same pip command as numpy +# see https://github.com/SciTools/cartopy/issues/1626 +# cartopy==0.18 \ No newline at end of file From c27c6b445533a0bed05ab28f77d6786752326bb2 Mon Sep 17 00:00:00 2001 From: Maciej Lisiewski Date: Fri, 2 Apr 2021 14:51:04 -0400 Subject: [PATCH 3/3] Add a makefile --- Makefile | 14 ++++++++++++++ 1 file changed, 14 insertions(+) create mode 100644 Makefile diff --git a/Makefile b/Makefile new file mode 100644 index 0000000..ddd0270 --- /dev/null +++ b/Makefile @@ -0,0 +1,14 @@ +CURRENT_DATE = $(shell date -u +%F) + +.PHONY: install-deps generate-coverage + +install-deps: + pip3 install -r requirements.txt + # this has to be installed separately until 0.19 is released https://github.com/SciTools/cartopy/issues/1552 + pip3 install cartopy==0.18 + +generate-coverage: + seq 0 3 | parallel -j4 "./main.py {}" + mv tle_cache/starlink.txt tle_cache/starlink-${CURRENT_DATE}.txt + mv h3_4_cov_full.txt h3_4_cov_op_${CURRENT_DATE}.txt + ./merge_cover.py \ No newline at end of file
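merge_cover.py itself is not part of this patch series; a minimal sketch of what that merge step could look like, assuming it simply sums the per-cell counters from the four per-process files into the h3_4_cov_full.txt file that generate-coverage renames (the file names are inferred from main.py and the Makefile, not confirmed):

```
#!/usr/bin/env python3
from collections import defaultdict

H3_RESOLUTION_LEVEL = 4
merged = defaultdict(int)

# sum per-cell counters across the four per-process output files from main.py
for process in range(4):
    with open(f"h3_{H3_RESOLUTION_LEVEL}_cov_{process}.txt") as fd:
        for line in fd:
            cell, cov = line.strip().split(",")
            merged[cell] += int(cov)

# write the combined file that the Makefile then renames (assumed name)
with open(f"h3_{H3_RESOLUTION_LEVEL}_cov_full.txt", "w") as fd:
    for cell, cov in merged.items():
        fd.write(f"{cell},{cov}\n")
```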