#!/usr/bin/env python
u"""
nsidc_icesat2_sync.py
Written by Tyler Sutterley (03/2024)
Acquires ICESat-2 datafiles from the National Snow and Ice Data Center (NSIDC)
https://wiki.earthdata.nasa.gov/display/EL/How+To+Access+Data+With+Python
https://nsidc.org/support/faq/what-options-are-available-bulk-downloading-data-https-earthdata-login-enabled
http://www.voidspace.org.uk/python/articles/authentication.shtml#base64
Register with NASA Earthdata Login system:
https://urs.earthdata.nasa.gov
Add NSIDC_DATAPOOL_OPS to NASA Earthdata Applications
https://urs.earthdata.nasa.gov/oauth/authorize?client_id=_JLuwMHxb2xX6NwYTb4dRA
CALLING SEQUENCE:
python nsidc_icesat2_sync.py --user=<username> --release=001 ATL06
where <username> is your NASA Earthdata username
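    for example, limiting the sync to 2019 data from a single
    reference ground track (illustrative values):
    python nsidc_icesat2_sync.py --user=<username> --year 2019 --track 100 ATL06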
INPUTS:
ATL03: Global Geolocated Photon Data
ATL04: Normalized Relative Backscatter
ATL06: Land Ice Height
ATL07: Sea Ice Height
ATL08: Land and Vegetation Height
ATL09: Atmospheric Layer Characteristics
ATL10: Sea Ice Freeboard
ATL12: Ocean Surface Height
ATL13: Inland Water Surface Height
COMMAND LINE OPTIONS:
--help: list the command line options
-U X, --user X: username for NASA Earthdata Login
-W X, --password X: password for NASA Earthdata Login
-N X, --netrc X: path to .netrc file for alternative authentication
-D X, --directory X: working data directory
-Y X, --year X: years to sync
-S X, --subdirectory X: specific subdirectories to sync
-r X, --release X: ICESat-2 data release to sync
-v X, --version X: ICESat-2 data version to sync
-t X, --track X: ICESat-2 reference ground tracks to sync
-g X, --granule X: ICESat-2 granule regions to sync
    -c X, --cycle X: ICESat-2 cycles to sync
-n X, --region X: ICESat-2 Named Region (ATL14/ATL15)
    -a, --auxiliary: Sync ICESat-2 auxiliary files for each HDF5 file
    -i X, --index X: Input index of ICESat-2 files to sync
-F, --flatten: Do not create subdirectories
-P X, --np X: Number of processes to use in file downloads
-T X, --timeout X: Timeout in seconds for blocking operations
-R X, --retry X: Connection retry attempts
-l, --log: output log of files downloaded
-L, --list: print files to be transferred, but do not execute transfer
-C, --clobber: Overwrite existing data in transfer
--checksum: compare hashes to check if overwriting existing data
-M X, --mode X: Local permissions mode of the directories and files synced
PYTHON DEPENDENCIES:
numpy: Scientific Computing Tools For Python
https://numpy.org
https://numpy.org/doc/stable/user/numpy-for-matlab-users.html
lxml: Pythonic XML and HTML processing library using libxml2/libxslt
https://lxml.de/
https://github.com/lxml/lxml
PROGRAM DEPENDENCIES:
utilities.py: download and management utilities for syncing files
UPDATE HISTORY:
Updated 03/2024: use pathlib to define and operate on paths
Updated 09/2023: generalized regular expressions for non-entered cases
Updated 12/2022: single implicit import of altimetry tools
Updated 05/2022: use argparse descriptions within sphinx documentation
Updated 03/2022: use attempt login function to check credentials
Updated 02/2022: added option to sync specific orbital cycles
Updated 10/2021: using python logging for handling verbose output
Updated 07/2021: set context for multiprocessing to fork child processes
added option to compare checksums in order to overwrite data
added a file length check to validate downloaded files
Updated 05/2021: added options for connection timeout and retry attempts
Updated 04/2021: set a default netrc file and check access
default credentials from environmental variables
use regex backslash for comment special characters
Updated 02/2021: added regular expression patterns for ATL11/14/15
Updated 11/2020: nsidc_list will output a string for errors
Updated 10/2020: using argparse to set parameters
Updated 09/2020: use urllib imported in utilities
Updated 08/2020: moved urllib opener to utilities. add credential check
moved urllib directory listing to utilities
Updated 07/2020: added option index to use a list of files to sync
Updated 06/2020: added multiprocessing option for parallel download
Updated 05/2020: added option netrc to use alternative authentication
adjust regular expression to allow syncing of ATL07 sea ice products
adjust regular expression for auxiliary products
Updated 03/2020: added option flatten to not create subdirectories
Updated 09/2019: added ssl context to urlopen headers
Updated 07/2019: added options to sync specific granules, tracks and version
Updated 06/2019: use strptime to extract last modified time of remote files
Written 01/2019
"""
from __future__ import print_function
import sys
import os
import re
import io
import copy
import time
import shutil
import pathlib
import logging
import argparse
import posixpath
import traceback
import lxml.etree
import multiprocessing as mp
import icesat2_toolkit as is2tk

# PURPOSE: sync the ICESat-2 elevation data from NSIDC
def nsidc_icesat2_sync(DIRECTORY, PRODUCTS, RELEASE, VERSIONS, GRANULES,
TRACKS, YEARS=None, SUBDIRECTORY=None, CYCLES=None, REGION=None,
AUXILIARY=False, INDEX=None, FLATTEN=False, TIMEOUT=None, RETRY=1,
LOG=False, LIST=False, PROCESSES=0, CLOBBER=False, CHECKSUM=False,
MODE=0o775):
# check if directory exists and recursively create if not
DIRECTORY = pathlib.Path(DIRECTORY).expanduser().absolute()
DIRECTORY.mkdir(mode=MODE, parents=True, exist_ok=True)
# output of synchronized files
if LOG:
# format: NSIDC_ICESat-2_sync_2002-04-01.log
today = time.strftime('%Y-%m-%d',time.localtime())
LOGFILE = DIRECTORY.joinpath(f'NSIDC_ICESat-2_sync_{today}.log')
logging.basicConfig(filename=LOGFILE, level=logging.INFO)
logging.info(f'ICESat-2 Data Sync Log ({today})')
else:
# standard output (terminal output)
logging.basicConfig(level=logging.INFO)
# compile HTML parser for lxml
parser = lxml.etree.HTMLParser()
# remote https server for ICESat-2 Data
HOST = 'https://n5eil01u.ecs.nsidc.org'
# regular expression operator for finding files of a particular granule
# find ICESat-2 HDF5 files in the subdirectory for product and release
if TRACKS:
regex_track = r'|'.join([rf'{T:04d}' for T in TRACKS])
else:
regex_track = r'\d{4}'
if CYCLES:
regex_cycle = r'|'.join([rf'{C:02d}' for C in CYCLES])
else:
regex_cycle = r'\d{2}'
if GRANULES:
regex_granule = r'|'.join([rf'{G:02d}' for G in GRANULES])
else:
regex_granule = r'\d{2}'
if VERSIONS:
regex_version = r'|'.join([rf'{V:02d}' for V in VERSIONS])
else:
regex_version = r'\d{2}'
regex_suffix = r'(.*?)' if AUXILIARY else r'(h5|nc)'
default_pattern = (r'{0}(-\d{{2}})?_(\d{{4}})(\d{{2}})(\d{{2}})(\d{{2}})'
r'(\d{{2}})(\d{{2}})_({1})({2})({3})_({4})_({5})(.*?).{6}$')
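    # illustrative filename matching the default pattern (assumed values):
    #   ATL06_20190223232535_08780212_006_02.h5
    #   -> product ATL06, 2019-02-23T23:25:35, RGT 0878, cycle 02,
    #      granule region 12, release 006, version 02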
ATL11_pattern = r'({0})_({1})({2})_(\d{{2}})(\d{{2}})_({3})_({4})(.*?).{5}$'
ATL1415_pattern = r'({0})_({1})_(\d{{2}})(\d{{2}})_({3})_({4})(.*?).{5}$'
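    # illustrative ATL11 filename matching the pattern above (assumed values):
    #   ATL11_054803_0315_006_05.h5
    #   -> RGT 0548, granule region 03, cycles 03 through 15,
    #      release 006, version 05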
# regular expression operator for finding subdirectories
if SUBDIRECTORY:
# Sync particular subdirectories for product
R2 = re.compile(r'('+r'|'.join(SUBDIRECTORY)+r')', re.VERBOSE)
elif YEARS:
# Sync particular years for product
regex_pattern = r'|'.join(rf'{y:d}' for y in YEARS)
R2 = re.compile(rf'({regex_pattern}).(\d+).(\d+)', re.VERBOSE)
else:
# Sync all available subdirectories for product
R2 = re.compile(r'(\d+).(\d+).(\d+)', re.VERBOSE)
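    # note: the remote ATLAS subdirectories are date-formatted (YYYY.MM.DD),
    # e.g. a subdirectory named '2019.02.23'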
# build list of remote files, remote modification times and local files
remote_files = []
remote_mtimes = []
local_files = []
# build lists of files or use existing index file
if INDEX:
# read the index file, split at lines and remove all commented lines
INDEX = pathlib.Path(INDEX).expanduser().absolute()
with INDEX.open(mode='r', encoding='utf8') as f:
files = [i for i in f.read().splitlines() if re.match(r'^(?!\#)',i)]
# regular expression operator for extracting information from files
rx = re.compile(r'(ATL\d{2})(-\d{2})?_(\d{4})(\d{2})(\d{2})(\d{2})'
r'(\d{2})(\d{2})_(\d{4})(\d{2})(\d{2})_(\d{3})_(\d{2})(.*?).h5$')
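        # each uncommented line of the index is expected to be an
        # ICESat-2 granule filename, e.g. (illustrative):
        #   ATL06_20190223232535_08780212_006_02.h5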
# for each line in the index
for f in files:
# extract parameters from ICESat-2 ATLAS HDF5 file
PRD,HEM,YY,MM,DD,HH,MN,SS,TRK,CYC,GRN,RL,VRS,AUX=rx.findall(f).pop()
# get directories from remote directory
product_directory = f'{PRD}.{RL}'
sd = f'{YY}.{MM}.{DD}'
PATH = [HOST,'ATLAS',product_directory,sd]
remote_dir = posixpath.join(HOST,'ATLAS',product_directory,sd)
# local directory
local_dir = pathlib.Path(DIRECTORY).expanduser().absolute()
if not FLATTEN:
local_dir = local_dir.joinpath(product_directory,sd)
# find ICESat-2 data file to get last modified time
# find matching files (for granule, release, version, track)
names,lastmod,error = is2tk.utilities.nsidc_list(PATH,
build=False,
timeout=TIMEOUT,
parser=parser,
pattern=f.strip()
)
# print if file was not found
if not names:
logging.critical(error)
continue
# add to lists
for colname,remote_mtime in zip(names,lastmod):
# remote and local versions of the file
remote_files.append(posixpath.join(remote_dir,colname))
local_files.append(local_dir.joinpath(colname))
remote_mtimes.append(remote_mtime)
else:
# for each ICESat-2 product listed
for p in PRODUCTS:
logging.info(f'PRODUCT={p}')
# get directories from remote directory
product_directory = f'{p}.{RELEASE}'
PATH = [HOST,'ATLAS',product_directory]
# compile regular expression operator
if p in ('ATL11',):
R1 = re.compile(ATL11_pattern.format(p,regex_track,
regex_granule,RELEASE,regex_version,regex_suffix))
elif p in ('ATL14','ATL15'):
regex_region = '|'.join(REGION)
R1 = re.compile(ATL1415_pattern.format(p,regex_region,
RELEASE,regex_version,regex_suffix))
else:
R1 = re.compile(default_pattern.format(p,regex_track,
regex_cycle,regex_granule,RELEASE,regex_version,
regex_suffix))
# read and parse request for subdirectories (find column names)
remote_sub,_,error = is2tk.utilities.nsidc_list(PATH,
build=False,
timeout=TIMEOUT,
parser=parser,
pattern=R2,
sort=True)
# print if subdirectory was not found
if not remote_sub:
logging.critical(error)
continue
# for each remote subdirectory
for sd in remote_sub:
# local directory
local_dir = pathlib.Path(DIRECTORY).expanduser().absolute()
if not FLATTEN:
local_dir = local_dir.joinpath(product_directory,sd)
logging.info(f"Building file list: {sd}")
# find ICESat-2 data files
PATH = [HOST, 'ATLAS', product_directory, sd]
remote_dir = posixpath.join(HOST,'ATLAS',product_directory,sd)
# find matching files (for granule, release, version, track)
names,lastmod,error = is2tk.utilities.nsidc_list(PATH,
build=False,
timeout=TIMEOUT,
parser=parser,
pattern=R1,
sort=True
)
# print if file was not found
if not names:
logging.critical(error)
continue
# build lists of each ICESat-2 data file
for colname,remote_mtime in zip(names,lastmod):
# remote and local versions of the file
remote_files.append(posixpath.join(remote_dir,colname))
local_files.append(local_dir.joinpath(colname))
remote_mtimes.append(remote_mtime)
# sync in series if PROCESSES = 0
if (PROCESSES == 0):
# sync each ICESat-2 data file
for i,remote_file in enumerate(remote_files):
# sync ICESat-2 files with NSIDC server
args = (remote_file, remote_mtimes[i], local_files[i])
kwds = dict(TIMEOUT=TIMEOUT,
RETRY=RETRY,
LIST=LIST,
CLOBBER=CLOBBER,
CHECKSUM=CHECKSUM,
MODE=MODE
)
output = http_pull_file(*args, **kwds)
# print the output string
            if output:
                logging.info(output)
else:
# set multiprocessing start method
ctx = mp.get_context("fork")
# sync in parallel with multiprocessing Pool
pool = ctx.Pool(processes=PROCESSES)
# sync each ICESat-2 data file
out = []
for i,remote_file in enumerate(remote_files):
# sync ICESat-2 files with NSIDC server
args = (remote_file, remote_mtimes[i], local_files[i])
kwds = dict(TIMEOUT=TIMEOUT,
RETRY=RETRY,
LIST=LIST,
CLOBBER=CLOBBER,
CHECKSUM=CHECKSUM,
MODE=MODE
)
out.append(pool.apply_async(multiprocess_sync,
args=args,kwds=kwds))
# start multiprocessing jobs
# close the pool
# prevents more tasks from being submitted to the pool
pool.close()
# exit the completed processes
pool.join()
# print the output string
for output in out:
            temp = output.get()
            if temp:
                logging.info(temp)
# close log file and set permissions level to MODE
if LOG:
LOGFILE.chmod(mode=MODE)

# PURPOSE: wrapper for running the sync program in multiprocessing mode
def multiprocess_sync(*args, **kwds):
try:
output = http_pull_file(*args, **kwds)
except Exception as exc:
# if there has been an error exception
# print the type, value, and stack trace of the
# current exception being handled
logging.critical(f'process id {os.getpid():d} failed')
logging.error(traceback.format_exc())
else:
return output

# PURPOSE: pull file from a remote host checking if file exists locally
# and if the remote file is newer than the local file
# or if the checksums do not match between the files
def http_pull_file(remote_file, remote_mtime, local_file, TIMEOUT=None,
RETRY=1, LIST=False, CLOBBER=False, CHECKSUM=False, MODE=0o775):
# check if data directory exists and recursively create if not
local_file = pathlib.Path(local_file).expanduser().absolute()
local_file.parent.mkdir(mode=MODE, parents=True, exist_ok=True)
# chunked transfer encoding size
CHUNK = 16 * 1024
# if file exists in file system: check if remote file is newer
TEST = False
OVERWRITE = ' (clobber)'
    # check if local version of file exists
if CHECKSUM and local_file.exists():
# generate checksum hash for local file
# open the local_file in binary read mode
local_hash = is2tk.utilities.get_hash(local_file)
# generate checksum hash for remote file
kwds = dict(TIMEOUT=TIMEOUT, RETRY=RETRY, CHUNK=CHUNK)
remote_buffer = retry_download(remote_file, **kwds)
remote_hash = is2tk.utilities.get_hash(remote_buffer)
# compare checksums
if (local_hash != remote_hash):
TEST = True
OVERWRITE = f' (checksums: {local_hash} {remote_hash})'
elif local_file.exists():
# check last modification time of local file
local_mtime = local_file.stat().st_mtime
# if remote file is newer: overwrite the local file
if (remote_mtime > local_mtime):
TEST = True
OVERWRITE = ' (overwrite)'
else:
TEST = True
OVERWRITE = ' (new)'
# if file does not exist locally, is to be overwritten, or CLOBBER is set
if TEST or CLOBBER:
# output string for printing files transferred
output = f'{remote_file} -->\n\t{local_file}{OVERWRITE}\n'
# if executing copy command (not only printing the files)
if not LIST:
# copy bytes or transfer file
if CHECKSUM and local_file.exists():
# store bytes to file using chunked transfer encoding
remote_buffer.seek(0)
with local_file.open(mode='wb') as f:
shutil.copyfileobj(remote_buffer, f, CHUNK)
else:
retry_download(remote_file, LOCAL=local_file,
TIMEOUT=TIMEOUT, RETRY=RETRY, CHUNK=CHUNK)
# keep remote modification time of file and local access time
os.utime(local_file, (local_file.stat().st_atime, remote_mtime))
local_file.chmod(mode=MODE)
# return the output string
return output

# PURPOSE: Try downloading a file up to a set number of times
def retry_download(remote_file, LOCAL=None, TIMEOUT=None, RETRY=1, CHUNK=0):
# attempt to download up to the number of retries
retry_counter = 0
while (retry_counter < RETRY):
# attempt to retrieve file from https server
try:
# Create and submit request.
# There are a range of exceptions that can be thrown here
# including HTTPError and URLError.
request=is2tk.utilities.urllib2.Request(remote_file)
response=is2tk.utilities.urllib2.urlopen(request,
timeout=TIMEOUT)
# get the length of the remote file
remote_length = int(response.headers['content-length'])
# if copying to a local file
if LOCAL:
# copy contents to file using chunked transfer encoding
# transfer should work with ascii and binary data formats
LOCAL = pathlib.Path(LOCAL).expanduser().absolute()
with LOCAL.open('wb') as f:
shutil.copyfileobj(response, f, CHUNK)
local_length = LOCAL.lstat().st_size
else:
# copy remote file contents to bytesIO object
remote_buffer = io.BytesIO()
shutil.copyfileobj(response, remote_buffer, CHUNK)
local_length = remote_buffer.getbuffer().nbytes
        except Exception:
            # any download exception (e.g. HTTPError, URLError, timeout)
            # falls through to the retry counter below
            pass
else:
# check that downloaded file matches original length
if (local_length == remote_length):
break
# add to retry counter
retry_counter += 1
# check if maximum number of retries were reached
if (retry_counter == RETRY):
raise TimeoutError('Maximum number of retries reached')
# return the bytesIO object
if not LOCAL:
# rewind bytesIO object to start
remote_buffer.seek(0)
return remote_buffer
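# usage sketch (illustrative; url and 'granule.h5' are hypothetical):
#   buffer = retry_download(url, TIMEOUT=120, RETRY=5, CHUNK=16*1024)
# downloads to an in-memory buffer, while passing LOCAL='granule.h5'
# streams the file to disk instead of returning a buffer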

# PURPOSE: create argument parser
def arguments():
parser = argparse.ArgumentParser(
description="""Acquires ICESat-2 datafiles from the National Snow and
Ice Data Center (NSIDC)
"""
)
# command line parameters
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument('products',
metavar='PRODUCTS', type=str, nargs='*', default=[],
help='ICESat-2 products to sync')
# NASA Earthdata credentials
parser.add_argument('--user','-U',
type=str, default=os.environ.get('EARTHDATA_USERNAME'),
help='Username for NASA Earthdata Login')
parser.add_argument('--password','-W',
type=str, default=os.environ.get('EARTHDATA_PASSWORD'),
help='Password for NASA Earthdata Login')
parser.add_argument('--netrc','-N',
type=pathlib.Path,
default=pathlib.Path.home().joinpath('.netrc'),
help='Path to .netrc file for authentication')
# working data directory
parser.add_argument('--directory','-D',
type=pathlib.Path,
default=pathlib.Path.cwd(),
help='Working data directory')
# years of data to sync
parser.add_argument('--year','-Y',
type=int, nargs='+',
help='Years to sync')
# subdirectories of data to sync
parser.add_argument('--subdirectory','-S',
type=str, nargs='+',
help='subdirectories of data to sync')
# ICESat-2 data release
parser.add_argument('--release','-r',
type=str, default='006',
help='ICESat-2 Data Release')
# ICESat-2 data version
parser.add_argument('--version','-v',
type=int, nargs='+',
help='ICESat-2 Data Version')
# ICESat-2 granule region
region = parser.add_mutually_exclusive_group(required=False)
region.add_argument('--granule','-g',
metavar='GRANULE', type=int, nargs='+',
choices=range(1,15), default=range(1,15),
help='ICESat-2 Granule Region')
# ICESat-2 orbital cycle
parser.add_argument('--cycle','-c',
type=int, nargs='+', default=None,
help='ICESat-2 orbital cycles to sync')
# ICESat-2 ATL14 and 15 named regions
ATL1415_regions = ['AA','AK','CN','CS','GL','IC','SV','RU']
region.add_argument('--region','-n',
metavar='REGION', type=str, nargs='+',
choices=ATL1415_regions, default=['AA','GL'],
help='ICESat-2 Named Region (ATL14/ATL15)')
# ICESat-2 reference ground tracks
parser.add_argument('--track','-t',
metavar='RGT', type=int, nargs='+',
choices=range(1,1388), default=range(1,1388),
help='ICESat-2 Reference Ground Tracks (RGTs)')
# sync auxiliary files
parser.add_argument('--auxiliary','-a',
default=False, action='store_true',
help='Sync ICESat-2 auxiliary files for each HDF5 file')
# sync using files from an index
group.add_argument('--index','-i',
type=pathlib.Path,
help='Input index of ICESat-2 files to sync')
# output subdirectories
parser.add_argument('--flatten','-F',
default=False, action='store_true',
help='Do not create subdirectories')
# run sync in series if processes is 0
parser.add_argument('--np','-P',
metavar='PROCESSES', type=int, default=0,
help='Number of processes to use in file downloads')
# connection timeout and number of retry attempts
parser.add_argument('--timeout','-T',
type=int, default=120,
help='Timeout in seconds for blocking operations')
parser.add_argument('--retry','-R',
type=int, default=5,
help='Connection retry attempts')
# Output log file in form
    # NSIDC_ICESat-2_sync_2002-04-01.log
parser.add_argument('--log','-l',
default=False, action='store_true',
help='Output log file')
# sync options
parser.add_argument('--list','-L',
default=False, action='store_true',
help='Only print files that could be transferred')
# clobber will overwrite the existing data
parser.add_argument('--clobber','-C',
default=False, action='store_true',
help='Overwrite existing data')
parser.add_argument('--checksum',
default=False, action='store_true',
help='Compare hashes to check for overwriting existing data')
# permissions mode of the local directories and files (number in octal)
parser.add_argument('--mode','-M',
type=lambda x: int(x,base=8), default=0o775,
help='Permissions mode of output files')
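    # note: with the octal parser above, e.g. '--mode 775' (illustrative)
    # is interpreted as permissions 0o775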
# return the parser
return parser

# This is the main part of the program that calls the individual functions
def main():
# Read the system arguments listed after the program
parser = arguments()
args,_ = parser.parse_known_args()
# NASA Earthdata hostname
HOST = 'urs.earthdata.nasa.gov'
# build a urllib opener for NASA Earthdata
# check internet connection before attempting to run program
opener = is2tk.utilities.attempt_login(HOST,
username=args.user, password=args.password,
netrc=args.netrc)
# check NASA earthdata credentials before attempting to run program
nsidc_icesat2_sync(args.directory, args.products, args.release,
args.version, args.granule, args.track, YEARS=args.year,
SUBDIRECTORY=args.subdirectory, CYCLES=args.cycle,
REGION=args.region, AUXILIARY=args.auxiliary, INDEX=args.index,
FLATTEN=args.flatten, PROCESSES=args.np, TIMEOUT=args.timeout,
RETRY=args.retry, LOG=args.log, LIST=args.list,
CLOBBER=args.clobber, CHECKSUM=args.checksum, MODE=args.mode)

# run main program
if __name__ == '__main__':
main()