Hi there,
I am running into trouble processing a scene with sarsen because of enormous memory consumption. I get these warnings:
```
distributed/client.py:3363: UserWarning: Sending large graph of size 1.94 GiB.
This may cause some slowdown.
Consider loading the data with Dask directly
or using futures or delayed objects to embed the data into the graph without repetition.
See also https://docs.dask.org/en/stable/best-practices.html#load-data-with-dask for more information.
  warnings.warn(
2025-08-01 13:05:30,034 - distributed.worker.memory - WARNING - Unmanaged memory use is high. This may indicate a memory leak or the memory may not be released to the OS; see https://distributed.dask.org/en/latest/worker-memory.html#memory-not-released-back-to-the-os for more information. -- Unmanaged memory: 46.49 GiB -- Worker memory limit: 64.00 GiB
2025-08-01 13:09:51,310 - distributed.worker.memory - WARNING - Worker is at 80% memory usage. Pausing worker. Process memory: 51.58 GiB -- Worker memory limit: 64.00 GiB
2025-08-01 13:09:51,474 - distributed.worker.memory - WARNING - Worker is at 79% memory usage. Resuming worker. Process memory: 50.70 GiB -- Worker memory limit: 64.00 GiB
```
I am passing a DEM in UTM projection with a 100 m resolution, which only covers part of the scene, so the sheer amount of input data should not be the problem.
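As a rough sanity check on the DEM size, here is a back-of-envelope footprint calculation (the extents are hypothetical placeholders, since I can't share the exact scene):

```python
# Back-of-envelope DEM memory footprint for a hypothetical 300 km x 200 km
# extent at 100 m posting, stored as float32 (4 bytes per sample).
rows = 300_000 // 100   # 3000 rows
cols = 200_000 // 100   # 2000 columns
bytes_total = rows * cols * 4
print(f"{bytes_total / 1e6:.0f} MB")
```

Even a generous extent at this resolution is on the order of tens of megabytes, nowhere near the multi-GiB graph being reported.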
Here's my function call:
```python
rtc = terrain_correction(
    product=product,
    dem_urlpath=dem,
    output_urlpath=gt,
    simulated_urlpath=lc,
    correct_radiometry='gamma_bilinear',
    interp_method='linear',
    grouping_area_factor=(1, 1),
    chunks=512,
    radiometry_chunks=512,
    enable_dask_distributed=True,
)
```
Any ideas what could be going wrong?