@snowman2, thanks for pointing to these options. I tried the options you suggested, but they did not help release the memory.
However, when I store the entire raster to zarr storage with `to_zarr()` and load it with `raster = xarray.open_zarr(...)`, I don't see any memory leaks when iterating through the data variables. This looks like:
```python
import gc

import rioxarray as rxr
import xarray as xr

PATH = "path_to_multi_band_vrt.vrt"


def no_memory_leak():
    # Read from the VRT and save to zarr (one chunk per band).
    rxr.open_rasterio(
        PATH, band_as_variable=True, chunks={"x": -1, "y": -1}
    ).to_zarr(some_temp_dataset)

    # Open the zarr store and iterate over the data variables.
    raster = xr.open_zarr(some_temp_dataset, chunks={"x": -1, "y": -1})
    bands = list(raster.data_vars)
    for band in bands:
        data = raster[band].copy(deep=True).load()
        del data
        gc.collect()
```
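For reference, the direct-from-VRT pattern that does leak for me would look roughly like the sketch below (illustrative only; `leaks_memory` is a hypothetical name, and this is not the exact code from the original report):

```python
import gc


def leaks_memory(path):
    # Illustrative sketch: open the multi-band VRT directly with rioxarray
    # and load each band; in my runs this pattern does not release memory
    # between iterations.
    import rioxarray as rxr  # imported lazily so the sketch stays self-contained

    raster = rxr.open_rasterio(
        path, band_as_variable=True, chunks={"x": -1, "y": -1}
    )
    for band in raster.data_vars:
        data = raster[band].copy(deep=True).load()
        del data
        gc.collect()
```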
#### Code Sample, a copy-pastable example if possible

A "Minimal, Complete and Verifiable Example" will make it much easier for maintainers to help you:
http://matthewrocklin.com/blog/work/2018/02/28/minimal-bug-reports
#### Problem description
The allocated memory increases after each iteration.
#### Expected Output
The memory is released after each iteration, so one can process multi-band datasets that do not fit in memory.
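To make "memory is released after each iteration" checkable, one can watch the peak traced allocation per iteration. The sketch below uses only the stdlib `tracemalloc` module with a `bytearray` as a stand-in for a loaded band (`peak_per_iteration`, the iteration count, and the sizes are illustrative, not from the issue):

```python
import gc
import tracemalloc


def peak_per_iteration(n_iters=3, band_bytes=5_000_000):
    """Record the peak traced memory for each simulated band load.

    A plain bytearray stands in for a band's pixel array; in the real
    case, ``data = raster[band].load()`` plays this role.
    """
    peaks = []
    for _ in range(n_iters):
        tracemalloc.start()
        data = bytearray(band_bytes)  # stand-in for loading one band
        del data                      # release the band
        gc.collect()
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        peaks.append(peak)
    return peaks


# If memory is really released each iteration, the per-iteration peaks
# stay flat instead of growing with every band processed.
print(peak_per_iteration())
```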
#### Environment Information

Conda environment information (if you installed with conda):

Environment (`conda list`):