This repository has been archived by the owner on Apr 17, 2023. It is now read-only.

Feature requests #9

Open
11 of 22 tasks
appelmar opened this issue Nov 21, 2018 · 7 comments
Labels
enhancement New feature or request

Comments

@appelmar
Owner

appelmar commented Nov 21, 2018

This issue collects ideas for important features to be added. Highlighted items should be considered for the next minor release.

General Features:

  • Import data cubes from NetCDF file(s)
  • Automatic optimization of data cube graphs (by reordering)
  • Cancel function for long-running operations
  • NetCDF export parameters chunking and compression
  • Masking on images before applying aggregation in image_collection_cube::read_chunk()
  • Chunk caching with JSON serialization as key

Data cube operations:

  • Band arithmetic
  • Filter pixels by predicates on band values
  • Reduction over space
  • cumulate_time
  • window_space
  • Rechunking
  • Zonal statistics
  • which_min and which_max reducers
  • fill_time operator
  • fill_space operators
  • user-defined reducers

Image Collection:

  • store per-band and per-image metadata in image collections (optional)
  • implement per-image metadata filters (e.g. cloud coverage) (optional)
  • Support input images without GeoTransform but GCPs (e.g. Sentinel 1)

CLI

  • gdalcubes translate tool to apply gdal_translate to all GDAL datasets of a given image collection
  • gdalcubes addo tool to compute GDAL overviews with gdaladdo on all GDAL datasets of a given image collection
@appelmar added the enhancement (New feature or request) label Nov 23, 2018
@DaChro

DaChro commented Feb 20, 2019

  • apply cloud mask during data cube creation

@pierreroudier

  • Ability to extend the reducers (e.g. custom R functions)

@appelmar
Owner Author

@pierreroudier there is actually a (not very user-friendly) workaround for this in the R package with the chunk_apply() function. However, I am planning to improve this by letting users pass a function to reduce_time and reduce_space.

For example, calling something like reduce_time(x, function(y) { summary(y) }) on a single-band input cube x would return a 6-band cube with time series summaries.

More generally, y would be the complete time series of one pixel, containing all bands, and the function would return one or more values that are interpreted as new bands of the resulting cube.
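As a sketch of what that could look like once implemented (the FUN argument to reduce_time and the matrix layout of y are assumptions based on the description above, not a released API):

```r
library(gdalcubes)

# Hypothetical sketch of the planned interface: pass an R function instead
# of a reducer expression string. For every pixel, y would be a matrix with
# one row per band and one column per time step; the values returned by the
# function become the bands of the resulting cube (here, the six statistics
# of summary() would yield a 6-band cube).
reduce_time(x, FUN = function(y) {
  as.numeric(summary(y[1, ]))  # x has a single band, so use row 1
})
```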

I've added this to the feature request list on top of the issue.

@pierreroudier

@appelmar Excellent. If that helps, my use case would be to reduce a time series of imagery using a non-standard reducer (i.e. other than the usual mean, median, etc.).

@mdsumner

mdsumner commented Oct 28, 2019

One thing I'm hoping to do is allow assigning a CRS to a collection series; it seems that none of our data-library NetCDF files have this recognized by GDAL itself. It should be enough to apply GDALSetProjection, but I can't yet see where the right place for this is in gdalcubes.

@appelmar I know you said it wasn't possible, but I can't see why? And it might be easy for you to apply this. I consider it the responsibility of a configuration author, so it could be a new field in there: a full WKT input CRS, otherwise an error for a source that doesn't have one.

I've experimented with this using the warper and it seems fine:

https://github.com/hypertidy/vapour/blob/f3bc80ddeeef1e7148c6cab2ff81353deea429ad/src/raster_warp.cpp#L32

@appelmar
Owner Author

@mdsumner Thanks, the current dev branches of gdalcubes and gdalcubes_R now support the definition of a global SRS in image collection formats (see example here). This field is of course optional and makes sense only for data products that use the same SRS for all images. During the creation of an image collection, all files are expected to use the provided SRS; if a file has a different SRS according to the GDAL metadata, a warning is given.
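A collection format carrying such a global SRS might look like the following fragment (the field names, including "srs" itself, are illustrative assumptions rather than the actual gdalcubes collection format schema):

```json
{
  "description": "Example format for a product where all images share one SRS",
  "srs": "EPSG:4326",
  "pattern": ".*\\.nc$"
}
```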

Here is an R example, using the example collection format and data from https://www.ncei.noaa.gov/data/sea-surface-temperature-optimum-interpolation/access/avhrr-only/198307/:

library(gdalcubes)
library(magrittr)

# index all NetCDF files of the product using the collection format
files = list.files("/path/to/AVHRR_Only", pattern=".nc", full.names = TRUE)
x = create_image_collection(files, "/path/to/NOAA_OISST_AVHRR_Only.json")

# build a daily cube (500 pixels wide) in EPSG:4326, select the sst band, and plot
raster_cube(x, cube_view(extent=x, srs="EPSG:4326", nx=500, dt="P1D")) %>%
  select_bands(c("sst")) %>%
  plot(key.pos=1, col=viridis::viridis)

@mdsumner

Oh nice, thanks!
