
cog-best-practices

Best practices with Cloud Optimized GeoTIFFs (COGs)

The goal of this repository is to determine best practices for accessing the increasing amount of COG data with Pangeo tooling (GDAL, Rasterio, Xarray, Dask).

A Cloud Optimized GeoTIFF (COG) is a regular GeoTIFF file, aimed at being hosted on an HTTP file server (or cloud object storage like S3), with an internal organization that enables more efficient workflows on the cloud. It does this by leveraging the ability of clients to issue HTTP GET range requests for just the parts of a file they need. Read more at https://www.cogeo.org
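
To see this internal organization without downloading the whole file, you can open a COG remotely with Rasterio and inspect its tiling and overviews. This is a minimal sketch, assuming Rasterio is installed; the URL is a placeholder, not a real file.

```python
import rasterio

url = "https://example.com/data/scene.tif"  # placeholder COG URL

with rasterio.open(url) as src:
    # Only the file header and requested metadata are fetched via HTTP GET
    # range requests; the full raster is never downloaded.
    print(src.profile)       # driver, dtype, dimensions, tiling, compression
    print(src.block_shapes)  # internal tile size per band, e.g. [(512, 512)]
    print(src.overviews(1))  # overview (reduced-resolution) levels for band 1
```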

One great use-case for COGs is downloading small pieces of a big file to your laptop. Another is accessing COGs over very efficient network connections from within the same datacenter where they are stored.
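
For the first use-case, a windowed read with Rasterio pulls down only the tiles that intersect the requested window. The sketch below is illustrative only: the URL and window coordinates are placeholders.

```python
import rasterio
from rasterio.windows import Window

url = "https://example.com/data/scene.tif"  # placeholder COG URL

with rasterio.open(url) as src:
    # Read a 256x256 pixel chunk starting at column 1024, row 2048.
    # GDAL issues HTTP GET range requests only for the internal tiles
    # that intersect this window.
    window = Window(col_off=1024, row_off=2048, width=256, height=256)
    data = src.read(1, window=window)
    print(data.shape)  # (256, 256)
```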

This repository focuses on distributed computing within the same datacenter, using this great new AWS public dataset in us-west-2: https://registry.opendata.aws/sentinel-1/ (Sentinel-1 Synthetic Aperture Radar images covering the United States).
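
Within us-west-2, a typical pattern is to open a COG from S3 lazily with rioxarray so Dask can fetch chunks in parallel. The sketch below is not tied to a specific Sentinel-1 object: the S3 key is a placeholder, and anonymous access (AWS_NO_SIGN_REQUEST) is an assumption; adjust credentials and the path to match the actual dataset.

```python
import rasterio
import rioxarray

url = "s3://example-bucket/path/to/sentinel1-scene.tif"  # placeholder key

with rasterio.Env(AWS_NO_SIGN_REQUEST="YES", AWS_REGION="us-west-2"):
    # chunks= returns a lazy, Dask-backed DataArray; pixels are only read
    # from S3 when a computation is triggered.
    da = rioxarray.open_rasterio(url, chunks={"x": 1024, "y": 1024})
    print(da)
```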

Computing environment

We can use Pangeo Cloud and Pangeo Binder on AWS us-west-2 to iterate on examples in a common computing environment. Click the badge below to run the notebooks in this repository interactively via Pangeo Binder on AWS:

[Pangeo Binder launch badge]

For notebooks that don't require Dask clusters, you can use mybinder.org (which runs in GCP and other data centers) with limited compute resources:

[mybinder.org launch badge]

Organization

For starters, there are four notebooks in this repository with the following focus areas:

  1. Accessing a single COG
  2. Working with multiple COGs (concatenated in time; see the sketch after this list)
  3. Dask LocalCluster
  4. Dask GatewayCluster
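
The pattern behind notebooks 2 and 3 looks roughly like the sketch below: open several COGs lazily, concatenate them along a new time dimension, and run a simple reduction on a local Dask cluster. The file URLs, dates, chunk sizes, and worker count are all placeholders, not values from the notebooks.

```python
import pandas as pd
import rioxarray
import xarray as xr
from dask.distributed import Client, LocalCluster

urls = [
    "https://example.com/data/scene-2020-01-01.tif",  # placeholder COGs
    "https://example.com/data/scene-2020-01-13.tif",
]
times = pd.to_datetime(["2020-01-01", "2020-01-13"])

# Notebook 3 pattern: a local Dask cluster for parallel reads and reductions.
cluster = LocalCluster(n_workers=4)
client = Client(cluster)

# Notebook 2 pattern: each COG becomes a lazy, Dask-backed DataArray, then
# concat adds a labeled "time" dimension.
arrays = [rioxarray.open_rasterio(u, chunks={"x": 1024, "y": 1024}) for u in urls]
stack = xr.concat(arrays, dim=pd.Index(times, name="time"))

# A commonplace operation: the mean over time, computed in parallel.
mean = stack.mean(dim="time").compute()
```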

Unit tests and examples are often simplified to an extreme and consequently fail to translate to real-world use. At the other extreme, full scientific analyses or large-scale computations are complex and difficult to follow. The goal of these examples is to explore the middle ground: simple operations that are commonplace on ~10-1000 GB datasets.

Goals

  1. Figure out ways to improve the efficiency and clarity of these notebooks (this might involve opening issues and pull requests in other projects)
  2. Add new notebooks for common workflows, e.g. creating COGs (see the sketch below), rechunking COGs, applying custom functions, reprojection...
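
As a starting point for the "creating COGs" workflow in goal 2, one common approach is the rio-cogeo package. This is a minimal sketch with placeholder input/output paths, using the built-in "deflate" profile.

```python
from rio_cogeo.cogeo import cog_translate
from rio_cogeo.profiles import cog_profiles

# Convert an ordinary GeoTIFF into a tiled, overviewed, DEFLATE-compressed COG.
cog_translate(
    "input.tif",       # placeholder source GeoTIFF
    "output-cog.tif",  # placeholder destination COG
    cog_profiles.get("deflate"),
)
```

With a recent GDAL (3.1 or later), the COG driver offers a similar command-line route via gdal_translate with -of COG.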
