This repository has been archived by the owner on Dec 5, 2019. It is now read-only.

Jupyter notebooks demonstrating use of S3, RDS, Datacube Core & PyCCD


repository-preservation/aws-pyccd

 
 


About

Project for running Open Data Cube on AWS.

Setup

This project requires Miniconda: https://conda.io/docs/install/quick.html

conda config --add channels conda-forge

conda create --name aws-pyccd python=3.5 datacube

source activate aws-pyccd

pip install lcmap-pyccd

conda install jupyter matplotlib scipy -y
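Once the environment is active, a quick sanity check (not part of the project; module names are assumptions — for example, lcmap-pyccd is imported as `ccd`) can confirm the key packages are importable:

```python
# Report which of the packages installed above are importable in the
# active environment. "ccd" is the assumed import name for lcmap-pyccd;
# boto3 is assumed to be pulled in as a dependency.
import importlib.util

required = ["datacube", "ccd", "boto3", "matplotlib", "scipy"]
missing = [name for name in required if importlib.util.find_spec(name) is None]

if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All required packages found.")
```

If anything is reported missing, re-run the corresponding `conda install` or `pip install` step above.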

Credentials

This project uses Boto3 to work with AWS. You will need to add credentials to ~/.aws/config that allow you to write to an S3 bucket.

[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
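As a hedged sanity check, the file format above can be validated with the standard library before relying on Boto3. Here the file contents are inlined as a sample string; in practice you would read `~/.aws/config` instead:

```python
# Parse an AWS-config-style INI file and check the expected keys exist.
# The sample string mirrors the snippet above.
import configparser

sample = """\
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
"""

parser = configparser.ConfigParser()
parser.read_string(sample)

assert parser.has_section("default")
print(parser["default"]["aws_access_key_id"])  # → YOUR_ACCESS_KEY
```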


Usage

Start the notebook server.

bin/notebook-server

If AWS RDS is not available, you can run a local Postgres instance in Docker instead:

docker run --name some-postgres -p 5432:5432 -e POSTGRES_PASSWORD=mysecretpassword postgres
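To point Datacube at the local container, a minimal `~/.datacube.conf` could look like the following sketch. The database name `datacube` and the `postgres` username are assumptions consistent with the `docker run` command above; adjust them to match your setup:

```ini
[datacube]
db_hostname: localhost
db_port: 5432
db_database: datacube
db_username: postgres
db_password: mysecretpassword
```

After creating the database and writing this config, the schema can be initialised with `datacube system init`.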

License

TBD
