This repository contains a set of Bash scripts that make up a data pipeline, designed to automate the process of interacting with an SEL-735 meter. The pipeline is divided into two main executable scripts:
- `data_pipeline.sh`: Handles the first five steps:
  - Connecting to the meter via FTP
  - Checking for new event files
  - Downloading event files
  - Organizing and creating metadata
  - Compressing event data
- `archive_pipeline.sh`: Handles the final step:
  - Archiving and transferring event data to the Data Acquisition System (DAS)
  - Notifying of new data transfers via MQTT
Ensure you have the following before running the pipeline:
- A Unix-like environment (Linux, macOS, or a Unix-like Windows terminal)
- The ability to `ssh` to the ot-dev and das.lab.acep.uaf.edu servers
- FTP credentials for the meter
- MQTT configuration
- Meter configuration
- The following tools installed on `camio-ot-dev`:
  - `lftp` — FTP operations
  - `yq` — YAML config files
  - `zip` — compressing data
  - `rsync` — transferring data
  - `mosquitto-clients` — MQTT client
  - `jq` — JSON for the MQTT payload
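As a quick sanity check before the first run, a short loop can confirm the required tools are on the `PATH`. This is a convenience sketch, not part of the repository; note that the `mosquitto-clients` package provides the `mosquitto_pub` command checked below.

```shell
# Sanity check: report any required pipeline tool missing from PATH.
# (The mosquitto-clients package provides the mosquitto_pub command.)
check_tools() {
  local missing=0
  for tool in "$@"; do
    if ! command -v "$tool" >/dev/null 2>&1; then
      echo "Missing: $tool"
      missing=1
    fi
  done
  return "$missing"
}

check_tools lftp yq zip rsync mosquitto_pub jq \
  || echo "Install the tools listed above before running the pipeline."
```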
- You must be connected to the `camio-ot-dev` server. See camio-ot-dev (SSH) in the ACEP Wiki.
- Clone the repository and move into the `cli_meter` directory:

  ```bash
  git clone git@github.com:acep-uaf/camio-meter-streams.git
  cd camio-meter-streams/cli_meter
  ```

  Note: You can check your SSH connection with `ssh -T git@github.com`.
- Navigate to the `config` directory and copy the `config.yml.example` file to a new `config.yml` file:

  ```bash
  cd config
  cp config.yml.example config.yml
  ```

- Update the `config.yml` file with the FTP credentials and meter configuration data.
- Secure the `config.yml` file so that only the owner can read and write:

  ```bash
  chmod 600 config.yml
  ```
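To confirm the permissions took effect, `stat` can report the file mode. The helper below is a sketch, not part of the repository, and the `stat -c` format flag is GNU coreutils syntax (on macOS, `stat -f '%Lp'` prints the same octal mode):

```shell
# Sanity check: confirm a config file is owner read/write only (mode 600).
# Note: stat -c is GNU coreutils syntax; on macOS use stat -f '%Lp' instead.
check_mode_600() {
  local mode
  mode=$(stat -c '%a' "$1") || return 1
  if [ "$mode" != "600" ]; then
    echo "Warning: $1 has mode $mode, expected 600"
    return 1
  fi
}
```

For example, run `check_mode_600 config.yml` from the `config` directory after the `chmod`.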
- Navigate to the `config` directory and copy the `archive_config.yml.example` file to a new `archive_config.yml` file:

  ```bash
  cd config
  cp archive_config.yml.example archive_config.yml
  ```

- Update the `archive_config.yml` file with the necessary broker information, as well as source and destination details.
- Secure the `archive_config.yml` file so that only the owner can read and write:

  ```bash
  chmod 600 archive_config.yml
  ```
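The broker information in `archive_config.yml` feeds the MQTT notification step. As an illustration only — the broker host, topic, and payload fields below are made-up placeholders, not the pipeline's actual values — a JSON notification can be built with `jq` and published with `mosquitto_pub` like this:

```shell
# Illustration only: build and publish a JSON transfer notification.
# The broker host, topic, and payload fields are placeholders.
payload=$(jq -n --arg file "example_event.zip" '{event: "new_data", file: $file}')

mosquitto_pub -h mqtt.example.org -t "meter/events" -m "$payload" 2>/dev/null \
  || echo "Publish skipped (no broker reachable); payload was: $payload"
```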
To run the data pipeline and then transfer data to the Data Acquisition System (DAS):
- **Run the Data Pipeline First**

  Execute the `data_pipeline` script from the `cli_meter` directory. The script requires a configuration file specified via the `-c`/`--config` flag. If this is your first time running the pipeline, the initial download may take a few hours. To pause the download safely, see: How to Stop the Pipeline.

  ```bash
  ./data_pipeline.sh -c /path/to/config.yml
  ```

  Optionally, you can use the `-d`/`--download_dir` flag to override the download directory from the config file:

  ```bash
  ./data_pipeline.sh -c /path/to/config.yml -d /path/to/download/dir/
  ```

- **Run the Archive Pipeline**

  After the `data_pipeline` script completes, execute the `archive_pipeline` script from the `cli_meter` directory. The script requires a configuration file specified via the `-c`/`--config` flag:

  ```bash
  ./archive_pipeline.sh -c /path/to/archive_config.yml
  ```
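The two stages can also be chained so the archive stage only runs if the download stage succeeds. The wrapper below is a sketch, not part of the repository; it assumes it is run from the `cli_meter` directory with both config files in a single directory:

```shell
# Sketch (not part of the repo): run both stages in order,
# skipping the archive stage if the data stage fails.
run_pipeline() {
  local config_dir="$1"
  ./data_pipeline.sh -c "$config_dir/config.yml" &&
    ./archive_pipeline.sh -c "$config_dir/archive_config.yml"
}
```

For example, `run_pipeline config` from inside `cli_meter`.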
When you need to stop the pipeline:
- **To Stop Safely/Pause Download:**
  - Use `Ctrl+C` to interrupt the process.
  - If you would like to resume the download, rerun the `data_pipeline` command. The download will resume from where it left off, provided the same config file (`-c`) and download path (`-d`) are used.
- **Avoid Using `Ctrl+Z`:**
  - Do not use `Ctrl+Z` to suspend the process, as it may cause the pipeline to end without properly closing the FTP connection.
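The reason `Ctrl+C` is the safe option is that it sends `SIGINT`, which a Bash script can trap to clean up before exiting, whereas `Ctrl+Z` merely suspends the process mid-operation. The pattern below is a generic illustration of `SIGINT` trapping, not the pipeline's actual cleanup code:

```shell
# Generic illustration (not the pipeline's actual code):
# trap SIGINT so an interrupt can close connections before exiting.
cleanup() {
  echo "Interrupted: closing FTP connection and saving progress..."
  exit 130   # conventional exit status for a SIGINT-terminated process
}
trap cleanup INT
```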