The purpose of this project is to perform web scraping using only a shell script. For this, I used the curl command to retrieve the content of a web page, then grep, tail, and head to filter that content down to the cryptocurrency prices alone. The result is saved directly to a file, along with the date of retrieval.
This "web scraping" part is done from the shell script scraper.sh.
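As a rough illustration of the curl + grep/head/tail pipeline described above (the URL, the HTML sample, and the grep pattern are placeholders, not the actual targets of scraper.sh):

```shell
#!/bin/sh
# Minimal sketch of the scraping pipeline. A real run would fetch the page:
#   page=$(curl -s "https://example.com/crypto")
# For illustration, a small inline HTML sample stands in for the fetched page:
page='<div class="price">$42,013.37</div>
<div class="price">$3,120.55</div>'

# Filter: extract price-shaped tokens, keep only the first match.
price=$(printf '%s\n' "$page" | grep -o '\$[0-9,.]*' | head -n 1)

# Append the date and the price to a log file, as the project does.
printf '%s,%s\n' "$(date '+%Y-%m-%d %H:%M')" "$price" >> /tmp/prices.csv
echo "$price"
```

Each pipeline stage only narrows the text from the previous one, which is why plain curl, grep, head, and tail are enough here.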
The second part of the project is to display the retrieved data on a web dashboard created in Python with the Dash library.
This part is done with the Python script dashboard.py.
The third part of this project is to produce a daily report every day at 8pm, which must include the opening and closing prices, the evolution of the price, and its volatility. Using cron, a Python script is executed at 8pm and generates that daily report from the prices recorded since 8pm the previous day (cryptocurrencies have no official open or close, so the 8pm-to-8pm window stands in for a trading day).
This part is done with the Python script report.py.
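The arithmetic behind those report figures can be sketched from the shell over the CSV the scraper writes (an illustration only, not the actual report.py code; the file name, layout, and sample prices below are hypothetical, and volatility is taken here as the standard deviation of the logged prices):

```shell
#!/bin/sh
# Sketch of the daily statistics: open = first price after 8pm,
# close = last price seen, evolution in percent, volatility as the
# population standard deviation.
cat > /tmp/prices_sample.csv <<'EOF'
2024-01-01 20:02,42000.00
2024-01-01 20:04,42100.00
2024-01-02 19:58,42300.00
EOF

report=$(awk -F',' '
{
  p = $2 + 0                     # price is the second CSV field
  if (NR == 1) open = p          # first logged price = "open"
  last = p                       # last logged price = "close"
  sum += p; sumsq += p * p; n++
}
END {
  mean = sum / n
  vol  = sqrt(sumsq / n - mean * mean)   # population std deviation
  evo  = (last - open) / open * 100
  printf "open=%.2f close=%.2f evolution=%.2f%% volatility=%.2f", \
         open, last, evo, vol
}' /tmp/prices_sample.csv)
echo "$report"
```

With the three sample prices above this prints open=42000.00 close=42300.00 evolution=0.71% volatility=124.72.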
I used cron jobs to run the scraping script every two minutes and to generate the report each day at 8pm. Command to edit the cron table:
crontab -e
Lines to add to the list of cron jobs:
*/2 * * * * $PATH_TO_PROJECT/scraper.sh
0 20 * * * python3 $PATH_TO_PROJECT/report.py
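One easy-to-miss detail (a check I would suggest, not something the project states): cron executes scraper.sh directly, so the script needs its execute bit set. A minimal demonstration, using a stand-in path since $PATH_TO_PROJECT depends on where the project is installed:

```shell
#!/bin/sh
# Stand-in for $PATH_TO_PROJECT/scraper.sh, to show the permission check.
f=/tmp/scraper_demo.sh
printf '#!/bin/sh\necho ok\n' > "$f"
chmod +x "$f"                    # same command to run on the real scraper.sh
[ -x "$f" ] && echo "executable: yes"
```

If the execute bit is missing, the `*/2` job fails silently, which is easy to mistake for a broken scraper.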
The dashboard currently runs in Dash's debug mode, but it can be started with:
python3 dashboard.py