
⚽ Futsal Friend: Your Digital Futsal Companion

[Workflow badges: Tests · Scraper]

For those in love with futsal as a way to enjoy sports and to maximize time with friends.

🏆 Do you need an opponent for a friendly?

👫 Do you want to find a team as a new player?

😏 Do you care to analyze vanity statistics?

📣 Do you desire to receive jolly tactical advice?

Futsal Friend is a small web application built using player and competition data from the Liefhebbers Zaalvoetbal Cup (LZV Cup), a Belgian futsal organisation with over 900 teams. Its usefulness is therefore largely limited to players from one of the LZV Cup leagues.

[demo of the Futsal Friend web app]

Scraping

The data comes from the lzvcup.be website and is scraped according to the steps below.

  • Step 1 - Define a set of main URLs in a config file. Each URL represents a region and serves as a starting point. See the config.json file.
  • Step 2 - For each of these URLs, grab the so-called region cards, which list the competitions and sportshalls in that region. Then extract the URLs of the respective competition pages (e.g. 2e Klasse) and the single URL of the sportshalls overview page. A minimal sketch of this step follows the list.
  • Step 3 - Go to each competition's URL and gather the following information:
    • The schedule (the games played and their scores, as well as the games planned);
    • The competition's standings;
    • The teams and their page URL.
  • Step 4 - For each team's page URL, grab the following information:
    • The players and their current season's statistics (such as goals scored), along with each player's page URL;
    • The team's palmares (i.e. the competition positions it achieved in past seasons).
  • Step 5 - Go to each region-specific sportshalls overview page and parse all individual sportshalls together with metadata such as address and email address.
  • Step 6 - Go to each player's page URL and grab their historical statistics.
  • Step 7 - Transform all scraped data into several tables, including relevant metadata (area, region, ...).
  • Step 8 - Store the tables in a database (in this case SQLite, but alternatively in the cloud).
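
As a rough illustration of steps 1 and 2, the sketch below reads the config file and collects the links from each starting page. The config structure and the CSS selectors are assumptions for illustration only; the actual markup of lzvcup.be and the contents of config.json may differ.

# Rough sketch of steps 1-2; config structure and CSS selectors are hypothetical.
import json

import requests
from bs4 import BeautifulSoup

with open("scraper/config.json") as f:
    config = json.load(f)  # assumed shape: {"Antwerpen": "https://www.lzvcup.be/...", ...}

for region, url in config.items():
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    # Collect every link inside the region cards; competition pages and the
    # sportshalls overview page are filtered from these hrefs afterwards.
    links = [a["href"] for card in soup.select("div.card") for a in card.select("a[href]")]
    print(f"{region}: {len(links)} links found")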

All relevant code is in the scraper/ folder. You can call make scrape to run the full scraping script main.py. It takes around 15-20 minutes, plus another 10-15 minutes if the geographic coordinates for the sportshalls need to be processed afresh.

The resulting database is not fully relational. It can be seen as a (very) lightweight data warehouse, set up in such a way that later aggregations in the web application can be done more efficiently.
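
For instance, step 8 can boil down to dumping each prepared table into SQLite with pandas. The database file name, table name, and columns below are purely illustrative.

# Illustrative sketch of step 8; file name, table name, and columns are made up.
import sqlite3

import pandas as pd

standings = pd.DataFrame(
    [{"area": "Antwerpen", "region": "Antwerpen", "competition": "2e Klasse",
      "team": "ZVC Example", "points": 21}]
)

with sqlite3.connect("database/lzvcup.db") as conn:
    # One flat table per entity keeps the later aggregations in the web app simple.
    standings.to_sql("standings", conn, if_exists="replace", index=False)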

Through a GitHub Actions workflow defined in .github/workflows/scraper.yaml, the data is scraped every Thursday early morning and an updated SQLite database is then pushed to the database/ folder. The web application is refreshed automatically afterwards.

The main scraping script includes some nice logging. See below! For more information about the logging setup, this Medium post helps.

[example scraping log output]
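
A minimal logging configuration in a similar spirit could look like the sketch below; the exact format and handlers are assumptions here and may differ from the scraper's actual setup.

# Minimal logging sketch; the scraper's real configuration may differ.
import logging

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s | %(levelname)-8s | %(name)s | %(message)s",
    handlers=[logging.StreamHandler(), logging.FileHandler("scraper.log")],
)

logger = logging.getLogger("scraper")
logger.info("Scraped region %s (%d competitions)", "Antwerpen", 12)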

Deployment

The application is deployed on Streamlit Community Cloud with just a few clicks. To point to the SQLite database, check out this documentation.
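
As an illustration of how the app can read that database, a Streamlit page may query the SQLite file directly. The database path and table name below are assumptions, not the app's actual schema.

# Hypothetical sketch; database path and table name are placeholders.
import sqlite3

import pandas as pd
import streamlit as st

@st.cache_data
def load_standings() -> pd.DataFrame:
    with sqlite3.connect("database/lzvcup.db") as conn:
        return pd.read_sql("SELECT * FROM standings", conn)

st.title("Futsal Friend")
st.dataframe(load_standings())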

To run the application with Docker, you can use these commands:

docker build -t futsalfriend -f webapp/Dockerfile .
docker run -p 8501:8501 futsalfriend

Main technologies

Python, BeautifulSoup, SQLite, Docker, Streamlit

Useful links

The following resources were helpful in setting up this application:
