Architecture Overview
The Blue Alliance is a webapp that runs on Google App Engine, a managed serverless platform running on Google's cloud.
There are a few reasons why App Engine is a good fit for TBA.
- It is fully managed and serverless, so we don't have to worry about maintaining hardware and can instead focus solely on writing our business logic.
- It offers elastic capacity and supports autoscaling, so we can seamlessly size our deployment based on instantaneous load. Since FRC traffic patterns are inherently spiky (on days teams register, many people refresh the site, for example), this lets us adjust to demand without needing a person to manually upscale the site.
- Generous free tier. This has gotten less true over time, but we used to be able to run the entire offseason within the free tier (there are too many offseason events these days, which is a good problem to have!). Since TBA is a community project, keeping costs down is an important consideration.
TBA is built on the following technologies:
- Google App Engine to host the site. Take a look at the overview page for more, especially the concept of "services"
- Flask for a webapp framework
- Jinja2 for HTML template rendering
- Google Cloud Datastore as a persistent database
- Google Cloud Memorystore for a hosted, managed Redis cache
- Google Cloud Tasks for asynchronous execution
- FRC Events API as a source for official FRC data
The main webapp has two components, the "frontend" and the "backend", which share the datastore between them.
The frontend receives HTTP requests from users, reads data from the datastore, and renders a response. Responses can take the form of webpages that you load in your browser or API responses containing a JSON representation of the underlying data.
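To make this concrete, here is a minimal sketch (not TBA's actual code) of a Flask frontend serving both views of the same data. The in-memory `TEAMS` dict and the route paths are illustrative stand-ins for reads from Cloud Datastore and TBA's real URL structure.

```python
# Minimal sketch: one Flask app serving both an HTML page (via Jinja2)
# and a JSON API response from the same underlying data.
from flask import Flask, jsonify, render_template_string

app = Flask(__name__)

# Stand-in for entities that would be read from Cloud Datastore.
TEAMS = {"frc254": {"nickname": "The Cheesy Poofs", "city": "San Jose"}}

PAGE = "<h1>{{ team.nickname }}</h1><p>{{ team.city }}</p>"

@app.route("/team/<team_key>")
def team_page(team_key):
    # Browser-facing webpage, rendered with a Jinja2 template.
    return render_template_string(PAGE, team=TEAMS[team_key])

@app.route("/api/team/<team_key>")
def team_api(team_key):
    # API response: the same data, serialized as JSON.
    return jsonify(TEAMS[team_key])
```

The point is that both routes are pure reads: the frontend never computes or fetches fresh data on a user's request, it only renders what is already in the datastore.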
The backend does not handle user-initiated requests. It instead serves requests triggered by App Engine's cron service. Cron is a means of scheduling tasks to run at a predetermined time of day or at a preconfigured interval. We periodically run a job to fetch data from an upstream source (like FIRST's APIs), compute derived statistics based on data in the datastore, or do other noninteractive processing. In short, the backend writes data for the frontend to later read and display to users.
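A hedged sketch of what a backend cron handler can look like: App Engine's cron service issues an HTTP GET to a configured path, and the handler fetches upstream data and writes results for the frontend to read later. The names here (`fetch_upstream_events`, `DATASTORE`, the `/tasks/...` path) are illustrative assumptions, not TBA's actual API.

```python
# Sketch of a cron-triggered backend handler: fetch from an upstream
# source and write the results into the shared datastore.
from flask import Flask

app = Flask(__name__)

DATASTORE = {}  # stand-in for the shared Cloud Datastore

def fetch_upstream_events():
    # In production this would call an upstream source like the
    # FRC Events API; here we return canned data.
    return [{"key": "2024casj", "name": "Silicon Valley Regional"}]

@app.route("/tasks/update_events")
def update_events():
    # Cron hits this URL on a schedule; no user ever requests it.
    events = fetch_upstream_events()
    for event in events:
        DATASTORE[event["key"]] = event  # written for the frontend to read
    return f"Updated {len(events)} events", 200
```

The handler is idempotent by design: running it again simply overwrites the same keys, which makes it safe for cron to retry or re-run on its schedule.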
The best place to go next is to read about the data to get a sense for how the pieces of the site fit together.