Code for the Copilot project, winner of the June 2017 BP-Imperial Hackathon Innovation Prize.
We use computer vision to track the sleepiness of a driver in real time, both in natural light and in darkness using an IR lamp. When fatigue is detected, an automatic trigger is sent to a Lex Copilot chatbot that engages with the driver and assists them on the road through simple tasks.
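The sleepiness tracking can be approached with an eye-aspect-ratio (EAR) check over facial landmarks. Below is a minimal sketch assuming dlib-style six-point eye landmarks; the threshold and frame count are illustrative assumptions, not the project's actual values:

```python
# Sketch of EAR-based drowsiness detection. The threshold and frame count
# below are illustrative assumptions, not the project's tuned values.
from __future__ import division  # keeps the arithmetic Python 2.7-safe
import math

def euclidean(p, q):
    return math.sqrt((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmark points around one eye, dlib-style ordering."""
    a = euclidean(eye[1], eye[5])  # vertical distance, upper-lower lid
    b = euclidean(eye[2], eye[4])  # second vertical distance
    c = euclidean(eye[0], eye[3])  # horizontal eye width
    return (a + b) / (2.0 * c)

EAR_THRESHOLD = 0.25  # below this, the eye counts as closed (assumed)
CONSEC_FRAMES = 20    # closed this many frames in a row -> alert (assumed)

def update(counter, ear):
    """Advance the per-frame state; returns (new_counter, alert)."""
    if ear < EAR_THRESHOLD:
        counter += 1
        return counter, counter >= CONSEC_FRAMES
    return 0, False
```

In the real pipeline the landmarks would come from dlib's 68-point shape predictor on each video frame, and a raised alert is what fires the chatbot trigger.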
We built our own chatbot with Lex, since Alexa does not yet allow programmatic activation.
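Programmatic activation is exactly what the fatigue trigger needs: the detector can post a message straight to the Lex bot. A minimal sketch using boto3's lex-runtime client; the bot name, alias, and user id below are assumptions, not the project's real configuration:

```python
# Hypothetical sketch of waking the Lex bot from code; the bot name,
# alias, and user id are assumptions.

def trigger_payload(message, user_id="driver-1"):
    """Build the keyword arguments for lex-runtime's post_text call."""
    return {
        "botName": "Copilot",  # assumed bot name
        "botAlias": "prod",    # assumed alias
        "userId": user_id,
        "inputText": message,
    }

def wake_copilot(message="I am feeling sleepy"):
    import boto3  # deferred so the module loads without boto3 installed
    client = boto3.client("lex-runtime")
    response = client.post_text(**trigger_payload(message))
    return response.get("message")
```

Calling `wake_copilot()` from the vision loop when fatigue is detected would start the conversation without any wake word.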
So far, the following capabilities are available:
- Identify the driver by name using the image recognition software.
- Read out loud where the nearest BP gas station is, based on your geolocation.
- Send an automated SMS to one of your pre-defined contacts.
- Call the emergency services with a pre-recorded message.
- Play some music for you!
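The nearest-station capability reduces to a nearest-neighbour search over station coordinates. A self-contained sketch with hypothetical stations and a haversine distance; the project itself resolves this through the Google Maps API:

```python
# Hypothetical nearest-station lookup; the station list and coordinates
# are made up for illustration (the real project queries Google Maps).
from __future__ import division
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_station(lat, lon, stations):
    """stations: list of (name, lat, lon); returns the closest station's name."""
    return min(stations, key=lambda s: haversine_km(lat, lon, s[1], s[2]))[0]
```

The chosen name would then be handed to Polly to be read out loud to the driver.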
Use Python 2.7. We recommend using a virtual env or anaconda/miniconda. Install the dependencies with `pip install -r requirements.txt`, and install opencv and pyaudio with `conda install opencv -y && conda install -c mutirri pyaudio=0.2.7 -y`.

You'll need to install some system libraries first for some of these dependencies to work; dlib in particular can be a bit problematic (follow this guide if you're having trouble). You'll also need to get the awscli.

Alternatively, get Docker and run `docker run -v ~/.aws/:/root/.aws/ -i -t copilot /bin/bash` to run the demo. This should be the painless way of doing it, but it is still a work in progress.
- Joan Pujol: Google Maps for geolocation
- Eduardo González: Twilio, Flask
- Pablo Pérez: AWS Lex and Polly
- Juan Eiros: AWS Lex and Polly