ducktrA/CarND-Capstone

This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here. This repo is the place to work together.

Team Members of "Get Me Home" Team

  • Istvan Marton — marton_i@hotmail.com
  • Christian Grant — tcpip001@gmail.com
  • Alessandro Gulli — alessandrogulli87@alice.it
  • Oliver Witt — oliver.witt@witt-or.de
  • Adolf Hohl — adolf.hohl@gmail.com (team initiator)

Approach

We started off by getting the base setup working. The first goal was a moving vehicle: the waypoint updater does its job, and so does drive-by-wire.

Rough Work Plan

The baseline for working in parallel is the ego vehicle driving around the track. From there we currently see a few tasks: smoothing the path and the controllers, collecting training data for traffic lights, training traffic-light-ahead detection, and training traffic light recognition using either an SVM or a neural network.

An SVM/HOG pipeline could have handled both traffic light localization and detection; its detection performance was acceptable but not exciting, and an implementation based on the OpenCV library would have been a good starting point. We tried a simple FCN approach to localize the traffic lights, which worked surprisingly well. However, experiments with SSD showed that its output is much more precise. The approach we stuck with uses an SSD for the traffic light bounding box, but classifies the light state manually in HSV color space: the balance of colored pixels expresses the separation between the classes, which gives good control over it. Independently of this approach, we cropped and zoomed to the traffic light responsible for the specific lane; time-synchronizing the location information led to pretty smooth crops. In the end, both approaches are used to cross-validate each other.
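The HSV color-balance step described above can be sketched as follows. This is a minimal illustration, not the repo's actual classifier: the hue ranges, saturation/value thresholds, and the function name `classify_light` are assumptions for the example, and the crop is assumed to already be in OpenCV-style HSV (hue in 0–179), e.g. from `cv2.cvtColor(crop, cv2.COLOR_BGR2HSV)`.

```python
import numpy as np

# Hypothetical hue ranges (OpenCV convention, H in [0, 180)); red wraps around 0.
RED_LO1, RED_HI1 = 0, 10
RED_LO2, RED_HI2 = 170, 180
YELLOW_LO, YELLOW_HI = 20, 35
GREEN_LO, GREEN_HI = 45, 95

def classify_light(hsv_crop, sat_min=100, val_min=80):
    """Classify a traffic-light crop (HSV, uint8) by the balance of colored pixels."""
    h, s, v = hsv_crop[..., 0], hsv_crop[..., 1], hsv_crop[..., 2]
    # Only count saturated, bright pixels -- dark housing and washed-out sky drop out.
    bright = (s >= sat_min) & (v >= val_min)
    counts = {
        "red": int(np.sum(bright & (((h >= RED_LO1) & (h < RED_HI1)) |
                                    ((h >= RED_LO2) & (h < RED_HI2))))),
        "yellow": int(np.sum(bright & (h >= YELLOW_LO) & (h < YELLOW_HI))),
        "green": int(np.sum(bright & (h >= GREEN_LO) & (h < GREEN_HI))),
    }
    # The dominant color wins; an all-dark crop yields no confident class.
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else "unknown"
```

Because the decision is just a pixel count per hue band, the margin between the winning count and the runner-up gives a simple confidence measure for cross-validating against the SSD output.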

Native Installation

  • Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. Ubuntu downloads can be found here.

  • If using a Virtual Machine to install Ubuntu, use the following configuration as minimum:

    • 2 CPU
    • 2 GB system memory
    • 25 GB of free hard drive space

    The Udacity provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using this.

  • Follow these instructions to install ROS

  • Dataspeed DBW

  • Download the Udacity Simulator.

Docker Installation

Install Docker

Build the docker container

docker build . -t capstone

Run the docker file

docker run -p 127.0.0.1:4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone

Usage

  1. Clone the project repository
git clone https://github.com/udacity/CarND-Capstone.git
  2. Install python dependencies
cd CarND-Capstone
pip install -r requirements.txt
  3. Make and run styx
cd ros
catkin_make
source devel/setup.sh
roslaunch launch/styx.launch
  4. Run the simulator

Real world testing

  1. Download the training bag that was recorded on the Udacity self-driving car (a bag demonstrating the correct predictions in autonomous mode can be found here)
  2. Unzip the file
unzip traffic_light_bag_files.zip
  3. Play the bag file
rosbag play -l traffic_light_bag_files/loop_with_traffic_light.bag
  4. Launch your project in site mode
cd CarND-Capstone/ros
roslaunch launch/site.launch
  5. Confirm that traffic light detection works on real life images
