This repository contains the files for Udacity's Robotics Nanodegree's Fifth Project.
Find Udacity's Robotics Software Engineering Nanodegree here.
This project simulates a real-life home service robot in Gazebo using ROS. The robot uses its sensors to localize itself and map the environment, then navigates along paths planned with Dijkstra's algorithm to pick up objects and transport them to other locations.
The purpose of this final project is to apply the major skills learned throughout the Udacity Nanodegree. These skills include, but are not limited to, the following.
- Building simulation environments
- Simulating robots using ROS and Gazebo
- Developing ROS software (nodes, publishers, subscribers, etc.)
- Performing robot localization (through the Adaptive Monte-Carlo Localization algorithm)
- Performing SLAM (through the Real-Time Appearance-Based Mapping package)
- Performing path planning (using Dijkstra's algorithm)
- Integrating various ROS packages into one functional stack
The main steps of the project were the following.
- Design a simulation environment inside Gazebo
- Simulate the TurtleBot robot inside the designed environment
- Generate a 2D occupancy grid map (`map/map.pgm`) of the simulation environment through the `gmapping` package by teleoperating the robot
- Perform localization through the `amcl` package
- Test the navigation stack by sending 2D Nav Goals to the robot inside RViz
- Write a `pick_objects` node responsible for driving the robot to various locations inside the mapped environment
- Write an `add_markers` node responsible for publishing marker locations to simulate picking up and dropping off objects
All of this project's requirements were met, as approved by Udacity's reviewers.
Fig.1 - Robot inside Gazebo simulation environment
To run all of this project's functionalities, you should have ROS Kinetic and Gazebo 7 installed. Then clone and build the project inside a catkin workspace.
$ mkdir -p ~/catkin_ws/src/
$ cd ~/catkin_ws/src/
$ catkin_init_workspace
$ git clone https://github.com/moudallal/RoboticsND-Project5.git
$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash
$ cd src/scripts
$ sudo chmod +x *.sh
To test the SLAM algorithm, run the following command in your terminal.
$ ./test_slam.sh
You can teleoperate the robot around the environment to map it. The generated 2D and 3D maps of the environment are shown below.
Fig.2 - Generated 2D Map of the simulation environment
Fig.3 - Generated 3D Map of the simulation environment
To test the navigation stack, run the following command in your terminal.
$ ./test_navigation.sh
You can press the 2D Nav Goal button inside RViz and click on different locations inside the map to let the robot plan paths inside the environment.
To run the full project, execute the following command in your terminal.
$ ./home_service.sh
This will make the robot pick up a red cube from one side of the map and transport it to the opposite side, where it drops it off. Screenshots of the process can be found below.
Fig.4 - Robot picking up red cube
Fig.5 - Robot planning paths inside the simulation environment
You can change the pickup and drop-off positions inside `add_markers/src/add_markers_node.cpp` and test out different scenarios.
This project was done entirely by me.