Project related links: NLSAR (Youtrack) • norlabsnow (Docker Hub) • NorLab_MPPI (GitHub) • SNOW_AutoRally (GitHub)
Containerized development workflow for the NorLab_MPPI and SNOW_AutoRally projects leveraging nvidia-docker technology. Believe it or not, it's configured for developing with ROS Melodic in Python 3.6.
Key benefits: custom dependency management, a consistent development environment, easy deployment to a robot's compute box, and reproducible results.
Author: Luc Coupal
## Quick start for the NorLab_MPPI project on x86 workstation

- Requirement: Docker and the NVIDIA Container Toolkit must be installed (follow install steps 1 and 2)
- Tips: You can use the `--help` flag for usage instructions on most `ds_*` commands
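Before running the commands below, you can sanity-check that the required tools are on your `PATH`. A minimal sketch (`check_prereq` is a hypothetical helper, not a `ds_*` command):

```shell
#!/bin/sh
# Hypothetical prerequisite check: report which of the given tools are installed.
check_prereq() {
  for tool in "$@"; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "ok: $tool"
    else
      echo "missing: $tool"
    fi
  done
}

check_prereq docker git
```

Note that this only checks the binaries; GPU access additionally requires the NVIDIA Container Toolkit to be configured for Docker.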
```shell
# Create a directory for your development source code if you don't already have one
mkdir -p ~/Repositories && cd ~/Repositories

# Clone both repositories
sudo git clone https://github.com/norlab-ulaval/NorLab_MPPI.git
sudo git clone https://github.com/RedLeader962/Dockerized-SNOW.git

# Install aliases and check Nvidia NVCC
cd ~/Repositories/Dockerized-SNOW
source ds_setup.bash

# Pull the norlab-mppi-develop image from the norlabsnow Docker Hub registry with the x86-ubuntu18.04 tag
sudo docker pull norlabsnow/norlab-mppi-develop:x86-ubuntu18.04

# Create a new docker container instance for development on your machine and start working on the
# NorLab_MPPI project using ROS Melodic, Python 3 and PyTorch right away.
ds_instantiate_develop --runTag=x86-ubuntu18.04 --name=MyCoolName --src="$HOME/Repositories/NorLab_MPPI"
```
To open a terminal inside `MyCoolName`, use the following convenient script

```shell
ds_attach MyCoolName
```

or use `sudo docker exec -it MyCoolName bash`
## Quick start for the NorLab_MPPI project on Apple M1 (arm64) workstation

- It's the same image as the `arm64-l4t` one, but with PyTorch and Numba compiled specifically for `arm64-Darwin`
- Tips: You can use the `--help` flag for usage instructions on most `ds_*` commands
- Be advised, CUDA is not supported on Apple computers, so PyTorch and Numba will run on CPU only
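The CPU-only caveat follows directly from the host platform. As a hedged illustration (the `cuda_expected` helper below is hypothetical, not part of Dockerized-SNOW), one could decide whether CUDA acceleration is expected from the `uname` platform string:

```shell
#!/bin/sh
# Hypothetical sketch: decide whether CUDA acceleration can be expected
# from the image, based on the host platform reported by uname.
cuda_expected() {
  case "$1" in
    Linux-x86_64|Linux-aarch64) echo "yes" ;;   # x86 workstation or Jetson (arm64-l4t)
    Darwin-*)                   echo "no" ;;    # Apple hardware: CPU-only PyTorch/Numba
    *)                          echo "unknown" ;;
  esac
}

cuda_expected "$(uname -s)-$(uname -m)"
```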
```shell
# Create a directory for your development source code if you don't already have one
mkdir -p ~/Repositories && cd ~/Repositories

# Clone both repositories
sudo git clone https://github.com/norlab-ulaval/NorLab_MPPI.git
sudo git clone https://github.com/RedLeader962/Dockerized-SNOW.git

# Install aliases
cd ~/Repositories/Dockerized-SNOW
source ds_setup.bash

# Pull the norlab-mppi-develop image from the norlabsnow Docker Hub registry with the arm64-Darwin-ubuntu18.04 tag
sudo docker pull norlabsnow/norlab-mppi-develop:arm64-Darwin-ubuntu18.04

# Create a new docker container instance for development on your machine and start working on the
# NorLab_MPPI project using ROS Melodic, Python 3 and PyTorch right away.
ds_instantiate_develop --runTag=arm64-Darwin-ubuntu18.04 --osx --name=MyCoolName --src="$HOME/Repositories/NorLab_MPPI"
```
To open a terminal inside `MyCoolName`, use the following convenient script

```shell
ds_attach MyCoolName
```

or use `sudo docker exec -it MyCoolName bash`
## Quick start for the NorLab_MPPI project on Apple (x86) workstation

- Tips: You can use the `--help` flag for usage instructions on most `ds_*` commands
- Be advised, CUDA is not supported on Apple computers, so PyTorch and Numba will run on CPU only
- Use a x86 docker image and let Docker use Rosetta 2 to emulate the x86 architecture
```shell
# Install Rosetta 2
softwareupdate --install-rosetta

# Create a directory for your development source code if you don't already have one
mkdir -p ~/Repositories && cd ~/Repositories

# Clone both repositories
sudo git clone https://github.com/norlab-ulaval/NorLab_MPPI.git
sudo git clone https://github.com/RedLeader962/Dockerized-SNOW.git

# Install aliases
cd ~/Repositories/Dockerized-SNOW
source ds_setup.bash

# Pull the norlab-mppi-develop image from the norlabsnow Docker Hub registry with the x86-ubuntu18.04 tag
sudo docker pull norlabsnow/norlab-mppi-develop:x86-ubuntu18.04

# Create a new docker container instance for development on your machine and start working on the
# NorLab_MPPI project using ROS Melodic, Python 3 and PyTorch right away.
ds_instantiate_develop --platform='linux/amd64' --runTag=x86-ubuntu18.04 --osx --name=MyCoolName --src="$HOME/Repositories/NorLab_MPPI"

# Running `uname -m` inside the container will confirm the architecture type
```
To open a terminal inside `MyCoolName`, use the following convenient script

```shell
ds_attach MyCoolName
```

or use `sudo docker exec -it MyCoolName bash`
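The three quick-start variants above differ mainly in which image tag matches the host. A minimal sketch of that mapping (the `tag_for_host` helper is hypothetical, not a `ds_*` command):

```shell
#!/bin/sh
# Hypothetical helper: pick the norlab-mppi-develop tag matching the host,
# mirroring the choices made in the quick-start sections above.
tag_for_host() {
  # $1 = kernel name (uname -s), $2 = machine hardware (uname -m)
  case "$1-$2" in
    Linux-x86_64)  echo "x86-ubuntu18.04" ;;           # x86 workstation
    Darwin-arm64)  echo "arm64-Darwin-ubuntu18.04" ;;  # Apple M1, native arm64 image
    Darwin-x86_64) echo "x86-ubuntu18.04" ;;           # Apple x86, via Rosetta emulation
    *)             echo "unsupported" ;;
  esac
}

echo "norlabsnow/norlab-mppi-develop:$(tag_for_host "$(uname -s)" "$(uname -m)")"
```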
## Quick start for the SNOW_AutoRally project on x86 workstation

- Requirement: Docker and the NVIDIA Container Toolkit must be installed (follow install steps 1 and 2)
- Tips: You can use the `--help` flag for usage instructions on any Dockerized-SNOW bash script
```shell
# Create a directory for your development source code if you don't already have one
mkdir -p ~/Repositories && cd ~/Repositories

# Clone both repositories
sudo git clone https://github.com/RedLeader962/SNOW_AutoRally.git
sudo git clone https://github.com/RedLeader962/Dockerized-SNOW.git
cd ~/Repositories/Dockerized-SNOW

# Build the dependencies and develop images for the x86 architecture with AutoRally support
bash ds_build_dependencies.bash --x86 --GT-AR
bash ds_build_develop.bash --x86 --GT-AR

# Create a new docker container instance for development on your machine and start working on the
# SNOW_AutoRally project using ROS Melodic, Python 3 and PyTorch right away.
bash ds_instantiate_develop.bash --runTag=x86-ubuntu18.04 --name=THEgtar --src="$HOME/Repositories/SNOW_AutoRally"
```
Once the container starts, it prints the Dockerized-SNOW ASCII-art welcome banner (with links to https://norlab.ulaval.ca and https://redleader962.github.io) and drops you into a root shell:

```
root@norlab-og:/#
```
Then follow the steps at *SNOW_AutoRally: Autonomous Driving in Simulation using MPPI*

```shell
roslaunch autorally_gazebo autoRallyTrackGazeboSim.launch
...
```
- ★ Setup PyCharm for local development using a Docker Python interpreter
- Using the nvidia-docker image on a Jetson device (`arm64-l4t`)
- Using the nvidia-docker image on a `x86` host
- Building the `arm64-l4t` nvidia-docker image on a `x86` host using qemu virtualization
- How-to nvidia-docker manually (a quick start)
- Test AutoRally Configuration (revised instructions)
- How-to push a locally built image to Docker Hub from the command line
To pull the latest image from Docker Hub, execute the following in a terminal:

```shell
sudo docker pull <image name>:<tag>
```

with `<image name>` = the image name and `<tag>` = the host architecture
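As a concrete instance of the `<image name>:<tag>` pattern, a small sketch (the `pull_cmd` helper is hypothetical) that expands the placeholders into the pull commands used in the quick-start sections above:

```shell
#!/bin/sh
# Hypothetical sketch: expand the <image name>:<tag> placeholders into
# concrete docker pull command lines.
pull_cmd() {
  printf 'sudo docker pull %s:%s\n' "$1" "$2"   # $1 = image name, $2 = tag
}

pull_cmd norlabsnow/norlab-mppi-develop x86-ubuntu18.04
pull_cmd norlabsnow/norlab-mppi-develop arm64-Darwin-ubuntu18.04
```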
nvidia-docker Documentation:
- nvidia-docker: Build and run Docker containers leveraging NVIDIA GPUs
- NVIDIA Cloud Native Technologies
- Base image for jetson:
- Base image with CUDA and OpenGL support: