Philipp Foehn edited this page May 14, 2018 · 2 revisions

Model Predictive Control (MPC) for quadrotors by the Robotics and Perception Group at the University of Zurich. This MPC is intended to be used with rpg_quadrotor_control.

ToDo

  • Finish ReadMe and Wiki
  • Implement standalone mode

Usage

Copyright (C) 2017-2018 Philipp Foehn, Robotics and Perception Group, University of Zurich

The RPG MPC repository provides packages that are intended to be used with RPG Quadrotor Control and ROS. This code has been tested with ROS Kinetic on Ubuntu 16.04. This is research code: expect it to change often, and any fitness for a particular purpose is disclaimed. For a commercial license, please contact Davide Scaramuzza.

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

Publication PAMPC

This MPC is also capable of perception-aware model predictive control. For a detailed explanation, read our paper, watch the video, and check out PAMPC.

Feel free to watch our PAMPC video:

PAMPC: Perception-Aware Model Predictive Control for Quadrotors

If you use this code in an academic context, please cite the following arXiv.org preprint.

Davide Falanga, Philipp Foehn, Peng Lu, Davide Scaramuzza: PAMPC: Perception-Aware Model Predictive Control for Quadrotors, under review for 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018).

@article{Falanga2018,
  author = {Falanga, Davide and Foehn, Philipp and Lu, Peng and Scaramuzza, Davide},
  title = {{PAMPC}: Perception-Aware Model Predictive Control for Quadrotors},
  journal = {arXiv},
  arxivid = {1804.04811},
  url = {https://arxiv.org/abs/1804.04811},
  year = {2018}
}

Abstract:

We present a perception-aware model predictive control framework for quadrotors that unifies control and planning with respect to action and perception objectives. Our framework leverages numerical optimization to compute trajectories that satisfy the system dynamics and require control inputs within the limits of the platform. Simultaneously, it optimizes perception objectives for robust and reliable sensing by maximizing the visibility of a point of interest and minimizing its velocity in the image plane. Considering both perception and action objectives for motion planning and control is challenging due to the possible conflicts arising from their respective requirements. For example, for a quadrotor to track a reference trajectory, it needs to rotate to align its thrust with the direction of the desired acceleration. However, the perception objective might require minimizing such rotation to maximize the visibility of a point of interest. A model-based optimization framework, able to consider both perception and action objectives and couple them through the system dynamics, is therefore necessary. Our perception-aware model predictive control framework works in a receding-horizon fashion by iteratively solving a non-linear optimization problem. It is capable of running in real-time, fully onboard our lightweight, small-scale quadrotor using a low-power ARM computer, together with a visual-inertial odometry pipeline. We validate our approach in experiments demonstrating (I) the contradiction between perception and action objectives, and (II) improved behavior in extremely challenging lighting conditions.
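To illustrate the receding-horizon idea described above, here is a minimal, hedged sketch in Python: at every control step a finite-horizon optimal control problem is re-solved from the current state, but only the first input of the resulting plan is applied. This toy uses an unconstrained linear double integrator solved by a backward Riccati recursion; the actual rpg_mpc solves a non-linear program over the full quadrotor dynamics with perception cost terms, so dynamics, costs, and solver here are purely illustrative assumptions.

```python
import numpy as np

def finite_horizon_gain(A, B, Q, R, N):
    """Backward Riccati recursion over an N-step horizon.

    Returns the feedback gain for the FIRST step of the horizon,
    which is the only part a receding-horizon controller uses.
    """
    P = Q.copy()
    K = None
    for _ in range(N):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Toy plant: 1-D double integrator (position, velocity), dt = 0.05 s.
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
Q = np.diag([10.0, 1.0])   # penalize deviation from the reference (origin)
R = np.array([[0.1]])      # penalize control effort

x = np.array([1.0, 0.0])   # start 1 m away from the reference
for _ in range(200):
    # Re-solve the horizon problem from the CURRENT state each step ...
    K = finite_horizon_gain(A, B, Q, R, N=20)
    u = -K @ x             # ... but apply only the first control input,
    x = A @ x + B @ u      # then shift the horizon and repeat.

print(x)  # state should have been driven close to the origin
```

In the real controller the re-solve matters because the dynamics are non-linear and the costs (including perception terms) change along the trajectory; in this linear toy the gain is the same every step, but the loop structure is the same receding-horizon pattern.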