
sawyermade/mvp_grasp


Forked from: https://github.com/dougsm/mvp_grasp

GG-CNN + Multi-View Picking

This repository contains the implementation of the Multi-View Picking system and experimental code for running on a Franka Emika Panda Robot from the paper:

Multi-View Picking: Next-best-view Reaching for Improved Grasping in Clutter

Douglas Morrison, Peter Corke, Jürgen Leitner

International Conference on Robotics and Automation (ICRA), 2019

arXiv | Video

For more information about GG-CNN, see this repository or this arXiv paper.

If you use this work, please cite the following as appropriate:

@inproceedings{morrison2019multiview, 
	title={{Multi-View Picking: Next-best-view Reaching for Improved Grasping in Clutter}}, 
	author={Morrison, Douglas and Corke, Peter and Leitner, J\"urgen}, 
	booktitle={2019 IEEE International Conference on Robotics and Automation (ICRA)}, 
	year={2019} 
}

@inproceedings{morrison2018closing, 
	title={{Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach}}, 
	author={Morrison, Douglas and Corke, Peter and Leitner, J\"urgen}, 
	booktitle={Proc.\ of Robotics: Science and Systems (RSS)}, 
	year={2018} 
}

Contact

For any questions or comments, contact Doug Morrison.

Setup

Hardware:

This code is designed around a Franka Emika Panda robot with an Intel RealSense D435 camera mounted on the wrist. A 3D-printable camera mount is available in the cad folder. DYMO M10 scales are used to detect grasp success.
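As a rough illustration of how the scales fit in, the sketch below infers grasp success from the drop in measured weight after an object is lifted out of the bin. The topic name /scales/weight, the Float32 message type and the weight threshold are assumptions for illustration only; the real interface is provided by the scales_interface package.

# Hypothetical sketch: detect grasp success from the weight change reported
# by the scales. The topic name and message type are assumptions, not the
# actual scales_interface API.
import rospy
from std_msgs.msg import Float32

def current_weight(topic='/scales/weight', timeout=2.0):
    # Block until a single weight reading (in grams) arrives on the topic.
    return rospy.wait_for_message(topic, Float32, timeout=timeout).data

def grasp_succeeded(weight_before, weight_after, min_object_weight=5.0):
    # Count the grasp as successful if the bin lost at least
    # min_object_weight grams after the object was lifted away.
    return (weight_before - weight_after) >= min_object_weight

if __name__ == '__main__':
    rospy.init_node('grasp_success_check')
    before = current_weight()
    # ... execute the grasp and lift the object clear of the bin ...
    after = current_weight()
    rospy.loginfo('Grasp success: %s', grasp_succeeded(before, after))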

The following external packages are required to run the complete system:

Installation:

Clone this repository into your ROS workspace and run catkin_make.

Local Python requirements can be installed with:

pip install -r requirements.txt

Packages Overview

  • dougsm_helpers: A set of common functions for dealing with ROS and TF that are used throughout.
  • scales_interface: A simple interface to a set of DYMO scales for reading weight.
  • ggcnn: Provides two interfaces (a service and a node) for running GG-CNN (see the example after this list).
  • franka_control_wrappers: Adds a simple velocity controller node and a MoveIt commander for controlling the Panda robot.
  • mvp_grasping: ROS nodes for executing grasps using the Multi-View Picking approach, including baselines.
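As an example of how the ggcnn package might be used from another node, the sketch below calls a grasp-prediction service and reads back the best grasp. The service name, srv type and response fields are assumptions made for illustration; check the srv definitions in the ggcnn package for the actual interface.

# Hypothetical sketch of querying the GG-CNN service for a grasp.
# Service name, srv type and response fields are assumptions; see the
# ggcnn package's srv/ directory for the real definitions.
import rospy
from ggcnn.srv import GraspPrediction  # assumed srv type

rospy.init_node('ggcnn_client_example')
rospy.wait_for_service('/ggcnn_service/predict')             # assumed service name
predict = rospy.ServiceProxy('/ggcnn_service/predict', GraspPrediction)

resp = predict()
if resp.success:                                             # assumed field
    grasp = resp.best_grasp                                  # assumed field
    rospy.loginfo('Best grasp pose: %s (quality %.2f)', grasp.pose, grasp.quality)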

Running

To run grasping experiments:

# Start the robot and required extras.
roslaunch mvp_grasping robot_bringup.launch

# Start the camera, depth conversion and static transform
roslaunch mvp_grasping wrist_realsense.launch

# Start the scales interface
roslaunch scales_interface scales.launch

# Start the Multi-View Picking backend
roslaunch mvp_grasping grasp_entropy_service.launch
 
## Execute Grasping Experiment

# For Multi-View Picking
rosrun mvp_grasping panda_mvp_grasp.py

# For Fixed data-collection baseline
rosrun mvp_grasping panda_fixed_baseline.py

# For single-view open-loop grasping baseline
roslaunch ggcnn ggcnn_service.launch
rosrun mvp_grasping panda_open_loop_grasp.py
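At a high level, panda_mvp_grasp.py repeatedly asks the Multi-View Picking backend for the next-best view while servoing the wrist camera, then executes the best grasp once the backend is confident. The outline below is only a sketch of that control flow; the callable that stands in for the backend call and the publisher argument are placeholders, not the real ROS interfaces used by the script.

# Illustrative outline of the Multi-View Picking control flow; the real
# experiment script is mvp_grasping/panda_mvp_grasp.py and its ROS
# interfaces differ from these placeholders.
import rospy

def mvp_grasp_loop(update_view, cmd_vel_pub, rate_hz=30):
    # update_view() stands in for a call to the grasp_entropy_service backend:
    # it fuses the latest GG-CNN prediction into the grasp-quality map and
    # returns (camera_velocity, best_grasp, finished).
    rate = rospy.Rate(rate_hz)
    while not rospy.is_shutdown():
        velocity, best_grasp, finished = update_view()
        if finished:
            return best_grasp          # hand off to MoveIt / velocity control to execute
        cmd_vel_pub.publish(velocity)  # servo the wrist camera toward the next view
        rate.sleep()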

Configuration

While this code has been written with specific hardware in mind, different physical setups or cameras can be accommodated by customising ggcnn/cfg/ggcnn_service.yaml and mvp_grasping/cfg/mvp_grasp.yaml. New robots or cameras will require more substantial changes.
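For example, a node adapted to a different camera might read its topics and crop settings from the ROS parameters loaded out of those yaml files. The parameter names and defaults below are assumptions for illustration; use the keys that actually appear in ggcnn/cfg/ggcnn_service.yaml and mvp_grasping/cfg/mvp_grasp.yaml.

# Hypothetical sketch: reading camera-related settings loaded from the yaml
# config files. The parameter names and defaults are assumptions; match them
# to the keys actually present in the cfg/ files.
import rospy

rospy.init_node('camera_config_example')
depth_topic = rospy.get_param('~camera/depth_topic', '/camera/depth/image_rect_raw')
info_topic = rospy.get_param('~camera/info_topic', '/camera/depth/camera_info')
crop_size = rospy.get_param('~crop_size', 300)
rospy.loginfo('Depth: %s, info: %s, crop: %d px', depth_topic, info_topic, crop_size)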
