object_reaching

Description

This package provides the functionality for an object-reaching game. Two objects are placed in front of a human participant and two others in front of a robotic manipulator. The human reaches for one of their objects; a prediction module infers which object the human is trying to reach, and, based on the prediction, the robotic manipulator hits one of its own objects.

Pipeline

  • Human monitoring: An RGB-D camera is used for human monitoring, and OpenPose is used for 2D human localization. The wrist pixels of the right hand are used to predict which object the human is trying to approach.

  • Human motion preprocessing: The human movements (2D OpenPose pixels) are first filtered:

    • Removal of pixels corresponding to the rest period at the beginning of the motion.
    • Removal of outliers.
  • Prediction: Pretrained models are used. The prediction of the direction of the human motion is based on the distance of the wrist from the target objects.

  • Robot motion: The robot moves in a direction determined by the output of the human motion prediction module.
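A minimal sketch of the two filtering steps from the preprocessing stage (the function names and thresholds below are assumptions for illustration, not the package's actual code):

```python
import numpy as np

def trim_rest(wrist_pixels, motion_thresh=2.0):
    """Drop leading frames where the wrist barely moves (rest period).

    wrist_pixels: (N, 2) array of 2D OpenPose wrist coordinates.
    motion_thresh: per-frame displacement (pixels) counted as motion
                   onset (hypothetical value).
    """
    disp = np.linalg.norm(np.diff(wrist_pixels, axis=0), axis=1)
    moving = np.nonzero(disp > motion_thresh)[0]
    start = moving[0] if moving.size else 0
    return wrist_pixels[start:]

def remove_outliers(wrist_pixels, z_thresh=3.0):
    """Discard points farther than z_thresh standard deviations from the
    trajectory mean (a simple stand-in for the package's outlier filter)."""
    mean = wrist_pixels.mean(axis=0)
    std = wrist_pixels.std(axis=0) + 1e-9
    z = np.abs((wrist_pixels - mean) / std)
    return wrist_pixels[(z < z_thresh).all(axis=1)]
```

The actual filter in scripts/input_process.py may use a different onset criterion or outlier test; this only illustrates the idea of the two steps.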

Files

  • scripts/input_process.py: Detection of movement onset and end, and movement filtering.
  • scripts/prediction.py: Prediction of human movement direction.
  • src/robot_motion.cpp: Robot motion generation.
  • scripts/result.py: Checks who reached the object first (human or robot).
  • config/prediction.yaml: Set the positions (pixels) of the human objects.
  • config/object_reaching.yaml: Set the positions (3D coordinates expressed in the base_link frame) of the robot objects, the robot's initial position, the velocity with which the robot hits the objects, and the gain of the controller that regulates the robot's commanded velocities.
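For orientation, a hypothetical sketch of what config/object_reaching.yaml could contain (every key name and value below is an assumption for illustration, not copied from the package):

```yaml
# Illustrative sketch only -- the actual key names may differ.
robot_objects:                        # 3D positions in the base_link frame (meters)
  left:  [0.35, -0.10, 0.15]
  right: [0.35,  0.10, 0.15]
initial_position: [0.30, 0.00, 0.30]  # robot start pose, base_link frame
hit_velocity: 0.5                     # m/s, velocity used to hit an object
controller_gain: 2.0                  # gain regulating commanded velocities
```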

Models

The models folder contains trained logistic regression models. The touching_objects folder contains models trained on objects that were touching each other, while the distant_objects folder contains models trained on objects placed 13 cm apart. In both cases, the distance between the objects and the initial position of the human hand was 50 cm.
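The README does not show the training code; as a rough, numpy-only illustration of how a logistic regression over wrist-to-object distances could drive the prediction (the object positions, feature scaling, and hyperparameters below are all assumptions):

```python
import numpy as np

# Hypothetical pixel positions of the two human objects
# (in the package these come from the config file).
OBJECTS = np.array([[200.0, 300.0], [440.0, 300.0]])

def distance_features(wrist_px):
    """Wrist-to-object distances, scaled down to keep optimization stable."""
    return np.linalg.norm(OBJECTS - np.asarray(wrist_px, float), axis=1) / 100.0

def train_logreg(X, y, lr=0.1, epochs=2000):
    """Plain-numpy logistic regression (a stand-in for the pretrained models)."""
    X = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # sigmoid
        w -= lr * X.T @ (p - y) / len(y)       # gradient step
    return w

def predict_object(w, wrist_px):
    """Return the index (0 or 1) of the object the wrist is heading for."""
    f = np.append(distance_features(wrist_px), 1.0)
    return int(1.0 / (1.0 + np.exp(-f @ w)) > 0.5)

# Synthetic training data: wrist samples recorded near each object.
rng = np.random.default_rng(0)
X = [distance_features(obj + rng.normal(scale=20.0, size=2))
     for obj in OBJECTS for _ in range(50)]
y = np.repeat([0.0, 1.0], 50)
w = train_logreg(np.array(X), y)
```

The actual pretrained models in scripts/prediction.py may use richer features (e.g. several trajectory points); this only demonstrates the distance-based formulation mentioned above.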

Run

Note: Before running the demo, make sure to set the parameters xGoal, yGoal (object positions in pixels) in the config/object_reaching.yaml file. One way to do this is to record some wrist OpenPose pixels while the human grabs each object and use their average as the object position.
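The averaging suggested above can be sketched in a few lines (the function name is made up for illustration):

```python
import numpy as np

def estimate_object_pixels(wrist_samples):
    """Estimate an object's (xGoal, yGoal) pixel position as the mean of
    wrist pixels recorded while the human grabs that object."""
    return np.asarray(wrist_samples, dtype=float).mean(axis=0)
```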

Run roslaunch object_reaching object_reaching.launch to launch the OpenPose ROS node used for human monitoring, the motion detection node, the prediction node and the robot motion node.

Once the nodes have been launched, run rosservice call /next_motion in another terminal. This service call starts the game loop: the motion detection node begins listening to the 2D output of the OpenPose node. At the end of the game, i.e. when the robot hits an object, the robot stays still for 5 seconds and then automatically returns to its initial position. Run rosservice call /next_motion again to start the next experiment.

To launch the real robot, see Roboskel's UR3 repository.

Arguments

  • visual_input: True if visual input is used to produce the 2D pixels, either from the real camera or from a rosbag; False if already obtained 2D pixels are used.
  • live_camera: True if frames are generated by an RGB-D camera; False if they are generated by rosbags.
  • models_path: The absolute path of the models used for the prediction.

NOTE: live_camera needs to be set only if visual_input has been set to True.

Citation

If you want to cite this work, please use the following BibTeX entry:

@inproceedings{tsitos2021prediction,
  title={Real-time Feasibility of a Human Intention Method Evaluated Through a Competitive Human-Robot Reaching Game},
  author={Tsitos, Athanasios C and Dagioglou, Maria and Giannakopoulos, Theodoros},
  booktitle={17th ACM/IEEE International Conference on Human-Robot Interaction (HRI)},
  year={2022}
}
