realsense_explorer_bot

  • Autonomous environment-exploration mobile robot: a 3-DOF manipulator carrying an Intel RealSense D435i, mounted on a tracked differential-drive mobile base, fully controlled with ROS on a Jetson Nano board.
  • The robot is capable of mapping spaces, exploration through RRT, SLAM, and 3D pose estimation/localization of objects around it.
  • The robot produces odometry through an EKF (ekf_robot_localization) that fuses the IMU (MPU6050) data with the wheel-encoder odometry.
  • The robot uses the RealSense D435i RGB-D sensor, mounted on the manipulator for dexterous perception, with RViz visualization of the robot state, point cloud and the generated map.
  • The robot uses Real-Time Appearance-Based Mapping (the RTAB-Map ROS package) for SLAM.
  • The robot uses the Jetson Nano as its main computer, interfaced with the robot hardware (Arduino, motor controllers, MPU6050, servo controller) through a custom serial interface (not rosserial).
  • A PID controller implemented on the microcontroller sets the desired speed of each motor (a sketch of the control law follows this list).
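As a reference for the last point, here is a minimal, illustrative sketch of a per-motor PID speed loop in Python. The actual controller runs on the Arduino (realsense_explorer_control/arduino/robot_diff_drive.ino); the class name and gains below are placeholders, not the robot's tuning.

# Illustrative per-motor PID speed loop. The real controller runs on the Arduino
# (realsense_explorer_control/arduino/robot_diff_drive.ino); gains here are placeholders.
class MotorPID:
    def __init__(self, kp=1.2, ki=0.8, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_ticks_per_s, measured_ticks_per_s, dt):
        """Return a motor command (e.g. a PWM duty value) from the wheel-speed error."""
        error = target_ticks_per_s - measured_ticks_per_s
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative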

Multi Application Video

multi_application_captioned.mp4

Robot Description

The realsense_explorer_description package contains the robot's URDF files, a launch file for loading the robot description, the RViz config, and the robot state and joint state publishers.

roslaunch realsense_explorer_description robot_bringup.launch

  • Make sure to change the global fixed frame in RViz from base_footprint to odom or map after launching robot_control_ekf.launch described below.

Robot Control Node Graph with EKF Localization

The realsense_explorer_control package consists of the nodes and launch file required to interface the robot hardware with ROS.

roslaunch realsense_explorer_control robot_control_ekf.launch

[Node graph image: control_node_graph_main]

  • /serial_connection_RobotHW - The hardware-interface node, which communicates with the microcontroller over a serial port to read the robot's IMU data and the two wheel-encoder readings, and publishes the appropriate messages to the other nodes. It also sends the desired differential-drive wheel rates back to the microcontroller. (Arduino Uno code: realsense_explorer_bot/realsense_explorer_control/arduino/robot_diff_drive.ino)
  • /diff_drive_controller - The differential-drive inverse kinematics node, which receives /cmd_vel (linear x velocity, angular z velocity) and produces the required wheel speeds in encoder ticks per second (see the sketch after this list).
  • /odom_publisher - Produces wheel odometry from the encoder readings.
  • /robot_ekf_localization - Fuses the wheel-encoder odometry with the IMU data to produce the robot's EKF-filtered odometry.
  • /jointState_to_servos - Converts the three joint positions from /joint_state_publisher into PWM signals for the 3-DOF manipulator's servo motors, driven by a servo motor driver connected directly to the Jetson Nano over I2C.
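The sketch below illustrates the kind of inverse-kinematics computation /diff_drive_controller performs: turning /cmd_vel into per-wheel rates in encoder ticks per second. The node name, wheel geometry and encoder resolution are placeholder assumptions, not the robot's actual parameters.

#!/usr/bin/env python
# Illustrative differential-drive inverse kinematics:
# /cmd_vel (geometry_msgs/Twist) -> wheel rates in encoder ticks per second.
import math
import rospy
from geometry_msgs.msg import Twist

WHEEL_RADIUS = 0.04      # [m] assumed wheel/sprocket radius
WHEEL_SEPARATION = 0.20  # [m] assumed track width
TICKS_PER_REV = 360      # assumed encoder resolution

def cmd_vel_cb(msg):
    v, w = msg.linear.x, msg.angular.z
    # Wheel linear speeds for a differential drive
    v_left = v - w * WHEEL_SEPARATION / 2.0
    v_right = v + w * WHEEL_SEPARATION / 2.0
    # Convert m/s -> encoder ticks/s
    ticks_left = v_left / (2.0 * math.pi * WHEEL_RADIUS) * TICKS_PER_REV
    ticks_right = v_right / (2.0 * math.pi * WHEEL_RADIUS) * TICKS_PER_REV
    rospy.loginfo("wheel rates [ticks/s]: L=%.1f R=%.1f", ticks_left, ticks_right)

if __name__ == '__main__':
    rospy.init_node('diff_drive_ik_sketch')
    rospy.Subscriber('/cmd_vel', Twist, cmd_vel_cb)
    rospy.spin()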
Joystick Control (Xbox 360 Controller)
roslaunch realsense_explorer_control robot_joystick.launch
  • LSB - Robot Movement
  • RSB - Look Around (Moves the Camera Around)
  • LT and RT - Robot Neck Movement

Robot 3D Perception with Multi Object Tracking

The realsense_explorer_perception package contains all the perception-related nodes and launch files for 3D multi-object tracking, the point-cloud stream and RTAB mapping. Execute the following launch sequence for multi-object 3D tracking.

  • Start RGB-D Stream from Realsense D435i
roslaunch realsense_explorer_perception start_rs_camera.launch filters:=pointcloud
  • Start YOLOv3-based object detection on the RGB image.
roslaunch realsense_explorer_perception object_localization.launch 
  • Start the multi-object tracker node, which projects detected objects into 3D space and broadcasts them to the TF tree (a sketch of the projection step follows below).
rosrun realsense_explorer_perception yolo_MultiObject_track.py
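For intuition, here is a minimal sketch of the projection step: back-projecting a detection's pixel centre with its aligned depth into a 3D point and broadcasting it on TF. The intrinsics, pixel/depth values and frame names are placeholder assumptions; see yolo_MultiObject_track.py for the robot's actual method.

#!/usr/bin/env python
# Sketch: pixel + depth -> 3D point in the camera frame, broadcast on TF.
import rospy
import tf

# Assumed pinhole intrinsics of the aligned RGB/depth stream (placeholder values)
FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0

def pixel_to_point(u, v, depth_m):
    """Back-project pixel (u, v) with depth [m] into the camera frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return x, y, depth_m

if __name__ == '__main__':
    rospy.init_node('object_tf_sketch')
    broadcaster = tf.TransformBroadcaster()
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        # e.g. a 'cup' detected at pixel (350, 260) with 1.2 m depth
        x, y, z = pixel_to_point(350, 260, 1.2)
        broadcaster.sendTransform((x, y, z), (0, 0, 0, 1),
                                  rospy.Time.now(),
                                  'cup_0',        # child frame for the detected object
                                  'camera_link')  # parent frame (assumed)
        rate.sleep()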

Multi Object Tracking Demo Video

Multi_Object_Tracker_Video_compressed.mp4
  • Add the objects to be tracked when instantiating the MultiObject_Tracker class in the yolo_MultiObject_track.py node script at Line 96.
# Example
if __name__ == '__main__':
    rospy.init_node('MultiObject_Tracker')
    rate = rospy.Rate(10.0)
    # Add your objects to track here
    tracker = MultiObject_Tracker(obejcts_to_track=['person', 'cup', 'bottle', 'chair'])
    tracker.start_subscribers()
    time.sleep(3)  # allow the subscribers to receive their first messages
    while not rospy.is_shutdown():
        tracker.objects_tf_send()  # broadcast each tracked object's TF frame
        rate.sleep()               # throttle the loop to 10 Hz
The robot's object-tracking method can be understood in detail from github.com/fazildgr8/realsense_bot

Robot RTAB Mapping 3D Space/ 2D Grid Map Demo

The robot uses Real-Time Appearance-Based Mapping (rtabmap_ros) to build 3D maps and 2D occupancy grid maps. RTAB-Map can also run in localization mode.

  • Start RTAB Map for Mapping
roslaunch realsense_explorer_perception rtab_mapping.launch
  • Start RTAB Map for Localization
roslaunch realsense_explorer_perception rtab_mapping.launch localization:=true

RTAB Mapping Demo Video

RTAB_Mapping_demo_compressed.mp4

Robot Navigation

For the robot's navigation, a move_base launch file and the navigation parameters are provided in the realsense_explorer_navigation package under the config folder.

roslaunch realsense_explorer_navigation move_base.launch
  • This launch file starts the move_base node for the robot with the configured parameters. The global map is subscribed from the grid map generated by the RTAB-Map node.
  • Configuring the RTAB-Map launch for either mapping or localization mode is enough; move_base.launch can then be executed directly and used by setting a goal pose through RViz, i.e. navigation can be used even without a fully built map (see the sketch below for sending a goal programmatically).
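As an alternative to RViz's "2D Nav Goal" tool, a goal can be sent to move_base from a node via actionlib. The goal pose and frame below are illustrative, not part of the package.

#!/usr/bin/env python
# Minimal sketch of sending a navigation goal to move_base from code.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == '__main__':
    rospy.init_node('send_nav_goal')
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'   # grid map frame from RTAB-Map
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0     # example goal 1 m ahead
    goal.target_pose.pose.orientation.w = 1.0  # facing forward

    client.send_goal(goal)
    client.wait_for_result()
    rospy.loginfo("Navigation result state: %d", client.get_state())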

[Updates Coming Soon on using Navigation Stack with RRT for autonomous exploration]

Robot Package Dependencies

Notes on Remote Robot Control

  • The robot description and the RViz visualization can be brought up on a desktop computer running the ROS master (roscore), or vice versa.
  • The robot control and localization launch files should run on the Jetson Nano with its ROS_MASTER_URI set to the desktop computer's IP address (or vice versa).
  • The robot's 3-DOF manipulator can be controlled through the joint state publisher GUI or by nodes publishing joint angles to the /joint_states_ct topic (see the sketch after this list).
  • The robot can also be controlled using any joystick supported by Linux.
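A minimal sketch of commanding the manipulator from a node by publishing sensor_msgs/JointState to /joint_states_ct. The joint names and target angles below are assumptions; use the joint names from the URDF in realsense_explorer_description.

#!/usr/bin/env python
# Sketch: publish joint angles for the 3-DOF manipulator to /joint_states_ct.
import rospy
from sensor_msgs.msg import JointState

if __name__ == '__main__':
    rospy.init_node('manipulator_command_sketch')
    pub = rospy.Publisher('/joint_states_ct', JointState, queue_size=10)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        msg = JointState()
        msg.header.stamp = rospy.Time.now()
        msg.name = ['joint_1', 'joint_2', 'joint_3']  # assumed joint names
        msg.position = [0.0, 0.5, -0.5]               # target angles [rad]
        pub.publish(msg)
        rate.sleep()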
