Experimental Robotics - The ROS-Based Surveillance Robot


Name: Yeshwanth Guru Krishnakumar

Reg No: 5059111

Email: yeshwanth445@gmail.com

Check out the Experimental_assignment2_Documentation for my latest robotics project:

Experimental_assignment2_Documentation

INTRODUCTION:

Welcome to my latest ROS project, an investigation-inspired scenario that demonstrates the power of ROS packages in robotics! The project features an interactive scenario with an ontology-based surveillance robot. Check out the video below for a sneak peek:

out.mp4.1.mp4

SOFTWARE ARCHITECTURE:

This project features a robust software architecture that utilizes a range of cutting-edge technologies and methodologies to create a seamless user experience. Check out the diagrams below to see the high-level structure of the system:

UML Diagram:



UML diagram (drawio)



ARCHITECTURE WORKING PROCESS:

ROBOT_STATE_MACHINE:

This script is a ROS-based state machine that lets the robot navigate a topological map of its environment. It loads the ontology map of the environment, initializes the robot's sensors, and defines a state machine with three states: Load_Environment, Normal_mode, and Emergency_mode.

In the Normal_mode state, the robot chooses the next corridor to move through or switches to Emergency_mode. In the Navigationway state, the robot navigates through the selected corridor. In the Emergency_mode state, the robot navigates to the nearest emergency location.

The robot's sensors are instantiated in the main() function, and the state machine is initialized using the StateMachine() class. Transitions between states are defined using the add() and add_transition() methods of the StateMachine class. The script uses the ArmorClient library to manipulate the ontology map of the environment.

The script also defines a function that loads the ontology map from a file and applies buffered changes to the ontology, and a function that saves the ontology file when the script ends. It uses the rospy library to initialize a ROS node and log information, and the smach library to define and execute the state machine.
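For readers new to SMACH, here is a minimal sketch of how such a state machine could be wired up. The state names follow the description above, but the state bodies, outcome labels, and the emergency transition are illustrative placeholders rather than the repository's exact code:

```python
#!/usr/bin/env python
# Minimal SMACH sketch mirroring the states described above (illustrative only).
import rospy
import smach

class LoadEnvironment(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['loaded'])
    def execute(self, userdata):
        rospy.loginfo('Loading the ontology map (e.g. via ArmorClient)...')
        return 'loaded'

class NormalMode(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['corridor_chosen', 'emergency'])
    def execute(self, userdata):
        rospy.loginfo('Choosing the next corridor to visit...')
        rospy.sleep(1.0)
        return 'corridor_chosen'   # or 'emergency' when an urgent condition is detected

class EmergencyMode(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['handled'])
    def execute(self, userdata):
        rospy.loginfo('Navigating to the nearest emergency location...')
        return 'handled'

def main():
    rospy.init_node('robot_state_machine_sketch')
    sm = smach.StateMachine(outcomes=['done'])
    with sm:
        smach.StateMachine.add('Load_Environment', LoadEnvironment(),
                               transitions={'loaded': 'Normal_mode'})
        smach.StateMachine.add('Normal_mode', NormalMode(),
                               transitions={'corridor_chosen': 'Normal_mode',
                                            'emergency': 'Emergency_mode'})
        smach.StateMachine.add('Emergency_mode', EmergencyMode(),
                               transitions={'handled': 'Normal_mode'})
    sm.execute()   # the surveillance loop runs indefinitely in this sketch

if __name__ == '__main__':
    main()
```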

ROBOT_BRAIN:

The robot follows a plan consisting of random points. The plan is generated by a Planning Action Server, and the robot's movement is controlled by a Controlling Action Server. The robot works through the plan by reaching each point after a random delay, drawn between two values provided as parameters to the Controlling Action Server.

The start and end points of the plan are set using a service called SetPose, and the server checks that the points lie within the environment limits. The robot's current position is also set using SetPose. The system logs messages at each step to keep track of the robot's progress.

The robot's mission is to explore the environment and reach its destination safely. It moves through the environment, encountering obstacles and making decisions on the fly; its sensors detect the surroundings and inform those decisions.
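As a rough illustration of the per-waypoint random delay described above, the controlling server could look something like the sketch below. The parameter names and the waypoint list are assumptions made for this example, not the repository's exact interface:

```python
#!/usr/bin/env python
# Illustrative only: simulate motion by waiting a random time per waypoint,
# bounded by two ROS parameters (names assumed for this sketch).
import random
import rospy

def simulate_plan_execution(waypoints):
    t_min = rospy.get_param('~motion_time_min', 0.1)   # lower bound in seconds
    t_max = rospy.get_param('~motion_time_max', 0.5)   # upper bound in seconds
    for x, y in waypoints:
        delay = random.uniform(t_min, t_max)
        rospy.sleep(delay)                              # pretend the robot is travelling
        rospy.loginfo('Reached point (%.2f, %.2f) after %.2f s', x, y, delay)

if __name__ == '__main__':
    rospy.init_node('controlling_server_sketch')
    simulate_plan_execution([(0.0, 0.0), (1.0, 2.0), (3.0, 1.5)])
```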

ROBOT_BRAIN2:

This script defines a class ActionClientcortex that wraps a SimpleActionClient to send and cancel goals on a specified action server. It also defines a class sensory1 that acquires locations from the ontology database using ArmorClient, determines the robot's current location, and chooses a target location based on the robot's position, the reachable locations, and their urgency.

In the working scenario, the robot uses sensory1 to determine its current location and choose a target location. ActionClientcortex then sends a control action to move the robot to that target. The robot moves to the target location, and the process repeats until a stopping condition is met.

During execution, the system logs messages to keep track of the robot's progress. The robot's position and the target location are updated at each iteration to ensure the robot reaches the desired location. The urgency of the target location can affect the robot's movement, and the system is designed to handle emergency situations accordingly.
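The target-selection idea can be summarised as: prefer reachable urgent locations, otherwise fall back to a reachable corridor. A simplified, self-contained sketch follows; the helper name and the preference order are assumptions based on the description above:

```python
import random

def choose_target(reachable, urgent, corridors):
    """Pick the next location: reachable urgent rooms first, then reachable
    corridors, then anything reachable. 'reachable' is assumed non-empty."""
    urgent_reachable = [loc for loc in reachable if loc in urgent]
    if urgent_reachable:
        return random.choice(urgent_reachable)
    corridor_reachable = [loc for loc in reachable if loc in corridors]
    if corridor_reachable:
        return random.choice(corridor_reachable)
    return random.choice(reachable)

# Example: R1 is urgent and reachable, so it is preferred over corridor C1.
print(choose_target(['C1', 'R1'], urgent=['R1'], corridors=['C1', 'C2']))
```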

bug_m:

This script implements a bug algorithm for a mobile robot.

The algorithm allows the robot to navigate towards a desired position while avoiding obstacles in real time using laser range data.

It uses ROS for communication between nodes, message passing, and service calls, which makes it suitable for autonomous navigation in settings such as warehouses, factories, and search-and-rescue operations.

With this algorithm, the robot can safely navigate around obstacles while still reaching its target destination.
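A heavily simplified sketch of the switching idea behind a bug algorithm: drive towards the goal while the path ahead is clear, and follow the obstacle boundary otherwise. The threshold values and behaviour names below are illustrative, not the node's actual parameters:

```python
# Illustrative bug-style behaviour switching based on the closest laser reading ahead.
OBSTACLE_THRESHOLD = 0.6   # metres (assumed)
GOAL_TOLERANCE = 0.3       # metres (assumed)

def choose_behaviour(front_distance, distance_to_goal):
    """Decide which behaviour should be active for the next control cycle."""
    if distance_to_goal < GOAL_TOLERANCE:
        return 'done'            # close enough: stop
    if front_distance < OBSTACLE_THRESHOLD:
        return 'wall_follow'     # obstacle ahead: go around it
    return 'go_to_point'         # free path: head straight for the goal

print(choose_behaviour(front_distance=0.4, distance_to_goal=2.5))  # -> wall_follow
```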

NAV:

This script controls a mobile robot so that it moves towards a given point in space in real-time applications.

The program has three states that work together to guide the robot towards the target position.

State 1: Rotate towards the goal position

The program calculates the error between the desired orientation and the robot's current orientation and sends a Twist message that rotates the robot in the direction that reduces the error. The robot keeps rotating until the error falls below the threshold yaw_precision_2_.

State 2: Move straight ahead

The program calculates the error between the desired position and the robot's current position and sends a Twist message that drives the robot forward in the direction that reduces the error. The robot keeps moving forward until it is within the distance threshold dist_precision_.

State 3: Goal reached

The program sends a Twist message to stop the robot. If the robot is too far from the goal position, the program returns to state 1 and starts again. The node also listens to a service call that starts and stops the robot's movement and to a topic that provides the goal position.

Running at a rate of 20 Hz, this node is suitable for autonomous navigation in real-time settings such as warehouses, factories, and search-and-rescue operations.
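A condensed sketch of the three-state logic is shown below; the gains, thresholds, and state names are illustrative, and the real node also handles the start/stop service and the goal topic:

```python
# Illustrative go-to-point control step: rotate towards the goal, drive, then stop.
import math
from geometry_msgs.msg import Twist

yaw_precision_2_ = math.pi / 90   # ~2 degrees (assumed)
dist_precision_ = 0.3             # metres (assumed)

def control_step(state, robot_x, robot_y, robot_yaw, goal_x, goal_y):
    """Return (Twist command, next state) for one control cycle at 20 Hz."""
    cmd = Twist()
    desired_yaw = math.atan2(goal_y - robot_y, goal_x - robot_x)
    err_yaw = desired_yaw - robot_yaw
    err_pos = math.hypot(goal_x - robot_x, goal_y - robot_y)

    if state == 'fix_yaw':
        if abs(err_yaw) > yaw_precision_2_:
            cmd.angular.z = 0.7 if err_yaw > 0 else -0.7
            return cmd, 'fix_yaw'
        return cmd, 'go_straight'
    if state == 'go_straight':
        if err_pos > dist_precision_:
            cmd.linear.x = 0.6
            next_state = 'go_straight' if abs(err_yaw) <= yaw_precision_2_ else 'fix_yaw'
            return cmd, next_state
        return cmd, 'done'
    return cmd, 'done'   # 'done': the zero Twist stops the robot
```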

SEND GOAL TO ARM:

This Python script controls a robotic arm using the MoveIt! package and ROS. Here is how it works:

Initializes the ROS and MoveIt! nodes. ๐Ÿš€๐ŸŒŸ This sets up the environment for controlling the robotic arm. Instantiates a MoveGroupCommander object for the robotic arm. ๐Ÿค–๐Ÿ’ป This object allows the program to control the arm's movements. Sets named targets for the arm to move to. ๐ŸŽฏ๐Ÿ‘€ These targets represent specific positions for the arm to move to, such as "home", "left_45", "horse", etc. Plans and executes a trajectory to reach each target. ๐Ÿ›ฃ๏ธ๐Ÿค– The program uses the MoveIt! package to plan a trajectory for the arm to reach each target position and then executes the trajectory to move the arm. Shuts down the MoveIt! and ROS nodes. ๐Ÿ”Œ๐Ÿ‘‹ Once the arm has completed all movements, the program shuts down the MoveIt! and ROS nodes. Initializes a new ROS node. ๐Ÿš€๐ŸŒŸ This sets up a new node for the program to perform additional tasks. Publishes a boolean message on the "/decision" topic. ๐Ÿ“ก๐Ÿ’ฌ The program publishes a message to the "/decision" topic. Sleeps for 1 second to wait for the publisher to initialize. ๐Ÿ’คโฐ This ensures that the publisher has time to initialize before continuing with the program. Sets the boolean message to True and publishes it. โœ…๐Ÿ“ก The program sets the message to True and publishes it to the "/decision" topic. Signals to ROS that the program is done. ๐Ÿ›‘๐Ÿค– The program signals to ROS that it has completed all tasks.

WALL_FOLLOW:

This Python script implements a wall-following behavior for a robot using laser sensors in ROS. The script defines functions to control the robot's linear and angular velocities and a class to send control actions that move the robot. The main() function initializes the ROS node and creates the publishers, subscribers, and a service to switch the wall-follower on and off. The callback clbk_laser() updates the distances to obstacles in different directions. take_action() determines the robot's state from those distances, calls the appropriate function to generate the Twist message, and changes the robot's state. find_wall() makes the robot move forward and turn left until it finds a wall, turn_left() makes the robot turn left, and follow_the_wall() keeps the robot moving along the wall.

Screenshot from 2023-05-08 11-22-26

1_MeLo1SEz.mp4

Screenshot from 2023-05-08 00-24-05 Screenshot from 2023-05-08 11-53-32
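A rough sketch of the decision step inside take_action(): the laser scan is summarised into per-sector minimum distances, and the closest readings pick the behaviour. The region names and the distance threshold here are illustrative:

```python
# Illustrative wall-follower decision step based on sector distances from the laser.
def take_action(regions, obstacle_dist=1.0):
    """regions: dict of minimum laser distances for 'front', 'fleft' and 'fright'.
    Returns the behaviour to run: find_wall, turn_left or follow_the_wall."""
    front_clear = regions['front'] > obstacle_dist
    fleft_clear = regions['fleft'] > obstacle_dist
    fright_clear = regions['fright'] > obstacle_dist

    if front_clear and fleft_clear and fright_clear:
        return 'find_wall'        # nothing nearby: drive and curve until a wall appears
    if not front_clear:
        return 'turn_left'        # wall ahead: rotate away from it
    if not fright_clear:
        return 'follow_the_wall'  # wall on the right: keep it at a constant distance
    return 'find_wall'

print(take_action({'front': 2.0, 'fleft': 2.0, 'fright': 0.6}))  # -> follow_the_wall
```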

Setup and Working Process:

This project is developed using Docker, with all the necessary dependencies pre-installed. If you prefer not to use Docker and would rather install the dependencies manually, you will need ROS Noetic, ARMOR, and SMACH. Once the dependencies are installed, clone the following repository into your workspace:

                                         https://github.com/yeshwanthguru/Experimental-Robotics-2.git

Load the OWL file as described in the script, manually adding the path to the OWL file at your convenience. Then run:

                                            catkin_make

Once everything builds, run the simulation by executing the following roslaunch and Python commands in three separate terminals, in order:

                   Terminal 1:    roslaunch expo_assignment_1 survailence_robot.launch
                   Terminal 2:    roslaunch assignment2 assignment.launch
                   Terminal 3:    python send_goal_to_arm.py

Once the planning has been executed, a message is published on the /decision topic to the state machine, which then carries out the rest of the system's workflow based on it. MoveIt is used for the motion planning; the corresponding files can be found in the folder ass2.

MoveIt

Here are the benefits of using MoveIt in ROS Noetic:

Simplified motion planning: MoveIt provides a simplified interface for motion planning that allows the user to specify a goal configuration and generate a trajectory to reach that configuration.

Support for various types of robots: MoveIt supports a wide range of robots, including manipulators, mobile robots, and humanoid robots, and provides a unified interface to control their motion.

Integration with ROS: MoveIt is designed to work seamlessly with ROS, allowing users to leverage other ROS packages and tools for perception, navigation, and control.

Modular design: MoveIt has a modular design, which makes it easy to add new capabilities or modify existing ones.

Support for multiple kinematic solvers: MoveIt supports multiple kinematic solvers, making it possible to switch between solvers based on the robot's capabilities or the application requirements.

Advanced collision detection: MoveIt provides advanced collision detection algorithms that can efficiently check for collisions between the robot and the environment.

Trajectory optimization: MoveIt can optimize trajectories to minimize joint accelerations, joint velocities, or other criteria, resulting in smoother and more natural motion.

Visualization tools: MoveIt provides visualization tools that allow the user to visualize the robot's motion and the environment, making it easier to debug and validate the robot's behavior.

Integration with Gazebo: MoveIt can be integrated with Gazebo, a popular robot simulator, allowing users to simulate and test their motion planning algorithms in a virtual environment.

Large user community: MoveIt has a large and active user community that provides support, documentation, and examples, making it easier for new users to get started and for experienced users to share their knowledge and best practices.

Results

The results initially obtained in the simulation environment are shown in the attached videos and screenshots above.

System Limitations

Our system currently has a few limitations that users should be aware of: the ontology needs to be loaded manually, and the possible hypotheses are limited.

Improvements

This assignment is a promising starting point for a ROS-based state machine that enables a robot to navigate a topological map of an environment using an ontology. However, several technical improvements could enhance its capabilities, including:

Sensor Fusion: By combining data from multiple sensors, such as cameras, lidars, and IMUs, the robot can create a more accurate map of the environment, improving its navigation and decision-making capabilities.

Machine Learning Integration: Incorporating machine learning algorithms can enable the robot to learn from its experiences and optimize its navigation strategy based on the environment it operates in.

Multi-Robot Coordination: By coordinating with other robots in the same environment, the robot can avoid collisions and optimize its path planning, ultimately improving overall efficiency.

Online Map Updating: Updating the ontology map in real time can help the robot adapt to changes in the environment, such as moving obstacles or dynamic objects, making its navigation more accurate and reliable.

Emergency Response Planning: Enhancing the emergency mode by incorporating more complex planning and response strategies can help the robot respond to emergency situations more efficiently, potentially saving lives.

ADDITIONAL IMPROVEMENT

By leveraging quantum-inspired algorithms, it is possible to implement autonomous navigation and decision-making capabilities in mobile robots, allowing them to operate in complex and dynamic environments with greater efficiency and accuracy. This can open up new possibilities for applications such as search and rescue, industrial automation, and more. With ongoing research and development in this area, the potential for quantum-inspired robotics is only set to grow. This has been explored by integrating ROS with Qiskit, inspired by http://www.quantum-robot.org/.

The following task can be executed by the decision-making node (Decision-make node),

which currently integrates a single sensory input for reference; multi-sensory integration can be added depending on the scenario to be achieved. By making these improvements, we aim to provide a more attractive and user-friendly system. Our goal is to develop bio-inspired mobile navigation; this proposal is for a radical, bio-inspired architecture approach.
