
Agroecology-Lab/Open-Weeding-Delta

 
 


Hardware Specifications:

An updated hardware specification is in progress

Project Details

Inspired by the Nindamani weed removal robot

This project aims to improve on it using the hardware linked above

Open Weeding Delta autonomously detects and segments weeds from crops using artificial intelligence. It is built on ROS2. Open Weeding Delta can be used in any early-stage crop for autonomous weeding with mechanical or laser actuators.

| Parameter | Value |
| --- | --- |
| Robotics OS | ROS2 Humble |
| System | Ubuntu 22.04 LTS |
| Kernel | Real-time kernel |
| Communication | Wireless, CAN bus, UART (internal motor control) |
| Vision | DepthAI-ROS |
| AI framework | Keras |
| Object instance segmentation | Mask R-CNN |
| Programming languages | Python 3 & C |
| Parameter | Value |
| --- | --- |
| Degrees of freedom | 3 DOF |
| Positioning error | ? mm |
| Payload | 0.5 kg |
| Weight | 8 kg |
| Height | TBC to TBC mm |
| Width | TBC mm |
| Arm reach | TBC sq mm |
| Processor board | Jetson Nano Dev Kit |
| Microcontroller | TBC |
| Stepper motor / BLDC | 48 V, 6 A, NEMA 34, 87 kg·cm holding torque |
| Camera | TBC |
| Wi-Fi card | Intel 8265 |
| USB-TTL cable | PL2303HX chip |
| Battery | 48 V 30 Ah |

Datasets

  • Latvia: 1,118 annotated images; 7,442 weed images (8 species), 411 crop images (6 crops)
  • 960 unique plants belonging to 12 species at several growth stages
  • V1 Plant seedlings (1.8 GB)
  • V2 Plant seedlings (2 GB)
  • 5,000 seedling images
  • CropDeep seems useful, but the data does not appear to be available
  • CFWD: 242 annotated images
  • 1,300 images of sesame/weeds
  • 1,700 images of weeds native to Australia
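
These datasets would feed the training of the Mask R-CNN weed segmentation model used below. As a rough orientation, a training configuration for the matterport Mask_RCNN package could look like the sketch below; the single weed class, the step count and the confidence threshold are placeholder assumptions, not values taken from this project.

    # Sketch of a training config for the matterport Mask R-CNN package.
    # The single "weed" class and the numeric values are placeholders;
    # adjust them to whichever dataset above is actually used.
    from mrcnn.config import Config

    class WeedConfig(Config):
        NAME = "weed"
        NUM_CLASSES = 1 + 1              # background + weed
        GPU_COUNT = 1
        IMAGES_PER_GPU = 1               # keep small for the Jetson Nano
        STEPS_PER_EPOCH = 100            # placeholder; derive from dataset size
        DETECTION_MIN_CONFIDENCE = 0.8   # minimum confidence to keep a detection

    config = WeedConfig()
    config.display()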

Features:

  • Fully ROS2 compatible
  • Battery operated
  • Runtime up to 8-10 hours
  • Robotic-arm-based weed removal
  • Weed detection accuracy up to 85%
  • Easy to operate

Packages

In this section we install all the necessary dependencies to launch the Nindamani robot; a minimal node sketch follows the package list:

  • nindamani_agri_robot - integrates all launch nodes of the Nindamani robot
  • rpicam_ai_interface - controls the RPi camera through the AI interface
  • servo_control - controls the servo motors through a ROS2 interface
  • stepper_control - controls multiple stepper motors through a ROS2 interface
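
For orientation, a node in one of these packages could look roughly like the sketch below. This is not the actual interface of servo_control; the topic name servo/target_angle and the Float32 message type are assumptions used only for illustration.

    # Minimal ROS2 (rclpy) node sketch modelled on servo_control.
    # Topic name and message type are illustrative assumptions.
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import Float32

    class ServoControl(Node):
        def __init__(self):
            super().__init__('servo_control')
            # Listen for a target angle in degrees and forward it to the servo driver.
            self.create_subscription(Float32, 'servo/target_angle', self.on_angle, 10)

        def on_angle(self, msg):
            self.get_logger().info(f'Moving servo to {msg.data:.1f} deg')
            # The hardware write (PWM or serial to the microcontroller) would go here.

    def main():
        rclpy.init()
        rclpy.spin(ServoControl())
        rclpy.shutdown()

    if __name__ == '__main__':
        main()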

Installation on Jetson Nano Dev Kit

1. Ubuntu

2. ROS2 (Foxy)

3. ROS2-Tensorflow

4. Depthai

5. Arduino

6. Wifi

Create ROS2 Workspace

  • Follow these steps:
  mkdir -p ~/nindamani_ws/src
  cd ~/nindamani_ws
  colcon build
  cd src
  git clone https://github.com/AutoRoboCulture/nindamani-the-weed-removal-robot.git

Clone the Mask R-CNN GitHub Repository:

  1. Code: git clone https://github.com/BupyeongHealer/Mask_RCNN_tf_2.x
  2. Copy the cloned repo into the rpicam_ai_interface package: cp -r Mask_RCNN_tf_2.x rpicam_ai_interface/Mask_RCNN
  3. Run command:
    • cd rpicam_ai_interface/Mask_RCNN
    • sudo python3 setup.py install
  4. Confirm the library was installed: pip3 show mask-rcnn
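
Since a package installed via setup.py may not always show up under pip, importing it directly is a more reliable check; the module names below follow the matterport layout:

    # Verify that the Mask R-CNN package is importable after installation.
    import mrcnn
    import mrcnn.model as modellib   # also pulls in TensorFlow/Keras
    print(mrcnn.__file__)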

Download preTrained Model weights

  • Link for MASK-RCNN preTrained model
  • Copy preTrained weights to rpicam_ai_interface package:
    mkdir rpicam_ai_interface/preTrained_weights
    cp mask_rcnn_trained_weed_model.h5 rpicam_ai_interface/preTrained_weights/.
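
Once the weights are in place, inference can be sanity-checked with a short script. This is a minimal sketch assuming the matterport Mask R-CNN API; the inference config mirrors the training sketch in the Datasets section and the image path is a placeholder.

    # Minimal inference sketch for the downloaded weed weights.
    import skimage.io
    import mrcnn.model as modellib
    from mrcnn.config import Config

    class InferenceConfig(Config):
        NAME = "weed"
        NUM_CLASSES = 1 + 1   # background + weed (assumption)
        GPU_COUNT = 1
        IMAGES_PER_GPU = 1

    model = modellib.MaskRCNN(mode="inference", config=InferenceConfig(),
                              model_dir="logs")
    model.load_weights("rpicam_ai_interface/preTrained_weights/"
                       "mask_rcnn_trained_weed_model.h5", by_name=True)

    image = skimage.io.imread("sample_field_image.jpg")   # placeholder image
    result = model.detect([image], verbose=0)[0]
    print(result["rois"], result["class_ids"], result["scores"])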
    

The folder structure should look like this:

nindamani_ws
├── build
├── install
├── log
└── src
  ├── nindamani_agri_robot
  │   ├── launch
  │   └── scripts
  ├── rpicam_ai_interface
  │   ├── scripts
  │   ├── preTrained_weights
  │   └── Mask_RCNN
  ├── servo_control
  │   ├── config
  │   ├── scripts
  │   └── srv
  └── stepper_control
      ├── config
      ├── scripts
      ├── src
      └── srv

Compile nindamani_ws

  • Follow these steps:
    cd nindamani_ws
    colcon build
    

Dependency

Stepper Motor library implementation on Arduino
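
On the Jetson side, commands reach the Arduino over the USB-TTL (PL2303HX) serial link listed in the hardware table. The sketch below shows the general shape of such a bridge; the port, baud rate and command format are assumptions, not the actual protocol used by stepper_control.

    # Sketch of a serial bridge from the Jetson to the Arduino stepper firmware.
    # Port, baud rate and the "M1 200" command format are illustrative assumptions.
    import serial

    with serial.Serial('/dev/ttyUSB0', 115200, timeout=1) as port:
        port.write(b'M1 200\n')      # hypothetical: move axis 1 by 200 steps
        reply = port.readline()      # wait for the firmware's acknowledgement
        print(reply.decode(errors='replace').strip())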

Launch nindamani robot

  • Make sure setup.bash is sourced in your .bashrc before running the ROS2 launch command: echo "source /home/<user-name>/nindamani_ws/install/setup.bash" >> ~/.bashrc
  • ROS2 Launch command: ros2 launch nindamani_agri_robot nindamani_agri_robot.launch.py
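
For reference, a ROS2 Python launch file of the kind invoked above generally looks like the sketch below; the executable names are illustrative and should be checked against the scripts actually installed by the packages.

    # Sketch of a launch file in the style of nindamani_agri_robot.launch.py.
    # Executable names are assumptions; verify them against the real packages.
    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            Node(package='rpicam_ai_interface', executable='rpicam_ai_interface', output='screen'),
            Node(package='servo_control', executable='servo_control', output='screen'),
            Node(package='stepper_control', executable='stepper_control', output='screen'),
        ])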

Demo video | Proof of Concept


Potential Improvements

We have presented a concept of how weeds can be detected among crops using artificial intelligence and removed autonomously with a delta-arm robot. It is not perfect, as the demo video shows, but it can be improved. Here are some ideas for improving this robot in the future:

  • Enhance the gripper design with an arrow-shaped end tip.
  • Improve the delta arm's reach with higher-torque stepper motors.
  • Add RTK-GPS and a 4-wheel-drive, 4-wheel-steering base so the whole robot can operate autonomously.
  • Add 3D mapping of the field using lidar to capture height variations of crops, weeds and ridges.

References

  1. Mask R-CNN for Object Detection and Segmentation
     @misc{matterport_maskrcnn_2017,
       title={Mask R-CNN for object detection and instance segmentation on Keras and TensorFlow},
       author={Waleed Abdulla},
       year={2017},
       publisher={GitHub},
       journal={GitHub repository},
       howpublished={\url{https://github.com/matterport/Mask_RCNN}},
     }
  2. Train Mask-RCNN model on Custom Dataset for Multiple Objects
  3. Delta Robot Simulation on Gazebo using MARA-Env

Developers' Contact Details

Kevin Patel
Nihar Chaniyara
Email: autoroboculture@gmail.com

# Delta notes

[Inverse Kinematics](https://github.com/giridharanponnuvel/Delta-Robot-Inverse-Kinematics)

[Commercial Igus with 3x linear actuator](https://www.igus.co.uk/product/20433?artNr=DLE-DR-0001)

[Delta X1](https://store.deltaxrobot.com/products/delta-x-basic-kit) 

[TlAlexander planetary gear](https://github.com/tlalexander/brushless_robot_arm#readme)
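
For a feel of the math behind the Inverse Kinematics link above, here is a minimal sketch of the standard closed-form delta-robot IK; all geometry values are placeholders, not measurements of this robot.

    # Closed-form inverse kinematics for a rotary delta robot (standard derivation).
    # F/E are the base/effector triangle side lengths, RF/RE the upper/lower arm
    # lengths, all in mm; every value below is a placeholder.
    from math import sqrt, atan, degrees

    F, E, RF, RE = 200.0, 60.0, 150.0, 400.0
    TAN30 = 1 / sqrt(3)

    def _angle_yz(x0, y0, z0):
        """Motor angle (deg) for one arm in its own YZ plane, or None if unreachable."""
        y1 = -0.5 * TAN30 * F            # base joint position
        y0 = y0 - 0.5 * TAN30 * E        # shift effector centre to its edge
        a = (x0 * x0 + y0 * y0 + z0 * z0 + RF * RF - RE * RE - y1 * y1) / (2 * z0)
        b = (y1 - y0) / z0
        d = -(a + b * y1) ** 2 + RF * (b * b * RF + RF)   # discriminant
        if d < 0:
            return None
        yj = (y1 - a * b - sqrt(d)) / (b * b + 1)
        zj = a + b * yj
        return degrees(atan(-zj / (y1 - yj))) + (180.0 if yj > y1 else 0.0)

    def inverse_kinematics(x, y, z):
        """Three motor angles (deg) for an effector position, or None if unreachable."""
        c120, s120 = -0.5, sqrt(3) / 2
        angles = (
            _angle_yz(x, y, z),
            _angle_yz(x * c120 + y * s120, y * c120 - x * s120, z),
            _angle_yz(x * c120 - y * s120, y * c120 + x * s120, z),
        )
        return None if None in angles else angles

    print(inverse_kinematics(0.0, 0.0, -300.0))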
