DOPE.pytorch

This is an unofficial PyTorch implementation of DOPE (Deep Object Pose Estimation for Semantic Robotic Grasping of Household Objects), trained on a self-created synthetic bottle dataset.

Requirements

$ conda create -n DOPE python=3.6
$ conda activate DOPE
$ pip install -r requirement.txt
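
A quick sanity check after installation, assuming requirement.txt pins a CUDA-enabled PyTorch build (the exact pinned versions are not shown here):

import torch

print(torch.__version__)          # PyTorch version installed from requirement.txt
print(torch.cuda.is_available())  # True if a CUDA device is visible to the demo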

Usage

Train and Eval

[WIP]

Demo

  • Checkpoint: place the trained checkpoint(s) under logs/

    DOPE.pytorch
      - logs
        - Jack_Daniels-checkpoint.pth
        - Jose_Cuervo-checkpoint.pth

  • Data: place the image sequence and its settings files under data/ (see the pose-recovery sketch after this list for how these settings are typically used)

    DOPE.pytorch
      - data
        - Real_bottle_sequence
          - 000001.jpg
          - 000002.jpg
          ...
          - _object_settings.json
          - _camera_settings.json

  • Run

    python demo.py \
      --path_to_data_dir ./data/Real_bottle_sequence \
      --class_name Jack_Daniels \
      --checkpoint ./logs/Jack_Daniels-checkpoint.pth \
      --plot
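
At inference time DOPE predicts belief maps for nine keypoints per object (the eight projected cuboid corners plus the centroid) and recovers the 6-DoF pose with PnP. The sketch below shows only that last step, under assumptions not confirmed by this repo: the settings files follow an NDDS-style layout (intrinsics under camera_settings[0].intrinsic_settings, cuboid size in a cuboid_dimensions entry of _object_settings.json), and the 2-D keypoints have already been extracted from the belief maps. The function names and corner ordering are hypothetical, not demo.py's API.

# Minimal DOPE-style pose-recovery sketch (not this repo's demo.py):
# solve a PnP problem between the 9 detected keypoints and the object's
# 3-D cuboid model to obtain a 6-DoF pose.
import json
import numpy as np
import cv2


def load_camera_matrix(path="./data/Real_bottle_sequence/_camera_settings.json"):
    # Assumes NDDS-style "camera_settings[0].intrinsic_settings" with fx/fy/cx/cy.
    with open(path) as f:
        intr = json.load(f)["camera_settings"][0]["intrinsic_settings"]
    return np.array([[intr["fx"], 0.0,        intr["cx"]],
                     [0.0,        intr["fy"], intr["cy"]],
                     [0.0,        0.0,        1.0]], dtype=np.float64)


def cuboid_points_3d(dimensions):
    # 8 cuboid corners + centroid in the object frame; the corner ordering must
    # match whatever ordering the network was trained to predict.
    dx, dy, dz = (d / 2.0 for d in dimensions)
    corners = [[sx * dx, sy * dy, sz * dz]
               for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)]
    return np.array(corners + [[0.0, 0.0, 0.0]], dtype=np.float64)


def solve_pose(keypoints_2d, cuboid_dimensions, camera_matrix):
    # keypoints_2d: (9, 2) pixel coordinates extracted from the belief maps.
    object_points = cuboid_points_3d(cuboid_dimensions)
    ok, rvec, tvec = cv2.solvePnP(object_points,
                                  np.asarray(keypoints_2d, dtype=np.float64),
                                  camera_matrix, None)
    return ok, rvec, tvec  # rotation (Rodrigues vector) and translation

If a full object-to-camera transform is needed, cv2.Rodrigues(rvec) converts the returned rotation vector into a 3x3 rotation matrix.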

Reference

  • Tremblay, To, Sundaralingam, Xiang, Fox, Birchfield. "Deep Object Pose Estimation for Semantic Robotic Grasping of Household Objects." CoRL 2018.
  • Official implementation: https://github.com/NVlabs/Deep_Object_Pose
