Add ScanNet Dataset #35
http://www.scan-net.org/

ScanNet is one of the most widely used 3D datasets for semantic and instance segmentation. The dataloader should support both tasks; perhaps the semantic segmentation loader can be inherited to produce an instance segmentation loader.

@CCInc or @leo-stan, any suggestions?

Comments

Sounds great! I'm not sure what the best way to share the dataset between the two tasks is, but for now it can be implemented on the semantic segmentation task with a parameter for using the dataset in "instance" mode; once we have instance segmentation models, we can refactor it as we see fit.

IIRC ScanNet is a complicated dataset to preprocess, which has led me to avoid it in the past. I know torch-points3d had a ScanNet dataset (https://github.com/torch-points3d/torch-points3d/blob/master/torch_points3d/datasets/segmentation/scannet.py), but I'm not sure what state it's in. Open3D-ML also seems to have an implementation:

https://github.com/isl-org/Open3D-ML/blob/master/ml3d/datasets/scannet.py
https://github.com/isl-org/Open3D-ML/blob/8ddb67206e4fef55b39eea691ff00d49cef18be5/scripts/preprocess_scannet.py

I think this would be a good candidate for preprocessing and storing in torch-geometric format (see https://pytorch-geometric.readthedocs.io/en/latest/notes/create_dataset.html#creating-larger-datasets), just like torch-points3d does. If you're interested in implementing it, I think using the torch-points3d dataset as a base would be a good start.

Thanks @CCInc. I'll start implementing the ScanNet data loader. Please assign this issue to me.

Awesome @jaswanthbjk!
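The "instance mode parameter" idea discussed in the thread could look something like the sketch below. All class and field names here are hypothetical illustrations of the design, not the project's real API: one dataset class serves both tasks, and a `mode` flag decides which labels each sample exposes.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ScanNetSample:
    # Hypothetical per-scan record; real ScanNet samples carry much more.
    points: List[List[float]]      # XYZ coordinates, one triple per point
    semantic_labels: List[int]     # one class id per point
    instance_labels: List[int]     # one instance id per point

class ScanNetDataset:
    """Serves scans in either 'semantic' or 'instance' mode (illustrative)."""

    def __init__(self, samples: List[ScanNetSample], mode: str = "semantic"):
        if mode not in ("semantic", "instance"):
            raise ValueError(f"unknown mode: {mode!r}")
        self.samples = samples
        self.mode = mode

    def __len__(self) -> int:
        return len(self.samples)

    def __getitem__(self, idx: int):
        s = self.samples[idx]
        if self.mode == "semantic":
            return s.points, s.semantic_labels
        # Instance mode also returns semantic labels, since instance
        # segmentation models typically need the class of each instance too.
        return s.points, s.semantic_labels, s.instance_labels
```

With this shape, an instance segmentation loader could later be split out by subclassing and fixing `mode="instance"`, matching the inheritance idea above.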
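The torch-geometric "larger datasets" approach recommended in the thread boils down to a process-once/cache pattern: preprocess each raw scan into its own file the first time, then load the cached files on later runs. The stdlib-only sketch below illustrates that pattern without depending on torch_geometric; the file layout and the line-parsing "preprocessing" are placeholder assumptions, not the real ScanNet pipeline.

```python
import json
import os

class CachedScanDataset:
    """Process-once/cache sketch of the torch-geometric Dataset pattern."""

    def __init__(self, raw_dir: str, processed_dir: str):
        self.raw_dir = raw_dir
        self.processed_dir = processed_dir
        os.makedirs(processed_dir, exist_ok=True)
        # Assume one raw file per scan, named <scan_id>.txt (placeholder).
        self.scan_ids = sorted(
            os.path.splitext(f)[0] for f in os.listdir(raw_dir)
        )
        self._process_missing()

    def _processed_path(self, scan_id: str) -> str:
        return os.path.join(self.processed_dir, scan_id + ".json")

    def _process_missing(self) -> None:
        # Only scans without a cached file are processed, so instantiating
        # the dataset a second time skips the expensive preprocessing step.
        for scan_id in self.scan_ids:
            out = self._processed_path(scan_id)
            if os.path.exists(out):
                continue
            raw = os.path.join(self.raw_dir, scan_id + ".txt")
            with open(raw) as f:
                # Placeholder "preprocessing": parse one point per line.
                points = [[float(v) for v in line.split()] for line in f]
            with open(out, "w") as f:
                json.dump({"points": points}, f)

    def __len__(self) -> int:
        return len(self.scan_ids)

    def __getitem__(self, idx: int) -> dict:
        with open(self._processed_path(self.scan_ids[idx])) as f:
            return json.load(f)
```

In a real implementation, the JSON files would be replaced by serialized torch-geometric `Data` objects and the parsing step by the actual ScanNet mesh/label extraction, but the caching structure stays the same.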