
Fine tuning on a dataset #26

Open
laiba12345 opened this issue Mar 18, 2024 · 7 comments

Comments

@laiba12345

  1. Do we need to set the 'views' folder as the root path in the training config (the views folder contains the rgba and pose folders), or do we need to add the models to the directory as well?
  2. How do we configure the model for fine-tuning?
@ZexinHe
Collaborator

ZexinHe commented Mar 21, 2024

Hi,

I'm not sure what you mean by "add models to directory". To enable training, root_dirs should point to a directory containing multiple folders, e.g. uid1, uid2, uid3, etc., and each uid folder should contain rgba, pose, and intrinsics.npy, as in the layout below (a small layout-check sketch follows the list).

  • root_dir
    • uid1
      • rgba
      • pose
      • intrinsics.npy
    • uid2
      • ....
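
A quick sanity check along these lines (an illustrative sketch, not code from the repository; the paths are placeholders) can confirm that every uid folder matches this layout before training starts:

# Illustrative sketch: verify that each uid folder under root_dir contains
# rgba/, pose/, and intrinsics.npy as described above. Not repository code.
import os
import sys

def find_incomplete_uids(root_dir):
    incomplete = []
    for uid in sorted(os.listdir(root_dir)):
        uid_path = os.path.join(root_dir, uid)
        if not os.path.isdir(uid_path):
            continue
        ok = (
            os.path.isdir(os.path.join(uid_path, "rgba"))
            and os.path.isdir(os.path.join(uid_path, "pose"))
            and os.path.isfile(os.path.join(uid_path, "intrinsics.npy"))
        )
        if not ok:
            incomplete.append(uid)
    return incomplete

if __name__ == "__main__":
    missing = find_incomplete_uids(sys.argv[1])  # pass your root_dir
    print("All uid folders look complete." if not missing else f"Incomplete: {missing}")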

For fine-tuning, please try using this method:

def load_model_(self, cfg):
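
As a rough illustration of how that hook could be used for fine-tuning, the sketch below loads a released checkpoint instead of starting from random weights. The import path, the from_pretrained call, and the config key used here are assumptions for illustration only; check the repository for the actual identifiers.

# Hypothetical sketch: start training from a published OpenLRM checkpoint.
# ModelLRM, its import path, and cfg.model.pretrained are assumed names,
# not confirmed by this thread.
def load_model_(self, cfg):
    from openlrm.models import ModelLRM  # assumed import path

    pretrained = getattr(cfg.model, "pretrained", None)
    if pretrained:
        # e.g. the "openlrm-mix-large-1.1" weights mentioned in the README
        model = ModelLRM.from_pretrained(pretrained)
    else:
        # Fall back to building the model from the architecture config.
        model = ModelLRM(**cfg.model)
    return model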

@hayoung-jeremy

Hi @ZexinHe, thank you for the advice.
I've prepared images through Objaverse Rendering as you can see below:
[screenshot: rendered output folders from Objaverse Rendering]
However, I'm not sure how to prepare the pose files and intrinsics.npy.
I'm very new to AI, so could you please give more details on how to prepare them for training?

Also, I have no idea how to prepare the JSON files for meta_path in train-sample.yaml:
[screenshot: the meta_path section of train-sample.yaml]

It would be great if you could help me.
Thanks in advance :)
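
For later readers, here is a rough sketch of writing such files with NumPy. The exact shapes, conventions, and file names OpenLRM expects (e.g. whether intrinsics are stored as fx, fy, cx, cy and poses as per-view 4x4 camera-to-world matrices) are assumptions here and should be checked against the dataset code and the rendering script:

# Rough, hypothetical sketch of writing pose/ files and intrinsics.npy with NumPy.
# The shapes, conventions, and file names below are assumptions, not confirmed
# by this thread; check openlrm/datasets and the rendering script for the real format.
import os
import numpy as np

uid_dir = "root_dir/uid1"  # placeholder path for one object
os.makedirs(os.path.join(uid_dir, "pose"), exist_ok=True)

# Assumed: one intrinsics.npy per object holding fx, fy, cx, cy.
fx, fy, cx, cy = 1.0, 1.0, 0.5, 0.5  # placeholder values
np.save(os.path.join(uid_dir, "intrinsics.npy"),
        np.array([fx, fy, cx, cy], dtype=np.float32))

# Assumed: one 4x4 camera-to-world matrix per rendered view in pose/.
num_views = 32
for i in range(num_views):
    c2w = np.eye(4, dtype=np.float32)  # replace with the camera pose from the renderer
    np.save(os.path.join(uid_dir, "pose", f"{i:03d}.npy"), c2w)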

@juanfraherrero

I'm not completely sure, but following the code, openlrm/datasets/base.py at line 46 expects a file path to a JSON containing the uids.

So I would try creating a JSON file with ["uid1", "uid2", ...]. #33

And reference that JSON by its path in the config:

meta_path:
  train: "your_path_to_json"

@hayoung-jeremy

Thank you for your kind reply, @juanfraherrero !
I'll try it!

@hayoung-jeremy

Hi @juanfraherrero, is it possible to fine-tune the pretrained LRM models with my small custom dataset?
I'm currently trying to overfit on my 100 pairs of high-quality glb files, but that is training from scratch.
So I wonder if it is possible to use the pretrained models mentioned in the README.md, such as openlrm-mix-large-1.1, etc.
I don't see any guideline for fine-tuning, so I'm asking for your help.
Thank you in advance!

@juanfraherrero

Hi @hayoung-jeremy, I tried to train for some epochs, but I don't have enough VRAM (always CUDA out of memory) to even load the model.
In issue #2 he said he used 32 A100s to train the model for 2 days.
Unless you have a good GPU it will be impossible.

About the guideline, I followed the instructions in the README: first prepare your data, then train. But as I said, I can't load the model, so I don't know if I did it right.

Sorry! Good luck.

@hayoung-jeremy

hayoung-jeremy commented Apr 25, 2024

Hi @juanfraherrero, thank you for your kind reply!

I've tried fine-tuning using the base model provided by OpenLRM.
If you're interested, please take a look at this
