
Instructions for pruning pre-trained "checkpoint" model #18

Open
henrypearce4D opened this issue Feb 12, 2024 · 2 comments
Comments

@henrypearce4D

Hi, thanks for the great code.

Could you please provide clear instructions for pruning a pre-trained INRIA "checkpoint" model (.ply)?

First off, checkpoints and trained models seem to be referenced differently in LightGaussian compared to the INRIA code.
Pre-trained INRIA model .ply files are referred to as a "checkpoint", but that is not what a checkpoint is in the INRIA code.
INRIA checkpoints are saved with --checkpoint_iterations 1000 and produce chkpnt1000.pth.

To run pruning on a pre-trained INRIA "checkpoint" .ply, the instructions for LightGaussian say:

Users can directly prune a trained 3D-GS checkpoint using the following command (default setting):

bash scripts/run_prune_finetune.sh

Should I add the arguments:

-s path-to-model-folder/

(the full path to the trained model folder, e.g. -s datasets/big, which contains /point_cloud/iteration_30000/point_cloud.ply)

And -m path-to-output-folder/, e.g.:

bash scripts/run_prune_finetune.sh -s datasets/big -m datasets/small ?

The scripts run_prune_finetune.sh and run_prune__pt_finetune.sh reference run args for datasets.
In run_prune__pt_finetune.sh a comment says:
# This is an example script to load from ply file.
So should I use this to point directly to the .ply file?

bash scripts/run_prune_pt_finetune.sh -datasets/big --start_pointcloud datasets/big/point_cloud/iteration_30000/point_cloud.ply -m datasets/small

I also tried adding "big" as an argument to the script.

All these tests failed.

Any help is much appreciated!

@anton-brandl

Hi @henrypearce4D !

I struggled with the same problem as you but got it working. Hopefully this helps you to some degree:

Checkpoints and Trained models seem to be referenced differently in LightGaussian compared the INRIA code.
Pre-trained INRIA models .ply are being referred to as a checkpoint, this is not a checkpoint in the INRIA code

No, .ply files are called "pointclouds" and .pth files are called "checkpoints" here. You can start training from either of them; the corresponding arguments are called start_pointcloud and start_checkpoint respectively.
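To make that distinction concrete, here is a hedged sketch of the two invocation styles. The flag names come from the paragraph above, the paths are placeholders, and whether one script accepts both flags may depend on the repo version; the block only assembles and prints the commands rather than running them:

```shell
# .ply files are "pointclouds"; .pth files are "checkpoints".
# Hypothetical invocations, built as strings for inspection only:
FROM_PLY="python prune_finetune.py -s datasets/big -m datasets/small \
--start_pointcloud datasets/big/point_cloud/iteration_30000/point_cloud.ply"

FROM_PTH="python prune_finetune.py -s datasets/big -m datasets/small \
--start_checkpoint chkpnt1000.pth"

echo "$FROM_PLY"
echo "$FROM_PTH"
```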

To run pruning on a pre-trained INRIA "checkpoint" .ply [...]

Don't use the bash scripts. Take them only as an inspiration for a script you could build yourself. I'll make an example based on the inputs you suggested. Let datasets/big be the (colmap) dataset folder and path-to-output-folder/ be the folder where you want to find the pruned checkpoint. Let the input pointcloud be located at datasets/big/point_cloud/iteration_30000/point_cloud.ply. Now, try this command:

python prune_finetune.py -s datasets/big -m path-to-output-folder/ --eval --port 6401 --start_pointcloud datasets/big/point_cloud/iteration_30000/point_cloud.ply --iteration 5000 --test_iterations 5000 --save_iterations 5000 --prune_iterations 2 --prune_percent 0.66 --prune_type v_important_score --prune_decay 1 --position_lr_init 0.000005 --position_lr_max_steps 5000 --v_pow 0.1

I've also included all the default parameters here for better visibility, but it may also work if you only use the args -s, -m, and --start_pointcloud. I hope this helps!
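If you run this often, a small wrapper can make it reusable. This is a sketch only, assuming the prune_finetune.py flags shown in the command above (a subset of them) and the same dataset layout; it merely prints the assembled command so you can inspect it before running it yourself:

```shell
#!/bin/sh
# Sketch: build the pruning command from three inputs.
# Defaults mirror the example paths used earlier in this thread.
DATASET=${1:-datasets/big}
OUTPUT=${2:-datasets/small}
PLY=${3:-$DATASET/point_cloud/iteration_30000/point_cloud.ply}

CMD="python prune_finetune.py -s $DATASET -m $OUTPUT --eval \
--start_pointcloud $PLY \
--prune_iterations 2 --prune_percent 0.66 --prune_type v_important_score --v_pow 0.1"

# Print instead of executing, so the command can be checked first:
echo "$CMD"
```

Invoke it as e.g. `sh prune.sh datasets/big datasets/small` and paste the printed command back into your terminal once it looks right.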

@henrypearce4D
Author

Hi @anton-brandl, thanks, I will give that a try!

This is the section where the example .ply (pointcloud) is referred to as a "chpt" (checkpoint) and a "3D-GS checkpoint", so that was the source of the confusion:
[screenshot of the relevant README section]
