
Having issues running Llama-3-70B #962

Open
BedirT opened this issue May 10, 2024 · 1 comment
Comments

BedirT commented May 10, 2024

Hi,

I am trying to fine-tune Llama-3-70B-Instruct on a SageMaker Notebook with a p4de.24xlarge instance. I am not sure what's wrong, but it seems like the package cannot see the torchtune.models.llama3.lora_llama3_70b module.

The command I am running is:

tune run --nproc_per_node 8 lora_finetune_distributed --config Llama-3-70B-lora.yaml

I installed torchtune using pip; here are my relevant package versions:

torch==2.2.2
torchao==0.1
torchaudio==2.2.2
torchtune==0.1.1
torchvision==0.17.2

Since SageMaker environments are managed by Conda, I am also using a Conda environment in my setup. The error I get is:

AttributeError: module 'torchtune.models.llama3' has no attribute 'lora_llama3_70b'. Did you mean: 'lora_llama3_8b'?
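
For reference, a quick way to see which LoRA builders the installed torchtune build actually exposes (a minimal diagnostic sketch, not part of the original report):

import torchtune.models.llama3 as llama3

# List every LoRA builder exported by the installed build. On the stable 0.1.1
# release the 70B builder is absent, which matches the AttributeError above.
print([name for name in dir(llama3) if name.startswith("lora_")])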
@kartikayk (Contributor) commented:

Unfortunately this feature is not in our stable package release. Can you install the nightly build and see if it fixes the issue?
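
After upgrading, a minimal way to confirm that the installed build includes the 70B LoRA builder (assuming a standard pip install of the nightly; the exact install command is documented in the torchtune README):

from importlib.metadata import version
import torchtune.models.llama3 as llama3

print(version("torchtune"))                  # nightly builds typically carry a dev/date suffix
print(hasattr(llama3, "lora_llama3_70b"))    # True once the 70B builder is included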
