
[FEATURE] Add DoRA: Weight-Decomposed Low-Rank Adaptation #645

Closed
pascal-pfeiffer opened this issue Mar 22, 2024 · 1 comment · Fixed by #709
Comments

@pascal-pfeiffer
Collaborator

pascal-pfeiffer commented Mar 22, 2024

🚀 Feature

Add support for Weight-Decomposed Low-Rank Adaptation (DoRA).
"DoRA decomposes the pre-trained weight into two components, magnitude and direction, for fine-tuning, specifically employing LoRA for directional updates to efficiently minimize the number of trainable parameters. "

Paper: https://arxiv.org/abs/2402.09353
Implementation in PEFT: https://github.com/huggingface/peft/releases/tag/v0.9.0
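
For orientation, here is a minimal PyTorch sketch of the decomposition described above. The class name `DoRALinear`, the rank, and the initialization details are illustrative assumptions, not the paper's or PEFT's actual implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoRALinear(nn.Module):
    """Illustrative DoRA layer: W' = m * (W0 + B @ A) / ||W0 + B @ A||,
    with the norm taken per output row and only m, A, B trainable."""

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        out_features, in_features = base.weight.shape
        # Frozen pre-trained weight W0 (the base of the "direction") and bias.
        self.weight = nn.Parameter(base.weight.detach().clone(), requires_grad=False)
        self.bias = base.bias
        # LoRA factors for the directional update; B starts at zero so the
        # merged weight initially equals W0.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        # Trainable magnitude vector m, initialized to the row norms of W0.
        self.magnitude = nn.Parameter(self.weight.norm(p=2, dim=1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Direction: pre-trained weight plus the low-rank update B @ A.
        v = self.weight + self.lora_B @ self.lora_A      # (out, in)
        # Normalize each row of V to unit length, then rescale by m.
        # (The paper detaches this norm during backprop to save memory;
        # that optimization is omitted here for clarity.)
        v_norm = v.norm(p=2, dim=1, keepdim=True)        # (out, 1)
        w = self.magnitude.unsqueeze(1) * (v / v_norm)   # (out, in)
        return F.linear(x, w, self.bias)
```

If I read the v0.9.0 release notes correctly, PEFT exposes this via a single flag on the existing LoRA config, along the lines of:

```python
from peft import LoraConfig, get_peft_model

config = LoraConfig(r=8, lora_alpha=16, use_dora=True)
model = get_peft_model(base_model, config)  # base_model: any model PEFT supports
```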

Motivation

The authors claim that DoRA is as parameter-efficient as LoRA while achieving better results across all of their reported evaluations.

@pascal-pfeiffer pascal-pfeiffer added the type/feature Feature request label Mar 22, 2024
@tmostak

tmostak commented Apr 28, 2024

I'd like to bump this... it seems DoRA yields accuracy basically on par with full fine-tuning.
