
docs: add DoRA to roadmap #1124

Open

GeorgePearse wants to merge 1 commit into main from feat/roadmap-add-dora

Conversation

@GeorgePearse
Collaborator

Summary

  • Add DoRA (Weight-Decomposed Low-Rank Adaptation) to the roadmap under "Other Planned Integrations"
  • DoRA is an improved version of LoRA that decomposes each weight matrix into a magnitude and a direction component (see the formulation sketched after this list)
  • Published as ICML 2024 Oral (top 1.5% acceptance rate)
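For context, the core reparameterization from the DoRA paper (notation here is mine, not from this PR) keeps the pretrained weight $W_0$ frozen and applies the low-rank update $BA$ only to the direction part:

$$
W' = m \cdot \frac{W_0 + BA}{\lVert W_0 + BA \rVert_c}
$$

where $\lVert \cdot \rVert_c$ is the column-wise norm and $m$ is a learnable magnitude vector initialized to $\lVert W_0 \rVert_c$. After training, $W'$ folds back into a single dense matrix, which is why there is no extra serving cost.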

Why DoRA?

DoRA offers several advantages over LoRA for parameter-efficient fine-tuning:

  • Better performance than LoRA, especially at low ranks (it can often halve the rank needed for comparable quality)
  • No inference overhead
  • Enhanced training stability
  • Already integrated into HuggingFace PEFT (see the usage sketch after this list)
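A minimal sketch of what using DoRA through PEFT looks like today (assuming `peft >= 0.9.0`; the base model name and hyperparameters below are illustrative, not anything this PR prescribes):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Illustrative base model; any causal LM supported by PEFT works the same way.
model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

config = LoraConfig(
    r=8,                                  # DoRA often matches LoRA quality at a lower rank
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # module names vary by architecture
    use_dora=True,                        # switch the adapter from plain LoRA to DoRA
)

model = get_peft_model(model, config)
model.print_trainable_parameters()
```

The only change relative to a plain LoRA setup is the `use_dora=True` flag, which is part of why this looks like a low-friction addition to the roadmap.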

References

  • DoRA: Weight-Decomposed Low-Rank Adaptation (Liu et al., ICML 2024) — https://arxiv.org/abs/2402.09353

🤖 Generated with Claude Code

Add DoRA (Weight-Decomposed Low-Rank Adaptation) to the Other Planned
Integrations section. DoRA improves upon LoRA by decomposing weights
into magnitude and direction components, offering better performance,
especially at lower ranks, with no inference overhead.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
