LoRA support in transformers #96

Open

tanmayshishodia opened this issue Jun 27, 2024 · 1 comment

Comments

@tanmayshishodia

Hi, I went through the praxis.transformers.StackedTransformer layer and I don't see any support for LoRA.

That said, I was wondering whether there is a way to add a set of new LoRA weights to an already existing paxConfig model. An example would be great; the tutorials don't cover a use case where the model layers are updated after the fact.

@tanmayshishodia
Author

#83

Refer to the PR above if you are looking to implement LoRA in PaxML for now.
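
For reference, here is a minimal sketch of the general LoRA technique in plain JAX. This is not the praxis API and not what PR #83 implements; the names (`init_lora_params`, `lora_apply`) and the `rank`/`alpha` defaults are illustrative assumptions. The idea is to keep the pretrained weight W frozen and train only a low-rank pair (A, B), so the adapted layer computes x @ W + (alpha / rank) * x @ A @ B.

```python
import jax
import jax.numpy as jnp

def init_lora_params(key, d_in, d_out, rank=8):
    """Create the low-rank adapter pair (A, B) for a frozen weight W."""
    k_a, _ = jax.random.split(key)
    # A starts as small random values and B as zeros, so the adapted
    # layer is initially identical to the frozen base layer.
    A = jax.random.normal(k_a, (d_in, rank)) * 0.01
    B = jnp.zeros((rank, d_out))
    return {"A": A, "B": B}

def lora_apply(x, W_frozen, lora, alpha=16.0, rank=8):
    """y = x @ W + (alpha / rank) * x @ A @ B.
    W stays frozen; only A and B receive gradients during fine-tuning."""
    scale = alpha / rank
    return x @ W_frozen + scale * (x @ lora["A"]) @ lora["B"]

# Usage: wrap an existing projection without touching its weights.
key = jax.random.PRNGKey(0)
W = jax.random.normal(key, (512, 512))  # pretrained, frozen
lora = init_lora_params(jax.random.PRNGKey(1), 512, 512)
x = jnp.ones((4, 512))
y = lora_apply(x, W, lora)
```

Adding this to an existing model then amounts to registering A and B as new trainable variables alongside each frozen projection and routing only them to the optimizer; how that registration is done is framework-specific, so see the PR above for the PaxML approach.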
