
Is it possible to train a LLaMA 3.1 8B model with LoRA on only 4 local 4090 GPUs? #1896

Open
yaphet266 opened this issue Oct 21, 2024 · 1 comment
@yaphet266

I currently have only 4 RTX 4090 GPUs, each with 24 GB of VRAM, so 96 GB in total. Is it possible to train a LLaMA 3.1 8B model on this setup? Following the official documentation, I have not been able to get training to run. How should I adjust the parameters so that training can run with relatively little VRAM?

@zzjjay
Collaborator

zzjjay commented Nov 28, 2024

Training with LoRA should work on this setup.
See PaddleNLP's training documentation: https://github.com/PaddlePaddle/PaddleNLP/tree/develop/llm#23-lora
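For reference, the memory-saving knobs relevant to a 4×24 GB setup are typically a per-device batch size of 1 with gradient accumulation, bf16 precision, activation recomputation, and sharding the model across the 4 GPUs. A sketch of a LoRA config in the style of PaddleNLP's `llm/config/llama/lora_argument.json` is below; the exact field names and supported values depend on your PaddleNLP version, so treat this as an illustration of which parameters to tune rather than a verified recipe (paths and the dataset location are placeholders):

```json
{
  "model_name_or_path": "meta-llama/Meta-Llama-3.1-8B",
  "dataset_name_or_path": "./data",
  "output_dir": "./checkpoints/llama31_lora",
  "per_device_train_batch_size": 1,
  "gradient_accumulation_steps": 16,
  "num_train_epochs": 3,
  "learning_rate": 3e-4,
  "lora": true,
  "lora_rank": 8,
  "bf16": true,
  "recompute": true,
  "tensor_parallel_degree": 4
}
```

With `tensor_parallel_degree: 4`, the 8B model's weights are split across the four cards instead of being replicated, and LoRA keeps optimizer state only for the small adapter matrices, which is what makes 24 GB per card feasible. The docs linked above show the exact launch command; it is generally along the lines of `python -m paddle.distributed.launch --gpus "0,1,2,3" run_finetune.py <config.json>` (check the linked README for the current script name and flags).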
