
Hardware requirements and Python environment: Qwen2-VL-2B deployment and fine-tuning #375

Open
InTheFuture7 opened this issue Mar 4, 2025 · 2 comments

Comments

@InTheFuture7

@Zeyi-Lin @2710932616 @Joe-2002 Thank you for the excellent tutorial. Regarding the Qwen2-VL-2B deployment and fine-tuning sections, I have two questions about the details:

Hardware compatibility: Is an RTX 3060 GPU sufficient for the following tasks? (The tutorial does not state any specific hardware requirements, hence the question.)

  1. Qwen2-VL-2B-Instruct WebDemo deployment
  2. Qwen2-VL-2B LoRA fine-tuning
  3. Qwen2-VL-2B-Instruct LoRA fine-tuning + SwanLab visualization

Environment management: Do you recommend creating a separate Python virtual environment for each of these three parts? Also, the LoRA fine-tuning section of the tutorial does not specify which Python version to use.
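For context, a minimal sketch of what "one isolated environment per tutorial part" could look like using Python's built-in venv module; the environment names and requirements-*.txt file names below are made up for illustration, and the tutorial may well use conda instead:

```python
# Hypothetical sketch: one isolated environment per tutorial part, built with the
# standard-library venv module. The requirements file names are illustrative only.
import subprocess
import venv
from pathlib import Path

PARTS = ["qwen2vl-webdemo", "qwen2vl-lora", "qwen2vl-lora-swanlab"]

for name in PARTS:
    env_dir = Path("envs") / name
    # Each environment gets its own interpreter and its own site-packages.
    venv.EnvBuilder(with_pip=True).create(env_dir)
    pip = env_dir / "bin" / "pip"  # on Windows: env_dir / "Scripts" / "pip.exe"
    subprocess.run([str(pip), "install", "-r", f"requirements-{name}.txt"], check=True)
```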

@2710932616
Contributor

All of these should be fine, as long as you don't set the max token length too large.

See here: https://huggingface.co/spaces/Vokturz/can-it-run-llm
[screenshot]
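On the memory side of this answer, here is a minimal sketch (assuming the standard transformers Qwen2-VL classes; this is not code from the tutorial) of loading the 2B Instruct model in bfloat16 and capping the generation length, which is the knob the comment above refers to:

```python
# Minimal sketch, not from the tutorial: load Qwen2-VL-2B-Instruct in bfloat16
# and keep max_new_tokens small so the KV cache stays within ~12 GB of VRAM.
import torch
from transformers import AutoProcessor, Qwen2VLForConditionalGeneration

model_id = "Qwen/Qwen2-VL-2B-Instruct"

# bfloat16 weights take roughly 2 bytes per parameter (~4 GB for a 2B model).
model = Qwen2VLForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

# Text-only prompt for brevity; image inputs go through the same processor.
messages = [{"role": "user", "content": [{"type": "text", "text": "Introduce yourself briefly."}]}]
text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = processor(text=[text], return_tensors="pt").to(model.device)

# A small max_new_tokens bounds KV-cache growth, which is what the
# "max token length" advice above is about.
output_ids = model.generate(**inputs, max_new_tokens=128)
answer = processor.batch_decode(
    output_ids[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True
)[0]
print(answer)
```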

@InTheFuture7
Author

Got it, thanks @2710932616
