@Zeyi-Lin @2710932616 @Joe-2002 Thanks for the excellent tutorial. I have two questions about the Qwen2-VL-2B deployment and fine-tuning sections:
Hardware compatibility: Is an RTX 3060 sufficient for these tasks? (I ask because the tutorial does not state specific hardware requirements.)
Environment management: Do you recommend creating a separate Python virtual environment for each of the three sections? Also, the LoRA fine-tuning section does not specify which Python version to use.
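For context, the kind of per-section isolation I have in mind looks like this (just a sketch; the environment names are made up, and the Python version is an assumption since the tutorial does not pin one for the LoRA part):

```shell
# One isolated venv per tutorial section (names are illustrative).
# Python version is an assumption; the tutorial does not pin one for LoRA.
python3 -m venv qwen2vl-deploy
python3 -m venv qwen2vl-lora

# Activate the environment for the section you are working on,
# then install that section's dependencies into it.
source qwen2vl-lora/bin/activate
pip install --upgrade pip
```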
An RTX 3060 should be sufficient for all of them, as long as you don't set the max token length too large.
参考这里: https://huggingface.co/spaces/Vokturz/can-it-run-llm
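As a rough sanity check of why a 3060 should fit (a back-of-the-envelope sketch, not a measurement; the parameter count is approximate and the KV cache and activations add overhead on top, which is why max token length matters):

```python
# Rough VRAM estimate for Qwen2-VL-2B in half precision (illustrative only).
# fp16/bf16 stores 2 bytes per parameter; KV cache grows with max token length.
params = 2.2e9          # ~2.2B parameters (approximate)
bytes_per_param = 2     # fp16 / bf16
weight_gb = params * bytes_per_param / 1024**3
print(f"weights alone: ~{weight_gb:.1f} GB")  # well under a 12 GB RTX 3060
```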
Got it, thanks @2710932616