

Update utils.py to support ollama GPU usage #185

Open · wants to merge 1 commit into base: main

Conversation

scott------ (Contributor)

When using the code prior to this patch, ollama only uses the CPU. With this patch, ollama will use all of the available GPU and then fall back to the CPU (RAM).
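The intent described above can be sketched as follows. This is a hypothetical illustration, not code from this repository: the helper name `build_ollama_kwargs` and the `num_gpu=-1` ("offload as many layers as fit") semantics are assumptions about the ollama options API and should be checked against your ollama version.

```python
# Hypothetical sketch of the idea behind this patch: pass an explicit
# num_gpu option when constructing the Ollama model in utils.py, so
# layers are offloaded to the GPU first, with CPU (RAM) as the fallback.
def build_ollama_kwargs(model_name: str, num_gpu: int = -1) -> dict:
    """Return keyword arguments for an Ollama chat model.

    num_gpu=-1 asks the backend to place as many layers as fit on the
    GPU and to run the remainder on the CPU (assumed semantics).
    """
    return {"model": model_name, "num_gpu": num_gpu}

# Example: request full GPU offload for a local model.
kwargs = build_ollama_kwargs("qwen2.5:7b")
print(kwargs)
```

A fixed `num_gpu` like this applies per model instance, which is why a reviewer below asks how the change behaves on machines with multiple GPUs.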

@CLAassistant commented Jan 28, 2025

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
If you have already signed the CLA but the status is still pending, ask us to recheck it.

@vvincent1234 (Contributor)

Hi, I don't have this problem on my machine. I'm not sure whether this change will affect users with multiple GPUs. Can you share some relevant links showing that it is useful?

@WKeaganW left a comment

Fixes the forced CPU offloading when using ollama.


4 participants