Inference consumes all RAM but GPU RAM is not used #242
Unanswered
valentinitnelav
asked this question in Q&A
Replies: 0 comments
This is what I have tried: I monitored GPU usage with `watch -d -n 0.5 nvidia-smi`, and it looks like the GPU's RAM is not actually being used. The GPU memory usage stays constant at 1490 MiB, while the 64 GB of OS RAM is consumed within a few minutes (as shown by `htop`).

Output of `htop`: (screenshot not included)

Could you help me understand where the bug is?
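One way to narrow this down from inside the script (rather than watching `htop`) is to log the process's resident memory between inference batches: if the Python process itself grows without bound while `nvidia-smi` stays flat, the data is accumulating on the CPU, not the GPU. This is a hedged, stdlib-only sketch (the actual inference code and framework for this repo are not shown here; the `bytearray` allocation below just simulates retained per-batch output):

```python
import resource
import sys

def rss_mib() -> float:
    """Return this process's peak resident set size in MiB.

    ru_maxrss is reported in KiB on Linux but in bytes on macOS.
    """
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    if sys.platform == "darwin":
        peak /= 1024
    return peak / 1024

# Log memory around the suspect loop; here we simulate a leak by
# retaining ~50 MiB of per-batch results in a Python list.
before = rss_mib()
retained_outputs = [bytearray(10 * 1024 * 1024) for _ in range(5)]
after = rss_mib()
print(f"RSS grew by {after - before:.0f} MiB")
```

Calling `rss_mib()` before and after each batch of the real inference loop would show whether host memory grows per batch; a common cause in GPU frameworks is appending full per-batch result tensors (or their computation graphs) to a CPU-side list instead of freeing or downsizing them.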