
Why doesn't llama-server distribute layers evenly across multiple GPUs of the same size? #12752

Closed · Unanswered
segmond asked this question in Q&A
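
For context: as far as I can tell, when no explicit split is given, llama.cpp weights the layer split by each device's available VRAM rather than splitting evenly, so a GPU that is also driving a display can end up with fewer layers. A minimal sketch of forcing an even split, assuming two same-size GPUs and llama-server's documented --split-mode and --tensor-split flags (the model path is a placeholder):

```
# Offload all layers (--n-gpu-layers 99), split whole layers across
# devices (--split-mode layer), and request equal proportions for
# GPU 0 and GPU 1 (--tensor-split 1,1) instead of the default
# memory-weighted split. /path/to/model.gguf is a placeholder path.
llama-server -m /path/to/model.gguf \
  --n-gpu-layers 99 \
  --split-mode layer \
  --tensor-split 1,1
```

The --tensor-split values are relative proportions, so 1,1 and 50,50 behave the same; with the CUDA backend, CUDA_VISIBLE_DEVICES can additionally restrict or reorder which GPUs llama-server sees.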

Replies: 1 comment with 3 replies (@devnen, @segmond, @devnen)
Category: Q&A · Labels: none yet · 2 participants