MPI support #543
Unanswered
SecretiveShell asked this question in Q&A
Does llama-cpp-python support distributed inference via MPI?

llama.cpp supports MPI for running inference across multiple nodes, which makes it possible to run larger models than a single machine can hold. I was hoping to use this feature from within llama-cpp-python.
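To make the request concrete, here is a rough sketch of what I was hoping would be possible. This is purely illustrative: it assumes llama-cpp-python were built against an MPI-enabled llama.cpp (which is exactly what I'm asking about), uses mpi4py only to check the local rank, and the model path and launch command are placeholders.

```python
# Hypothetical sketch -- assumes llama-cpp-python linked against an
# MPI-enabled build of llama.cpp, which is the capability being asked about.
from mpi4py import MPI          # pip install mpi4py
from llama_cpp import Llama

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each MPI rank runs the same script; the idea would be that llama.cpp's MPI
# mode splits the model layers across the participating nodes rather than
# having every node do the full computation.
llm = Llama(model_path="./models/7B/ggml-model-q4_0.gguf")  # placeholder path

if rank == 0:
    # Rank 0 would drive prompting/sampling while the other ranks serve
    # their share of the layers.
    out = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
    print(out["choices"][0]["text"])
```

This would presumably be launched with something like `mpirun -hostfile hostfile -n 3 python infer.py`, similar to how llama.cpp's own MPI example is run.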