Replies: 2 comments
- 👋 Welcome! Thanks for opening your first issue. If you'd like to take a crack at fixing it, feel free to open a pull request — otherwise, we'll take a look as soon as we can!
- Yes, you can connect to your existing Ollama server: https://github.com/NanoNets/docext/blob/main/EXT_README.md#models-with-ollama-linux-and-macos
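To verify that an existing Ollama server is reachable before pointing docext at it, you can query Ollama's `/api/tags` endpoint, which lists the locally available models. This is a minimal sketch assuming Ollama's default address of `http://localhost:11434`; the helper names here are illustrative, not part of docext.

```python
import json
import urllib.request


def ollama_endpoint(base_url="http://localhost:11434", path="/api/tags"):
    """Build the URL for an Ollama API call (11434 is Ollama's default port)."""
    return base_url.rstrip("/") + path


def list_local_models(base_url="http://localhost:11434"):
    """Return the model names served by a running Ollama instance.

    Raises URLError if no server is listening at base_url.
    """
    with urllib.request.urlopen(ollama_endpoint(base_url)) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


if __name__ == "__main__":
    # Prints something like ['llama3:latest', ...] if Ollama is running locally.
    print(list_local_models())
```

If the call succeeds, docext should be able to reach the same server by pointing it at that base URL.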
- Does Ollama support local deployment?