
Conversation

@allozaur (Owner)

No description provided.

@allozaur (Owner, Author) left a comment

A few things to address, plus:

  • Make sure we still have proper llama-swap compatibility
  • Unify imports from $lib/utils and $lib/types for cleaner code
  • Add proper documentation of the WebUI architecture and functionality to address ggml-org#16256
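As an illustrative sketch of the second point (the file paths and symbol names below are assumptions, not the actual WebUI module layout), unifying scattered deep imports into one grouped import per alias could look like:

```typescript
// Before: repetitive deep imports (hypothetical paths)
// import { formatSize } from '$lib/utils/format';
// import { parseArgs } from '$lib/utils/args';
// import type { ChatMessage } from '$lib/types/chat';
// import type { ModelInfo } from '$lib/types/model';

// After: a single import per alias, assuming $lib/utils and
// $lib/types re-export their members from an index barrel
import { formatSize, parseArgs } from '$lib/utils';
import type { ChatMessage, ModelInfo } from '$lib/types';
```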

@allozaur (Owner, Author)

Move to the updated README in the tools/server docs

@ggerganov

I observe the following issue:

  • Start a server with: ./bin/llama-server --port 8033
  • Load gpt-oss-20b from the model selector
  • Start a new chat and input something (e.g. "test")
  • After the model responds, stop the model by clicking the red power-off button in the model selector
  • Now load it again and try to enter a second message
  • I get this error:
(screenshot of the error)

I think there is a bug in the command construction: I can see that the process is started, but it has two --port arguments:

ps ax | grep llama-server

92161 s000  S+     0:01.60 ./bin/llama-server --port 62516 -hf ggml-org/gpt-oss-20b-GGUF --alias ggml-org/gpt-oss-20b-GGUF --port 62495
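One plausible shape of such a bug is that the stored launch command already contains a --port flag and the launcher appends a fresh one on relaunch. A minimal sketch of a guard against that (the `dedupeArgs` function and its flag handling are assumptions for illustration, not the actual llama.cpp WebUI code) could deduplicate valued flags before spawning:

```typescript
// Hypothetical sketch: keep only the last occurrence of each valued
// flag (e.g. --port), so relaunching a model never accumulates
// duplicate flags in the spawned llama-server command line.
function dedupeArgs(args: string[], valuedFlags: Set<string>): string[] {
  const kept: string[] = [];                    // all non-deduplicated args
  const lastValue = new Map<string, string>();  // flag -> last seen value
  for (let i = 0; i < args.length; i++) {
    const arg = args[i];
    if (valuedFlags.has(arg) && i + 1 < args.length) {
      lastValue.set(arg, args[++i]);            // last occurrence wins
    } else {
      kept.push(arg);
    }
  }
  for (const [flag, value] of lastValue) kept.push(flag, value);
  return kept;
}
```

With the command line from the `ps` output above, this would collapse the two --port flags into a single one carrying the most recent port.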

@ngxson

ngxson commented Nov 30, 2025

@ggerganov It should be fixed in my last commit from ggml-org#17470

@allozaur we probably no longer need a separate PR, because I regularly git merge your branch into mine. Or maybe you can just push directly to my PR: ggml-org#17470

@allozaur (Owner, Author)

> @ggerganov It should be fixed in my last commit from ggml-org#17470
>
> @allozaur probably we no longer have to publish a PR anymore, because I regularly git merge your branch into mine. Or maybe you can just push directly to my PR: ggml-org#17470

Yeah, this morning I saw you had merged my changes. I mainly wanted to keep this as a separate PR while working out the architecture, but from now on I'll be fine with just pushing to your branch.

@allozaur allozaur closed this Nov 30, 2025

4 participants