Commit 95af700 (1 parent: 4d9e757)

Add EOL message to the server (#2068)

Add deprecation notice about vllm-fork EOL to api_server.
Signed-off-by: Paweł Olejniczak <[email protected]>

File tree: 1 file changed, +11 / -0 lines


vllm/entrypoints/openai/api_server.py

Lines changed: 11 additions & 0 deletions

@@ -110,6 +110,17 @@
 # Cannot use __name__ (https://github.com/vllm-project/vllm/pull/4765)
 logger = init_logger('vllm.entrypoints.openai.api_server')
 
+# Deprecation notice
+logger.warning(
+    "Starting from v1.23.0, the vLLM fork will reach end-of-life (EOL) and be "
+    "deprecated in v1.24.0, remaining functional only for legacy use cases "
+    "until then. \nAt the same time, the vllm-gaudi plugin will be "
+    "production-ready in v1.23.0 and will become the default by v1.24.0.\n"
+    "This plugin integrates Intel Gaudi with vLLM for optimized LLM inference "
+    "and is intended for future deployments.\nWe strongly suggest preparing a "
+    "migration path toward the plugin version: https://github.com/vllm-project/vllm-gaudi"
+)
+
 _running_tasks: set[asyncio.Task] = set()
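The added notice relies on two standard Python idioms: adjacent string literals are concatenated at compile time into one message, and a module-level `logger.warning(...)` fires once, the first time the entrypoint module is imported. A minimal standalone sketch of that pattern (the `EOL_NOTICE` name is hypothetical, introduced here for illustration; the commit passes the literals directly to `logger.warning`):

```python
import logging

# Stand-in for vLLM's init_logger(); same logger name as in the diff.
logger = logging.getLogger("vllm.entrypoints.openai.api_server")

# Adjacent string literals are joined at compile time, so this is a single
# string with three embedded newlines, logged as one WARNING record.
EOL_NOTICE = (
    "Starting from v1.23.0, the vLLM fork will reach end-of-life (EOL) and be "
    "deprecated in v1.24.0, remaining functional only for legacy use cases "
    "until then. \nAt the same time, the vllm-gaudi plugin will be "
    "production-ready in v1.23.0 and will become the default by v1.24.0.\n"
    "This plugin integrates Intel Gaudi with vLLM for optimized LLM inference "
    "and is intended for future deployments.\nWe strongly suggest preparing a "
    "migration path toward the plugin version: https://github.com/vllm-project/vllm-gaudi"
)

# Runs at import time of this module, so the notice is shown exactly once
# per process, before the API server starts serving.
logger.warning(EOL_NOTICE)
```

Because the call sits at module level rather than inside a request handler, operators see the notice once at startup instead of on every request.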