
bug: nemoguardrails server api streaming is not working #893

@whisper-bye

Description

Did you check docs and existing issues?

  • I have read all the NeMo-Guardrails docs
  • I have updated the package to the latest version before submitting this issue
  • (optional) I have used the develop branch
  • I have searched the existing issues of NeMo-Guardrails

Python version (python --version)

3.11

Operating system/version

15

NeMo-Guardrails version (if you must use a specific version and not the latest)

0.11.0

Describe the bug

I started the server with the following command, but streaming mode is not working:

python -m nemoguardrails server --config=./config --default-config-id=config

config.yml

models:
  - type: main
    engine: openai
    model: gpt-4o-mini
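
For reference, the NeMo-Guardrails streaming guide enables streaming through a top-level streaming flag in config.yml. I am not sure whether the server API also requires it, but a sketch of the config with that flag would look like this (not verified against 0.11.0):

streaming: True

models:
  - type: main
    engine: openai
    model: gpt-4o-mini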

Steps To Reproduce

My test code:

from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()

client = OpenAI(
    base_url="http://localhost:8000/v1"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=True,
    extra_body={
        "config_id": "config"
    }
)

for chunk in response:
    print(chunk.choices[0].delta.content)
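
To check whether the server emits any SSE chunks at all, the endpoint can also be hit directly with requests. This is a sketch that assumes the server exposes POST /v1/chat/completions, which is the path the OpenAI client above calls under the hood:

import requests

# Call the chat completions endpoint directly and dump the raw
# server-sent-event lines, to see whether any chunks arrive at all.
resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "gpt-4o-mini",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant"},
            {"role": "user", "content": "Hello"},
        ],
        "stream": True,
        "config_id": "config",
    },
    stream=True,
)

for line in resp.iter_lines():
    if line:
        print(line.decode("utf-8"))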

Expected Behavior

The stream chunks are printed as they arrive.

Actual Behavior

Nothing is printed.
