Parent
Part of #542 — Multi-provider LLM support
Depends on
#543 (OpenAI adapter), #544 (factory wiring)
What
`codeframe/core/adapters/streaming_chat.py` is currently hardwired to `anthropic.AsyncAnthropic().messages.stream()`. This is used by the `ClaudeCodeAdapter` and interactive session flows. Abstract it behind `LLMProvider`.
Current problem
`streaming_chat.py` directly uses:

- `anthropic.AsyncAnthropic().messages.stream()`
- Anthropic-specific `tool_call` chunking format
- Anthropic-specific `extended_thinking` blocks
Target state
- `StreamingChatAdapter` takes an `LLMProvider` (or an `async_stream()` callable)
- Provider-specific streaming details stay in each provider implementation
- Callers of `StreamingChatAdapter` are unaffected
Files to modify
- `codeframe/core/adapters/streaming_chat.py`
- `codeframe/adapters/llm/base.py` — ensure `stream()` / `async_stream()` is in the ABC
- `codeframe/adapters/llm/anthropic.py` — move streaming logic here from `streaming_chat.py`
- `codeframe/adapters/llm/openai.py` — implement streaming
- `tests/core/adapters/test_streaming_chat.py`
Notes
- Extended thinking (`anthropic.ExtendedThinking`) is Anthropic-only — wrap it behind a provider capability check (`provider.supports("extended_thinking")`) rather than removing it. If the provider doesn't support it, silently skip.
- Streaming chunk formats differ between providers — normalize to a common `StreamChunk` datatype in `base.py`.
- If streaming is complex to generalize, it's acceptable to keep a provider-specific path with clear extension points rather than forcing a premature abstraction.
Acceptance criteria
- `streaming_chat.py` has no `import anthropic`