[4.D.4] Refactor worker_agent.py to use LLMProvider abstraction #546

@frankbria

Description

Parent

Part of #542 — Multi-provider LLM support

Depends on

#543 (OpenAI adapter), #544 (factory wiring)

What

worker_agent.py currently instantiates AsyncAnthropic() directly, so it can only ever talk to Anthropic. Replace the hardcoded client with the LLMProvider abstraction so any configured provider can be used.

Current problem

# codeframe/agents/worker_agent.py — today
from anthropic import AsyncAnthropic, AuthenticationError, RateLimitError, APIConnectionError

class WorkerAgent:
    async def execute_task(self, task, model_name="claude-sonnet-4-5"):
        client = AsyncAnthropic(api_key=self.api_key)
        response = await client.messages.create(...)  # Anthropic-only

Target state

# after refactor
from codeframe.adapters.llm.base import LLMProvider, Purpose
from codeframe.adapters.llm import get_provider

class WorkerAgent:
    def __init__(self, ..., llm_provider: LLMProvider | None = None):
        self.llm_provider = llm_provider or get_provider()

    async def execute_task(self, task, ...):
        response = await self.llm_provider.async_complete(...)

Files to modify

  • codeframe/agents/worker_agent.py — replace AsyncAnthropic with LLMProvider
  • codeframe/adapters/llm/base.py — add async_complete() / async_stream() to LLMProvider ABC if not present
  • codeframe/adapters/llm/anthropic.py — implement async_complete() wrapping existing async path
  • codeframe/adapters/llm/openai.py — implement async_complete()
  • tests/integration/test_worker_agent_integration.py — update fixtures

Notes

  • Keep the string-valued provider constructor parameter ("anthropic", "openai") for backwards compatibility
  • Error translation: map provider-specific errors (e.g. openai.RateLimitError) to a common LLMRateLimitError exception defined in base.py, so callers don't have to handle per-provider exception types
  • Move the SUPPORTED_MODELS list and MODEL_PRICING dict into the provider implementations, since each provider knows its own models and prices
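One way the error-translation note could play out is a small decorator in base.py that re-raises SDK errors as the common hierarchy. Everything here beyond the LLMRateLimitError name is an assumption; FakeOpenAIRateLimitError stands in for openai.RateLimitError so the sketch has no SDK dependency:

```python
from typing import Awaitable, Callable, Type


# Common hierarchy (LLMRateLimitError named in the issue; the rest assumed).
class LLMError(Exception):
    """Base for all provider-agnostic LLM errors."""

class LLMRateLimitError(LLMError):
    """Raised when any provider reports a rate limit."""


class FakeOpenAIRateLimitError(Exception):
    """Stand-in for openai.RateLimitError, to keep this sketch self-contained."""


# Each adapter would declare how its SDK errors map onto the common hierarchy.
ERROR_MAP: dict[Type[Exception], Type[LLMError]] = {
    FakeOpenAIRateLimitError: LLMRateLimitError,
}


def translate_errors(func: Callable[..., Awaitable]):
    """Decorator re-raising provider-specific errors as common LLM errors."""
    async def wrapper(*args, **kwargs):
        try:
            return await func(*args, **kwargs)
        except tuple(ERROR_MAP) as exc:
            raise ERROR_MAP[type(exc)](str(exc)) from exc
    return wrapper


@translate_errors
async def flaky_complete(prompt: str) -> str:
    # Simulate the OpenAI SDK raising its own rate-limit error.
    raise FakeOpenAIRateLimitError("429: slow down")
```

Callers then catch LLMRateLimitError once, regardless of which adapter is active.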

Acceptance criteria

  • worker_agent.py no longer imports anthropic anywhere
  • WorkerAgent(llm_provider=OpenAIProvider(...)) works in tests
  • Existing integration tests still pass with Anthropic provider
  • New integration test passes with OpenAI provider (or mocked OpenAI)
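The injection criterion can be exercised without any real SDK by passing a mocked provider through the new constructor parameter. The WorkerAgent below is a minimal stand-in mirroring the target shape, not the real class:

```python
import asyncio
from unittest.mock import AsyncMock


class WorkerAgent:
    """Minimal stand-in mirroring the target constructor shape."""

    def __init__(self, llm_provider=None):
        # In the real code, get_provider() would supply the default.
        self.llm_provider = llm_provider

    async def execute_task(self, task: str) -> str:
        return await self.llm_provider.async_complete(task)


# Inject a mocked provider, as the mocked-OpenAI acceptance test could.
mock_provider = AsyncMock()
mock_provider.async_complete.return_value = "patched"

agent = WorkerAgent(llm_provider=mock_provider)
result = asyncio.run(agent.execute_task("refactor module"))
```

The same fixture pattern covers the existing integration tests: swap the mock for the real Anthropic adapter and the agent code path is unchanged.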

Metadata


Labels

  • enhancement (New feature or request)
  • phase-4 (Phase 4: Multi-Agent Coordination)
  • refactor (Issues specifically associated with the refactor)
