AI provider integration uses raw fetch() — no native SDK support #7

@morozow

Description

Context

The original CompanionAgent example communicates with OpenAI via raw fetch() calls to https://api.openai.com/v1/chat/completions. There are no provider implementations for Anthropic or Google Gemini.
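A minimal sketch of the raw-fetch pattern being described (the helper names, model, and request shape here are illustrative, not copied from the repo). Splitting out the request builder makes it obvious how much plumbing is hand-rolled:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hand-built request: headers, auth, and body serialization are all manual.
function buildChatRequest(apiKey: string, messages: ChatMessage[]) {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      // Model name is a placeholder for illustration.
      body: JSON.stringify({ model: "gpt-4o-mini", messages }),
    },
  };
}

async function chat(apiKey: string, messages: ChatMessage[]): Promise<string> {
  const { url, init } = buildChatRequest(apiKey, messages);
  const res = await fetch(url, init);
  if (!res.ok) {
    // Generic failure: no distinction between auth, rate limit, or transport.
    throw new Error(`OpenAI request failed: ${res.status}`);
  }
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Everything a native SDK would own (retries, typed errors, response parsing) lives in user code here.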

Problem

  1. Manual HTTP handling. Error parsing, response normalization, header management, and retry logic are all hand-written. Native SDKs handle this automatically.
  2. No Anthropic or Gemini support. Users who want Claude or Gemini must write their own integration from scratch, including format conversion (Anthropic's top-level system parameter, Gemini's Content parts format).
  3. Inconsistent error handling. Raw fetch errors are generic — there is no distinction between auth failures, rate limits, network errors, and timeouts. MCP clients receive opaque error messages.
  4. No token usage reporting. The raw fetch approach does not extract or normalize token usage from provider responses.
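The format-conversion burden in point 2 can be sketched as follows. Anthropic expects system text as a top-level `system` parameter rather than a `system` message role, so an OpenAI-style conversation has to be rewritten (the helper name below is hypothetical):

```typescript
interface OpenAIMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface AnthropicRequest {
  system?: string;
  messages: { role: "user" | "assistant"; content: string }[];
}

// Hoist system messages out of the conversation array into the
// top-level `system` field that the Anthropic Messages API expects.
function toAnthropicRequest(messages: OpenAIMessage[]): AnthropicRequest {
  const systemParts = messages
    .filter((m) => m.role === "system")
    .map((m) => m.content);
  const rest = messages
    .filter((m) => m.role !== "system")
    .map((m) => ({ role: m.role as "user" | "assistant", content: m.content }));
  return {
    ...(systemParts.length ? { system: systemParts.join("\n") } : {}),
    messages: rest,
  };
}
```

Gemini's `Content`/`parts` structure needs a similar (and differently shaped) conversion, which is exactly the kind of per-provider glue users currently have to write themselves.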

Expected behavior

  • Built-in providers for OpenAI, Anthropic, and Google Gemini using their official native SDKs.
  • Normalized response format (text, stopReason, usage) regardless of which provider generated it.
  • Typed error classification (AUTH, UPSTREAM, TRANSPORT, TIMEOUT) for all providers.
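One possible shape for the normalized response and typed errors described above; field and enum names mirror the bullets, but the exact API is open for discussion, and the status-code mapping is only a plausible default:

```typescript
type StopReason = "stop" | "length" | "tool_use" | "other";

// Provider-agnostic response: the same shape regardless of which
// provider (OpenAI, Anthropic, Gemini) produced the completion.
interface NormalizedResponse {
  text: string;
  stopReason: StopReason;
  usage: { inputTokens: number; outputTokens: number };
}

type ErrorKind = "AUTH" | "UPSTREAM" | "TRANSPORT" | "TIMEOUT";

class ProviderError extends Error {
  constructor(public kind: ErrorKind, message: string) {
    super(message);
  }
}

// Illustrative HTTP-status classifier; providers that signal errors
// differently would need their own mapping on top of this.
function classifyHttpStatus(status: number): ErrorKind {
  if (status === 401 || status === 403) return "AUTH";
  if (status === 408) return "TIMEOUT";
  if (status === 429 || status >= 500) return "UPSTREAM";
  return "TRANSPORT";
}
```

With this in place, an MCP client can branch on `error.kind` instead of parsing opaque message strings.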
