feat(openai): support citations in chat/completions streaming#907

Open
aimbit-ni wants to merge 4 commits into prism-php:main from aimbit-ni:feat/chat-completions-citations

Conversation

@aimbit-ni

Summary

  • Adds ChatCompletionsCitationsMapper to map top-level citations and search_results fields from chat/completions responses into Prism's existing Citation infrastructure
  • Updates the ChatCompletions stream handler to extract citations once per stream and pass them through on StreamEndEvent
  • Providers without search capabilities (standard OpenAI, etc.) are unaffected — citations remains null

This uses Prism's existing citation infrastructure (Citation, MessagePartWithCitations, StreamState.addCitation(), StreamEndEvent.citations) — no changes needed to core classes.
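The mapping logic can be sketched roughly as follows. This is a hypothetical illustration based on the description above, not Prism's actual `ChatCompletionsCitationsMapper`; the `Citation` class here is a minimal stand-in, and the shape of `search_results` entries is an assumption.

```php
<?php

// Minimal stand-in for Prism's Citation value object (illustrative only).
final class Citation
{
    public function __construct(
        public readonly string $url,
        public readonly ?string $title = null,
    ) {}
}

/**
 * Map top-level `citations` (plain URLs) and `search_results`
 * (url/title objects) from a chat/completions response body into
 * Citation objects. Hypothetical sketch of the mapper's core logic.
 *
 * @return Citation[]|null
 */
function mapCitations(array $response): ?array
{
    $urls = $response['citations'] ?? [];
    $results = $response['search_results'] ?? [];

    $citations = array_map(
        fn (string $url) => new Citation($url),
        $urls,
    );

    foreach ($results as $result) {
        $citations[] = new Citation($result['url'], $result['title'] ?? null);
    }

    // Standard OpenAI responses carry neither field: return null so
    // the event's citations stay null for providers without search.
    return $citations === [] ? null : $citations;
}
```

Returning `null` rather than an empty array keeps the behavior for non-search providers unchanged.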

Fixes #906
Depends on #902

Test plan

  • 3 unit tests for ChatCompletionsCitationsMapper (with/without search results, empty array)
  • 2 integration tests for stream handler (with citations, without citations)
  • Full test suite passes (1366 tests, 0 failures)
  • PHPStan level 8 clean
  • Pint + Rector formatted

StreamEndEvent.usage can be null when providers don't include usage
data in their final stream chunk, causing a TypeError downstream.
Add `?? new Usage(0, 0)` fallback to emitStreamEndEvent() in all
providers missing it, matching the existing pattern in the OpenAI
stream handler.

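The fallback amounts to coalescing a possibly-null usage value before it reaches a non-nullable consumer. A sketch with a stand-in `Usage` class (the real constructor signature and event wiring live in each provider's stream handler):

```php
<?php

// Stand-in for Prism's Usage value object: prompt and completion tokens.
final class Usage
{
    public function __construct(
        public readonly int $promptTokens,
        public readonly int $completionTokens,
    ) {}
}

/**
 * Some providers omit usage data in their final stream chunk, so
 * $usage may be null here. Coalesce to a zero-token Usage instead of
 * letting null cause a TypeError downstream. Hypothetical helper name.
 */
function resolveStreamUsage(?Usage $usage): Usage
{
    return $usage ?? new Usage(0, 0);
}
```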
Add an `api_format` config option to the OpenAI driver that allows
switching from the default `/responses` endpoint to `/chat/completions`.
This enables using Prism with OpenAI-compatible backends like vLLM,
LiteLLM, and LocalAI that only implement the chat/completions API.

Set `OPENAI_API_FORMAT=chat_completions` in your env to use it. Only
text, structured, and stream methods dispatch conditionally — other
modalities (embeddings, images, moderation, TTS, STT) already use
standard endpoints that work with compatible backends as-is.
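Assuming a Laravel-style provider config, the switch might be wired up like this; every key except `api_format` and `OPENAI_API_FORMAT` is illustrative, not taken from Prism's actual config file:

```php
// config/prism.php (illustrative sketch)
'providers' => [
    'openai' => [
        'api_key' => env('OPENAI_API_KEY'),
        // 'responses' (default) or 'chat_completions' for
        // OpenAI-compatible backends such as vLLM, LiteLLM, or LocalAI.
        'api_format' => env('OPENAI_API_FORMAT', 'responses'),
    ],
],
```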

Providers that reject unknown parameters (e.g. Perplexity via LiteLLM)
return HTTP 400 when `"tools": []` is sent. Return null instead so
Arr::whereNotNull() filters it out entirely.

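In effect, the payload builder returns `null` rather than an empty array, so the key is dropped before serialization. A dependency-free sketch: `Arr::whereNotNull()` behaves like `array_filter()` with a not-null callback (keys are preserved), and the function name here is hypothetical:

```php
<?php

/**
 * Hypothetical payload assembly. Using null (not []) when there are
 * no tools means the key is filtered out entirely, so strict
 * providers (e.g. Perplexity via LiteLLM) never see "tools": [] and
 * never return HTTP 400 for it.
 */
function buildRequestPayload(array $tools): array
{
    return array_filter([
        'model' => 'gpt-4o',
        'tools' => $tools === [] ? null : $tools,
    ], fn ($value) => $value !== null);
}
```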
Providers with integrated search capabilities (e.g. Perplexity,
You.com) return top-level `citations` and `search_results` fields
in chat/completions responses. These were previously ignored.

Add ChatCompletionsCitationsMapper to map these into Prism's existing
Citation infrastructure, and extract them once per stream in the
ChatCompletions stream handler. Citations are passed through on the
StreamEndEvent, matching the existing pattern used by the Anthropic
handler.
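The once-per-stream extraction might look roughly like this. Stand-in types throughout: the real handler operates on Prism's `StreamState` and event classes, and the assumption that the top-level `citations` field can be captured from the first chunk that carries it is based on the description above:

```php
<?php

// Stand-in for Prism's StreamEndEvent with its optional citations field.
final class StreamEndEvent
{
    public function __construct(public readonly ?array $citations) {}
}

/**
 * Hypothetical stream loop: capture top-level `citations` the first
 * time they appear, then pass them through once on the StreamEndEvent.
 *
 * @param iterable<array> $chunks decoded chat/completions chunks
 */
function handleStream(iterable $chunks): StreamEndEvent
{
    $citations = null;

    foreach ($chunks as $chunk) {
        if ($citations === null && isset($chunk['citations'])) {
            $citations = $chunk['citations'];
        }
    }

    // Providers without search never set the field: citations stays null.
    return new StreamEndEvent($citations);
}
```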


Development

Successfully merging this pull request may close these issues.

chat/completions: citations from search-capable providers are lost
