
fix: support Amp handoff and Codex gpt-5.5 #3

Open
Srini-B wants to merge 5 commits into nghyane:main from Srini-B:fix/codex-handoff-buffer-output

Conversation

@Srini-B (Contributor) commented Apr 27, 2026

Summary

Fixes Amp /handoff and newer Deep Mode compatibility for both local Codex and local Claude routing.

Codex handoff responses can arrive through the internal buffered SSE path without a final message output item, which makes Amp throw `Expected message output but none found`. Claude smart-mode handoff forces a tool call, and Anthropic rejects or fails on that shape when extended thinking is also present. Newer Codex models such as gpt-5.5 also gate access on the advertised Codex CLI version, so the connector now reports the installed Codex CLI identity instead of a stale hard-coded one.
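The forced-tool incompatibility can be sketched roughly like this. The request shape follows the Anthropic Messages API, but the function name and exact types are illustrative, not the PR's code:

```typescript
// Sketch only: minimal subset of an Anthropic Messages request.
type ToolChoice = { type: "auto" | "any" | "tool"; name?: string };
type Thinking = { type: "enabled"; budget_tokens: number };

interface MessagesRequest {
  model: string;
  thinking?: Thinking;
  tool_choice?: ToolChoice;
}

// Anthropic rejects extended thinking when a tool call is forced
// (tool_choice.type of "tool" or "any"), so strip `thinking` for those
// modes only and leave it intact for compatible modes such as "auto".
function prepareAnthropicRequest(req: MessagesRequest): MessagesRequest {
  const forced =
    req.tool_choice?.type === "tool" || req.tool_choice?.type === "any";
  if (!forced || !req.thinking) return req;
  const { thinking: _omitted, ...rest } = req;
  return rest;
}
```

A handoff request that forces `create_handoff_context` would pass through this with `thinking` removed, while a plain smart-mode request with `tool_choice: { type: "auto" }` keeps its thinking budget.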

What changed

  • Reconstruct buffered Codex handoff output from streamed response events before returning non-streaming Responses JSON to Amp.
  • Backfill missing Codex `response.completed.response.output` message content from output item/content/output text events.
  • Compact synthesized Codex message content arrays so skipped content indexes cannot serialize as sparse-array `null` entries.
  • Strip Codex request fields unsupported by the local Codex backend, including `stream_options`.
  • Detect the installed `codex --version`, use it in the Codex CLI-style `User-Agent`, and stop sending the stale legacy `Version` header.
  • Strip Anthropic `thinking` when `tool_choice.type` is `tool` or `any`, while preserving it for compatible modes such as `auto`.
  • Add regression coverage for Codex handoff reconstruction, sparse content-index compaction, Codex version identity, and Anthropic forced-tool request preparation.
  • Document the Codex deep-mode and Anthropic smart-mode compatibility behavior.
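As an illustration of the sparse-array issue: placing content parts by their content index can leave holes in a JavaScript array, and `JSON.stringify` serializes holes as literal `null` entries. A minimal sketch (the `ContentPart` shape is an assumption, not the connector's actual type):

```typescript
// Sketch: parts arrive keyed by a content index, so writing
// parts[event.content_index] = part can leave holes when indexes are skipped.
type ContentPart = { type: "output_text"; text: string };

function compactContent(parts: ContentPart[]): ContentPart[] {
  // Array.prototype.filter skips holes entirely, dropping the would-be nulls.
  return parts.filter((p): p is ContentPart => p != null);
}

const parts: ContentPart[] = [];
parts[2] = { type: "output_text", text: "handoff payload" }; // indexes 0-1 skipped

console.log(JSON.stringify(parts));                 // sparse: [null,null,{...}]
console.log(JSON.stringify(compactContent(parts))); // compact: [{...}]
```

Without the compaction step, the `null` entries would reach Amp inside the synthesized message content and break its output parsing.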

Validation

  • `bun run format`
  • `bun run check` — 68 pass, 0 fail
  • Manually verified Amp /handoff works through:
    • LOCAL_CODEX deep-mode route
    • LOCAL_CLAUDE smart-mode route with create_handoff_context returning a tool_use
  • Manually verified Amp Deep Mode works with openai:gpt-5.5 after advertising the installed Codex CLI version.
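The version-identity change can be sketched as follows. The `codex --version` output format ("codex-cli 0.46.0"-style) and the fallback value are assumptions here, not the PR's code:

```typescript
import { execFileSync } from "node:child_process";

// Extract a semver-like version from the CLI's version output,
// e.g. "codex-cli 0.46.0" -> "0.46.0".
function parseCodexVersion(output: string, fallback = "0.0.0"): string {
  const match = output.trim().match(/(\d+\.\d+\.\d+\S*)$/);
  return match ? match[1] : fallback;
}

// Ask the installed CLI for its version instead of hard-coding one,
// falling back if `codex` is not on PATH.
function detectCodexVersion(fallback = "0.0.0"): string {
  try {
    const out = execFileSync("codex", ["--version"], { encoding: "utf8" });
    return parseCodexVersion(out, fallback);
  } catch {
    return fallback; // codex not installed or not executable
  }
}
```

The detected version can then be embedded in the Codex CLI-style `User-Agent` the connector sends, which is what gates access to newer models such as gpt-5.5.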

@Srini-B Srini-B changed the title fix(codex): reconstruct buffered handoff output fix: support Amp handoff for Codex and Claude Apr 27, 2026

@chatgpt-codex-connector (Bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: d2c93a9264

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment thread on src/providers/forward.ts
@charles-chenzz commented

[screenshot: unsupported stream option error]

The unsupported stream option still appears. Does it require a new release or something? Sorry for the ping @Srini-B

@Srini-B (Contributor, Author) commented Apr 28, 2026

Yes, @nghyane needs to make a release; for now you can run it directly from my fork until it's done. @charles-chenzz

@charles-chenzz commented

> Yes, @nghyane needs to make a release; for now you can run it directly from my fork until it's done. @charles-chenzz

I will check how to run your fork; I'm not familiar with the Bun ecosystem.

@Srini-B (Contributor, Author) commented Apr 28, 2026

Clone my fork, switch to the fix/codex-handoff-buffer-output branch, run `bun i`, and then `bun run dev`.

@Srini-B Srini-B changed the title fix: support Amp handoff for Codex and Claude fix: support Amp handoff and Codex gpt-5.5 Apr 29, 2026
@charles-chenzz commented

> Clone my fork, switch to the fix/codex-handoff-buffer-output branch, run `bun i`, and then `bun run dev`.

@Srini-B if I don't have Google account OAuth, it seems like I can't use handoff.

[screenshots: handoff error without Google OAuth]

@Srini-B (Contributor, Author) commented Apr 30, 2026

I have mine linked, so I haven't seen this issue, but some of this behavior is core to how the Amp CLI works. There is no way around it without degrading its usefulness unless all three accounts are linked. Don't you have a Google account that would use your free quota from Antigravity? @charles-chenzz

@charles-chenzz commented

> I have mine linked, so I haven't seen this issue, but some of this behavior is core to how the Amp CLI works. There is no way around it without degrading its usefulness unless all three accounts are linked. Don't you have a Google account that would use your free quota from Antigravity? @charles-chenzz

I have a Google account that I could link, but I've confirmed that Antigravity aggressively bans accounts that OAuth for use elsewhere (friends of mine got about 3 accounts banned). I only have one Google account and don't want to take any risk, so I haven't linked it for the moment.

@Srini-B (Contributor, Author) commented May 1, 2026

That's a valid point. From what I have seen, the account gets flagged depending on usage. Since what we use here from the CLI is the Google models and not the Claude models, I think the risk is low. Another way to work around the missing Gemini models for the handoff command: create a new thread and ask it to refer to your previous thread.

@charles-chenzz commented

@Srini-B sorry for the ping. Amp officially uses gpt-5.5 as the default Deep Mode model; does the branch in your fork already support this? If not, any plans to support it?

@Srini-B (Contributor, Author) commented May 5, 2026

> Amp officially uses gpt-5.5 as the default Deep Mode model; does the branch in your fork already support this? If not, any plans to support it?

Yes, it does if you're on the latest commit on my fork.

@Srini-B (Contributor, Author) commented May 7, 2026

Things will not be the same if https://ampcode.com/news/neo is enabled for your account; it will use Amp credits. You can go back to the old way with `amp --take-me-back`.
