fix: support Amp handoff and Codex gpt-5.5 #3
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: d2c93a9264
The unsupported stream option still appears; does it require a new release or something? Sorry for the ping @Srini-B
Yes, @nghyane needs to make a release. For now you can run it directly from my fork till it's done @charles-chenzz
I will check how to run your fork; I'm not familiar with the Bun ecosystem
clone my fork, switch to
@Srini-B if I don't have Google account OAuth, it seems like I can't use handoff
I have mine linked to it, so I haven't seen this issue, but some of the workings are core to how the Amp CLI works. There is no way of getting around it without degrading its usefulness unless we have all three accounts linked. Don't you have a Google account that will use your free quota from Antigravity? @charles-chenzz
I have a Google account that I could link, but I confirmed that Antigravity aggressively bans accounts that OAuth for use in other places (my friends got banned on about 3 accounts). I only have one Google account and I don't want to take any risk, so I haven't linked it at the moment
That's a valid point. Depending on usage, the account gets flagged from what I have seen. Since what we are going to use here from the CLI is the Google models and not the Claude models, I think the risk is low. Another way to overcome the lack of Gemini models for handoff commands: you can create a new thread and ask it to refer to your previous thread.
@Srini-B sorry for the ping. Amp officially uses gpt-5.5 as the default Deep Mode model; does the branch in your fork already support this? If not, are there any plans to support it?
Yes it does if you're on the latest commit on my fork |
Things will not be the same if https://ampcode.com/news/neo is enabled for your account. It will use Amp credits. You can go back to the old way by using



Summary
Fixes Amp `/handoff` and newer Deep Mode compatibility for both local Codex and local Claude routing.

Codex handoff responses can arrive through the internal buffered SSE path without a final `message` output item, which makes Amp throw `Expected message output but none found`. Claude smart-mode handoff forces a tool call, and Anthropic rejects or fails that shape when extended `thinking` is also present. Newer Codex models such as `gpt-5.5` also gate access on the advertised Codex CLI version, so the connector now reports the installed Codex CLI identity instead of a stale hard-coded one.
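As a rough sketch of what the buffered-SSE repair could look like (the `OutputItem` type and `ensureMessageOutput` helper below are assumed names for illustration, not the connector's actual code): if the final `response.completed` payload carries no `message` item, one can be synthesized from the text accumulated off the streaming deltas so Amp's check passes.

```ts
// Illustrative sketch only: OutputItem and ensureMessageOutput are assumed
// names, not the connector's real types or helpers.
type OutputItem =
  | { type: "message"; role: "assistant"; content: { type: "output_text"; text: string }[] }
  | { type: string; [key: string]: unknown };

function ensureMessageOutput(output: OutputItem[], bufferedText: string): OutputItem[] {
  // Amp throws "Expected message output but none found" when
  // response.completed.response.output has no `message` item.
  if (output.some((item) => item.type === "message") || !bufferedText) {
    return output;
  }
  // Synthesize a message item from the text buffered during streaming.
  return [
    ...output,
    {
      type: "message",
      role: "assistant",
      content: [{ type: "output_text", text: bufferedText }],
    },
  ];
}
```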
What changed

- Rebuild the `response.completed.response.output` `message` content from output item / content / output text events.
- Compact `content` arrays so skipped content indexes cannot serialize as sparse-array `null` entries.
- Stop sending unsupported `stream_options`.
- Detect the installed Codex CLI version via `codex --version`, use it in the Codex CLI-style `User-Agent`, and stop sending the stale legacy `Version` header.
- Drop `thinking` when `tool_choice.type` is `tool` or `any`, while preserving it for compatible modes such as `auto` (sketched below).
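The last item above is the Anthropic-compatibility piece. A minimal sketch of that request shaping, assuming a hypothetical `AnthropicRequest` shape rather than the connector's real types:

```ts
// Illustrative sketch only: AnthropicRequest is an assumed shape, not the
// connector's real request type.
interface AnthropicRequest {
  tool_choice?: { type: "auto" | "any" | "tool"; name?: string };
  thinking?: { type: "enabled"; budget_tokens: number };
  [key: string]: unknown;
}

function stripIncompatibleThinking(req: AnthropicRequest): AnthropicRequest {
  // Anthropic rejects extended thinking when the tool call is forced
  // (tool_choice.type of "tool" or "any"); compatible modes like "auto" stay untouched.
  const forcesTool = req.tool_choice?.type === "tool" || req.tool_choice?.type === "any";
  if (!forcesTool || req.thinking === undefined) {
    return req;
  }
  const { thinking: _dropped, ...rest } = req;
  return rest;
}
```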
Validation

- `bun run format`
- `bun run check`: 68 pass, 0 fail
- `/handoff` works through:
  - the `LOCAL_CODEX` deep-mode route
  - the `LOCAL_CLAUDE` smart-mode route with `create_handoff_context` returning a `tool_use`
- `openai:gpt-5.5` works after advertising the installed Codex CLI version (see the sketch below).
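The last validation point relies on the CLI-identity detection described under "What changed". A sketch of how such detection could work, assuming `codex --version` is on the PATH and that a `codex_cli_rs/<version>` User-Agent shape is what the upstream expects (both are assumptions, not confirmed from this PR):

```ts
// Illustrative sketch only: the "codex_cli_rs/<version>" User-Agent shape and
// this parsing are assumptions, not taken from the PR diff.
import { execSync } from "node:child_process";

function detectCodexCliUserAgent(): string | undefined {
  try {
    // `codex --version` prints something like "codex-cli 0.45.0";
    // keep the trailing version-looking token.
    const raw = execSync("codex --version", { encoding: "utf8" }).trim();
    const version = raw.split(/\s+/).pop();
    return version ? `codex_cli_rs/${version}` : undefined;
  } catch {
    // Codex CLI not installed: let the caller omit the header or fall back.
    return undefined;
  }
}
```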