Is there an existing issue for the same bug?
Describe the bug and reproduction steps
CLI v1.14.0 fails immediately when using Anthropic models via LiteLLM proxy. The error indicates both temperature and top_p are being sent, which Anthropic does not allow.
Steps to reproduce:
- Install CLI v1.14.0: `uvx openhands@1.14.0`
- Configure a model via LiteLLM proxy (e.g., `litellm_proxy/prod/claude-opus-4-5-20251101`)
- Start a conversation and send any message
- Error occurs immediately
Error message:

```
Conversation Error
Code: LLMBadRequestError
Detail:
litellm.BadRequestError: AnthropicException - {
  "type": "error",
  "error": {
    "type": "invalid_request_error",
    "message": "`temperature` and `top_p` cannot both be specified for this model. Please use only one."
  }
}
```
Workaround: Delete `~/.openhands/agent_settings.json` and re-run `openhands login` to regenerate the file with current SDK defaults.
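For reference, the stale file contains persisted sampling parameters roughly like the following (the surrounding structure and field names are illustrative, not the exact schema):

```json
{
  "llm": {
    "model": "litellm_proxy/prod/claude-opus-4-5-20251101",
    "temperature": 0.0,
    "top_p": 1.0
  }
}
```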
Root Cause
Two contributing factors have been identified:
- Stale `agent_settings.json`: Users who configured the CLI with older SDK versions (pre-v1.12.0) have `temperature: 0.0` and `top_p: 1.0` persisted in their settings. These were SDK defaults that were serialized when the agent config was saved.
- LiteLLM model recognition: For proxy models like `litellm_proxy/prod/claude-opus-4-5-20251101`, the SDK strips the `litellm_proxy/` prefix, but LiteLLM does not recognize `prod/claude-opus-4-5-20251101` as a valid model path. This causes `supports_reasoning_effort` to return `False`, so `temperature`/`top_p` are not stripped from the request.
See upstream issue for full investigation: OpenHands/software-agent-sdk#2686
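The interaction of the two factors can be sketched as follows. This is illustrative pseudocode of the failure mode, not the actual SDK implementation; the function name and model set here are invented for the example:

```python
# Hypothetical sketch of the failure mode: persisted sampling parameters
# survive into the request whenever the proxy-prefixed model path is not
# recognized, so neither temperature nor top_p is stripped.

def build_request_params(settings: dict, model: str, known_models: set) -> dict:
    """Merge persisted settings into request params, dropping top_p only
    when the model is recognized as one that rejects both parameters."""
    params = dict(settings)
    # The SDK strips the proxy prefix before looking the model up.
    bare = model.removeprefix("litellm_proxy/")
    if bare in known_models:
        # Recognized model: send at most one of temperature/top_p.
        params.pop("top_p", None)
    return params

# Stale settings persisted by a pre-v1.12.0 SDK:
stale = {"temperature": 0.0, "top_p": 1.0}

# "prod/claude-opus-4-5-20251101" is not in the model map, so nothing is
# stripped and both parameters reach Anthropic, triggering the 400 error:
params = build_request_params(
    stale,
    "litellm_proxy/prod/claude-opus-4-5-20251101",
    known_models={"claude-opus-4-5-20251101"},
)
assert "temperature" in params and "top_p" in params
```

With a recognized model path (no intermediate `prod/` segment), the same sketch drops `top_p` and the request would succeed.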
OpenHands Installation
CLI (via uvx)
OpenHands Version
1.14.0 (regression from 1.13.x)
Model Name
litellm_proxy/prod/claude-opus-4-5-20251101 (Anthropic Claude Opus 4.5 via LiteLLM proxy)
Operating System
Linux
This issue was created by an AI assistant (OpenHands) on behalf of the user.