Describe the bug
I'm using "openai.gpt-oss-120b-1:0" with converse_stream. The toolUse input arguments cannot be parsed when a toolUse is triggered.
To Reproduce
Trigger any toolUse that has input arguments.
Issue:
Stop reason: tool_use
[DEBUG TOOL INPUT RAW]: {
":822customer_identity",
Name Doe
[ERROR] JSON decode failed
Expected behavior
The stream should return valid JSON for the toolUse input arguments, e.g.
{"name": "john"}
Additional context
My Code:
import boto3
from botocore.config import Config

MODEL_ID = "openai.gpt-oss-120b-1:0"

bedrock_config = Config(
    region_name=AWS_REGION,
    max_pool_connections=100,
    retries={"max_attempts": 3, "mode": "standard"},
    connect_timeout=3,
    read_timeout=60,
    tcp_keepalive=True,
)

client = boto3.client(
    "bedrock-runtime",
    region_name=AWS_REGION,
    config=bedrock_config,
    aws_access_key_id=AWS_ACCESS_KEY,
    aws_secret_access_key=AWS_SECRET_KEY,
)
class AwsBedRockLLM:
    @classmethod
    def make_stream(
        cls,
        messages,
        model,
        system_message,
        tool_list=None,
        instructions=None,
        agent_id=None,
        call_id=None,
    ):
        system_prompt = instructions if instructions else system_message
        payload = {
            "modelId": model,
            "system": [{"text": system_prompt}] if system_prompt else [],
            "messages": cls.convert_messages_for_bedrock(messages),
            "inferenceConfig": {
                "temperature": 0.5,
                "maxTokens": 800,
                "topP": 0.9,
            },
            "performanceConfig": {"latency": "standard"},
        }
        # documented here: https://docs.aws.amazon.com/bedrock/latest/userguide/inference-parameters.html
        if tool_list:
            payload["toolConfig"] = {
                "tools": tool_list,
                "toolChoice": {"auto": {}},
            }
        response = client.converse_stream(**payload)
        return response
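For context, here is a minimal sketch of how my consumer accumulates the streamed toolUse input before parsing — per the Converse stream event shapes, the input JSON arrives split across contentBlockDelta events and must be concatenated per content block before json.loads. The helper name collect_tool_input and the sample events below are fabricated for illustration; with this model the accumulated string is not valid JSON, which is the bug.

```python
import json

def collect_tool_input(events):
    """Concatenate toolUse input deltas per content block, then parse as JSON."""
    buffers = {}  # contentBlockIndex -> accumulated input string
    for event in events:
        if "contentBlockStart" in event:
            start = event["contentBlockStart"]
            # A toolUse block opens: begin an empty buffer for its index.
            if "toolUse" in start.get("start", {}):
                buffers[start["contentBlockIndex"]] = ""
        elif "contentBlockDelta" in event:
            delta = event["contentBlockDelta"]
            tool = delta.get("delta", {}).get("toolUse")
            if tool is not None:
                idx = delta["contentBlockIndex"]
                buffers[idx] = buffers.get(idx, "") + tool.get("input", "")
    # Once the stream ends, each buffer should be one complete JSON document.
    return {idx: json.loads(raw) for idx, raw in buffers.items()}

# Simulated events: the tool input JSON arrives split across two deltas.
events = [
    {"contentBlockStart": {"contentBlockIndex": 1,
                           "start": {"toolUse": {"toolUseId": "t1",
                                                 "name": "customer_identity"}}}},
    {"contentBlockDelta": {"contentBlockIndex": 1,
                           "delta": {"toolUse": {"input": '{"na'}}}},
    {"contentBlockDelta": {"contentBlockIndex": 1,
                           "delta": {"toolUse": {"input": 'me": "john"}'}}}},
]

print(collect_tool_input(events))  # expected: {1: {'name': 'john'}}
```

With other models this accumulation yields valid JSON; with "openai.gpt-oss-120b-1:0" the raw concatenated input looks like the garbled output shown above and json.loads raises.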