I don’t have the ability to file issues directly on your behalf, but I can give you a clean, ready-to-submit report that will get traction with the OpenAI SDK team.
Here’s a concise bug report you can paste into GitHub:
⸻
📌 Title
Java SDK: prompt_cache_retention uses "in-memory" instead of "in_memory" (API mismatch)
⸻
🐛 Description
The latest openai-java SDK exposes prompt_cache_retention with enum value:
"in-memory"
However, the OpenAI API expects:
"in_memory"
This causes a mismatch between SDK serialization and API contract.
⸻
✅ Expected behavior
SDK should send:
"prompt_cache_retention": "in_memory"
⸻
❌ Actual behavior
SDK sends:
"prompt_cache_retention": "in-memory"
Depending on server-side validation, this value may:
- Be rejected by the API, or
- Be ignored (falling back to default behavior)
⸻
🔁 Reproduction
// Requires openai-java; imports assume the SDK's standard package layout.
import com.openai.models.ChatModel;
import com.openai.models.responses.ResponseCreateParams;

ResponseCreateParams params = ResponseCreateParams.builder()
    .model(ChatModel.GPT_5_2)
    .input("Test prompt")
    .promptCacheRetention("in-memory") // "in-memory" is the only value the SDK accepts
    .build();
⸻
🔎 Impact
- Prevents correct usage of extended prompt caching (in_memory vs 24h)
- Forces users to bypass SDK or use raw HTTP
- Breaks alignment with documented API behavior
⸻
💡 Suggested fix
- Update enum/string value to "in_memory"
- OR map "in-memory" → "in_memory" during serialization
- Ensure consistency across all SDKs
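The second option above (mapping the hyphenated value during serialization) could be sketched as a small normalization step. This is a hypothetical helper for illustration, not actual SDK code; in a real fix the logic would live in the enum's serialized value:

```java
public class PromptCacheRetentionFix {
    /**
     * Hypothetical shim: rewrites the SDK's hyphenated value to the
     * snake_case spelling the API documents. Any other value (e.g. "24h")
     * passes through unchanged.
     */
    static String normalize(String value) {
        return "in-memory".equals(value) ? "in_memory" : value;
    }

    public static void main(String[] args) {
        System.out.println(normalize("in-memory")); // in_memory
        System.out.println(normalize("24h"));       // 24h
    }
}
```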
⸻
📚 References
- API docs: prompt caching uses "in_memory"
- SDK currently enforces "in-memory"
⸻
🛠 Workarounds
- Use raw HTTP request
- Inject raw string if SDK allows
- Use providerData / passthrough fields
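For the raw-HTTP workaround, a minimal sketch using `java.net.http` follows. The request body is built by hand so the snake_case value reaches the API unaltered; the `"gpt-5.2"` model string is an assumption (whatever `ChatModel.GPT_5_2` maps to), and the endpoint path follows the public Responses API docs:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class RawPromptCacheRequest {
    // Hand-built JSON body; field names follow the public API reference.
    // The model string "gpt-5.2" is assumed, adjust to your target model.
    static String buildBody(String input) {
        return """
            {
              "model": "gpt-5.2",
              "input": "%s",
              "prompt_cache_retention": "in_memory"
            }""".formatted(input);
    }

    public static void main(String[] args) {
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://api.openai.com/v1/responses"))
            .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(buildBody("Test prompt")))
            .build();
        // Sending is omitted here; to actually call the API, use
        // HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
    }
}
```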
⸻
Where to submit
Go here:
👉 https://github.com/openai/openai-java/issues
Click “New Issue” and paste the above.
⸻
If you want, I can tighten this further (e.g., include SDK version, exact class path, or failing HTTP trace) to increase the chance it gets fixed quickly.