Project
term-challenge
Description
The SDK's OpenAI Responses API path (_chat_openai_responses) always includes the temperature parameter in the request payload, even for GPT-5 reasoning models that don't support it. While the standard chat path correctly makes temperature optional, the Responses API path unconditionally includes it.
GPT-5 reasoning models like gpt-5.2-codex return an error:
Unsupported parameter: 'temperature' is not supported with this model.
The bug is in sdk/python/term_sdk/llm.py lines 637-643:
```python
# Build payload
payload: Dict[str, Any] = {
    "model": model,
    "input": input_items,
    "temperature": temperature,  # ← Always sent, even for reasoning models!
    "max_output_tokens": max_tokens,
    "store": False,
}
```
Error Message
Unsupported parameter: 'temperature' is not supported with this model.
Debug Logs
System Information
Version: 0.1.0
## Operating System
OS: Ubuntu 24.04 LTS
Kernel: 6.8.0-31-generic
Arch: x86_64
## Hardware
CPU: AMD EPYC Genoa Processor
RAM: 15 GB
## Build Environment
Rust: rustc 1.92.0 (ded5c06cf 2025-12-08)
Target: x86_64
Screenshots
No response
Steps to Reproduce
- Use the SDK with the OpenAI provider and a GPT-5.x reasoning model:

```python
from term_sdk import LLM

llm = LLM(provider="openai")
response = llm.ask("Hello", model="gpt-5.2-codex")
```

- The SDK sends temperature to the Responses API
- OpenAI returns the "Unsupported parameter" error
Expected Behavior
- Temperature should only be sent if explicitly set by the user
- For models that don't support temperature, it should be omitted from the payload
- Same behavior as the standard chat path (lines 1153-1155)
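If `None` could ever be a legitimate user-supplied value, a private sentinel object is a common Python idiom for distinguishing "not provided" from an explicit `None` (this is a generic sketch, not necessarily how the standard chat path does it):

```python
_UNSET = object()  # sentinel: distinguishes "not provided" from an explicit None

def maybe_add(payload: dict, key: str, value) -> None:
    """Add key to payload only when the caller explicitly provided a value."""
    if value is not _UNSET:
        payload[key] = value

payload: dict = {"model": "gpt-5.2-codex"}
maybe_add(payload, "temperature", _UNSET)      # never set by the user -> omitted
maybe_add(payload, "max_output_tokens", 1024)  # explicitly set -> included
```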
Actual Behavior
- Temperature is always included in Responses API payloads
- GPT-5 reasoning models reject the request
Additional Context
No response