fix(responses-api): apply drop_params for GPT-5 temperature validation #16545
Title
fix(responses-api): apply drop_params for GPT-5 temperature validation
Relevant issues
Fixes #16090
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement - see details)
- `make test-unit` passes

Type
🐛 Bug Fix
Changes
Problem
The Responses API (
/v1/responses) was not applying model-specific parameter validation. Whiletemperatureis listed in supported params, GPT-5 has a specific constraint (only acceptstemperature=1).The
drop_paramssetting was not being respected because the code wasn't checking model-specific restrictions.Solution
OpenAIResponsesAPIConfig.map_openai_params()to apply model-specific validation for GPT-5OpenAIGPT5Configvalidation logic from chat completions endpointdrop_params=Truecorrectly dropstemperature != 1for GPT-5/v1/chat/completionsand/v1/responsesBreaking Changes
None
Tests Added ✅
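Since the PR body above doesn't include the diff itself, here is a minimal sketch of the `drop_params` behavior it describes. `map_openai_params` and `is_gpt5` below are simplified stand-ins written for illustration, not LiteLLM's actual implementation:

```python
# Minimal sketch of the drop_params behavior described in this PR.
# These names are illustrative stand-ins, not LiteLLM's real API.

def is_gpt5(model: str) -> bool:
    # Assumption: GPT-5 models are identified by a "gpt-5" name prefix.
    return model.startswith("gpt-5")

def map_openai_params(params: dict, model: str, drop_params: bool) -> dict:
    """Validate params for the target model, dropping or rejecting unsupported values."""
    mapped = {}
    for key, value in params.items():
        # GPT-5 only accepts temperature=1; any other value is invalid.
        if key == "temperature" and is_gpt5(model) and value != 1:
            if drop_params:
                continue  # drop the unsupported param instead of erroring
            raise ValueError(f"{model} only supports temperature=1, got {value}")
        mapped[key] = value
    return mapped
```

With `drop_params=True`, a call like `map_openai_params({"temperature": 0.7, "top_p": 0.9}, "gpt-5", drop_params=True)` returns `{"top_p": 0.9}`; with `drop_params=False` the same call raises `ValueError`. This mirrors the consistent behavior the PR describes across both endpoints.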