8 changes: 8 additions & 0 deletions docs/my-website/docs/completion/drop_params.md
@@ -5,6 +5,14 @@ import TabItem from '@theme/TabItem';

Drop OpenAI params that aren't supported by your LLM provider.

## Default Behavior

**By default, LiteLLM raises an exception** if you send a parameter to a model that doesn't support it.

For example, if you send `temperature=0.2` to a model that doesn't support the `temperature` parameter, LiteLLM will raise an exception.

**When `drop_params=True` is set**, LiteLLM will drop the unsupported parameter instead of raising an exception. This allows your code to work seamlessly across different providers without having to customize parameters for each one.
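The difference between the two behaviors can be sketched in plain Python. This is an illustrative model of the logic, not LiteLLM's actual internals; the `SUPPORTED` set and `prepare_request` helper are hypothetical names invented for this sketch.

```python
# Illustrative sketch only -- not litellm's real implementation.
# SUPPORTED is a hypothetical per-provider allowlist of OpenAI params.
SUPPORTED = {"messages", "max_tokens"}

def prepare_request(params: dict, drop_params: bool = False) -> dict:
    """Mimic the raise-vs-drop behavior described above."""
    unsupported = set(params) - SUPPORTED
    if unsupported and not drop_params:
        # default behavior: unsupported params raise an exception
        raise ValueError(f"Unsupported params: {sorted(unsupported)}")
    # drop_params=True: silently strip what the provider can't accept
    return {k: v for k, v in params.items() if k in SUPPORTED}
```

With `drop_params=False`, passing `temperature` raises; with `drop_params=True`, the same call succeeds and the parameter is simply removed from the outgoing request.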

## Quick Start

```python
import litellm

litellm.drop_params = True  # drop unsupported params instead of raising

response = litellm.completion(
    model="command-r",  # example model; use any provider's model
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    temperature=0.2,  # dropped for providers that don't support it
)
```