
Conversation

@emerzon (Contributor) commented Jul 25, 2025

Enable metadata parameter for OpenAI API calls

Fixes an issue where metadata was not being sent to the OpenAI API despite being passed to litellm.completion(). The metadata field is now properly supported for OpenAI and tracked in logging integrations such as Langfuse.

Relevant issues

Fixes #12997

Type

🐛 Bug Fix

Changes

  • Add metadata to OPENAI_CHAT_COMPLETION_PARAMS and DEFAULT_CHAT_COMPLETION_PARAM_VALUES
  • Add metadata to OpenAIGPTConfig.get_supported_openai_params()
  • Add metadata parameter to get_optional_params() function signature
  • Pass metadata through to optional_params in completion flow
  • Remove preview features gate for metadata support
  • Ensure metadata is available in logging for all integrations

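The allow-list change described above can be illustrated with a simplified sketch. OPENAI_CHAT_COMPLETION_PARAMS and get_optional_params() are names taken from the PR description; the body below is a toy illustration of the filtering idea, not litellm's actual implementation:

```python
# Simplified sketch of an allow-list filter for provider params.
# "metadata" is the entry this PR adds to the supported-params list.
OPENAI_CHAT_COMPLETION_PARAMS = [
    "temperature",
    "max_tokens",
    "metadata",  # newly supported by this PR
]

def get_optional_params(**kwargs):
    """Keep only params the provider supports, dropping None values."""
    return {
        k: v
        for k, v in kwargs.items()
        if k in OPENAI_CHAT_COMPLETION_PARAMS and v is not None
    }
```

With this filter in place, a metadata kwarg survives into the upstream request instead of being silently dropped, while unknown kwargs are still filtered out.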

@krrishdholakia (Contributor) commented:

Hey @emerzon this was intentional - as the metadata param is also used by litellm for internal logic. We are trying to migrate away from this to litellm_metadata. Do you have a use-case today blocked because of this?
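The split described above can be sketched for readers: litellm_metadata would stay internal to litellm, while metadata would be forwarded to the provider. This is a hypothetical illustration of the naming migration, not litellm's actual code:

```python
def split_metadata(kwargs):
    """Hypothetical sketch: separate litellm-internal metadata from
    provider-bound metadata. Only the two key names come from the
    discussion above; everything else is illustrative."""
    internal = kwargs.pop("litellm_metadata", None) or {}   # stays in litellm
    upstream = kwargs.pop("metadata", None) or {}           # sent to the provider
    return internal, upstream
```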

@emerzon (Contributor, Author) commented Jul 28, 2025

Hi @krrishdholakia, yes, we're trying to set up session tracking in Langfuse, and it requires this feature: https://langfuse.com/docs/observability/features/sessions
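For context, Langfuse's sessions view groups traces by a session_id carried in each trace's metadata, which is why forwarding metadata matters here. A toy sketch of that grouping idea (not Langfuse's implementation; record shapes are assumptions):

```python
from collections import defaultdict

def group_traces_by_session(traces):
    """Toy sketch: bucket trace records by the session_id found in their
    metadata, the way a sessions view would group related LLM calls."""
    sessions = defaultdict(list)
    for trace in traces:
        sid = (trace.get("metadata") or {}).get("session_id", "no-session")
        sessions[sid].append(trace["name"])
    return dict(sessions)
```

If litellm drops the metadata before the request is logged, every trace falls into the "no-session" bucket and the feature is unusable.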

@delacroixe commented:

We are trying to send metadata to Langfuse to add more information to our traces; this change would help us achieve that.

@hnt2601 commented Sep 30, 2025

> Hey @emerzon this was intentional - as the metadata param is also used by litellm for internal logic. We are trying to migrate away from this to litellm_metadata. Do you have a use-case today blocked because of this?

Same here.

@pbebbo commented Oct 9, 2025

Hey @emerzon, do you know when this bug will be fixed?

@chrisk314 commented:

Hi, what's the status of this issue? We execute workflows consisting of hundreds of LLM requests in a B2B app and use metadata to efficiently filter the traces in Arize Phoenix. This works well with LlamaIndex, but with LiteLLM we lose the metadata, making it impractical to work with the traces.

@emerzon (Contributor, Author) commented Oct 22, 2025

Hi @krrishdholakia, any visibility on this issue?

@pbebbo commented Nov 5, 2025

@krrishdholakia and @emerzon any updates on this?

@emerzon (Contributor, Author) commented Nov 14, 2025

Closing in favor of #16523

@emerzon closed this Nov 14, 2025


Development

Successfully merging this pull request may close these issues.

[Bug]: LiteLLM SDK Client does not send out metadata upstream

6 participants