Metadata Not Passed to Callbacks in Anthropic Messages with Langfuse Integration #15596
Unanswered · ravitejaseelam asked this question in Q&A
Replies: 0 comments
Description
We recently integrated Langfuse into our setup, and we are facing an issue where metadata passed to `anthropic_messages` is not propagated to callbacks. This is impacting our ability to capture observability and tracing data.
Context
The same implementation works correctly when using the `/chat/completions` endpoint. However, when switching to the `/v1/messages` endpoint (for Anthropic models), the metadata is not forwarded to callbacks. As a temporary workaround, we tried proxying Anthropic through the OpenAI client, but that introduced several unrelated issues (outside LiteLLM) that are more severe and harder to manage.
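For reference, a minimal sketch of the two request shapes sent to a LiteLLM proxy. The model name, metadata keys, and URLs below are illustrative placeholders, not our actual production values:

```python
# Illustrative payloads only: model name, metadata keys, and URLs are
# placeholders, not taken from our real request.
metadata = {"generation_name": "demo-trace", "trace_user_id": "user-123"}

# /chat/completions body: this metadata reaches Langfuse callbacks.
chat_payload = {
    "model": "claude-sonnet-4",
    "messages": [{"role": "user", "content": "Hello"}],
    "metadata": metadata,
}

# /v1/messages body: the same metadata does not reach callbacks.
messages_payload = {
    "model": "claude-sonnet-4",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
    "metadata": metadata,
}

# Both would be POSTed to the proxy, e.g.:
#   POST http://localhost:4000/chat/completions
#   POST http://localhost:4000/v1/messages
```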
Example Request
Observed Behavior
As seen in the request above, the `metadata` field is present and correctly structured. However, it does not get passed to callbacks when using the `/v1/messages` endpoint with Anthropic models. In contrast, the same metadata works correctly when using the `/chat/completions` endpoint.
Expected Behavior
Metadata should be available in callbacks for Anthropic models (via `/v1/messages`), just as it is with the OpenAI client and the `/chat/completions` endpoint.
Environment
- ghcr.io/berriai/litellm:main-latest
- 1.78.0
- Claude-Sonnet 3.7, Claude-Sonnet 4
- 3.117.2
Request
We would appreciate guidance on an alternative approach, or an update on whether a fix for this issue is on the roadmap.
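As a diagnostic while this is open, a small helper can confirm whether metadata reaches the callback layer at all. This is a hypothetical sketch: it assumes callbacks receive the request metadata under `kwargs["litellm_params"]["metadata"]` (as LiteLLM's custom-callback docs describe for `/chat/completions`) and only shows where a handler would look:

```python
def callback_metadata(kwargs: dict) -> dict:
    """Return the request metadata a LiteLLM callback received, or {}.

    Assumption: custom callbacks are handed the request metadata under
    kwargs["litellm_params"]["metadata"]; this helper just reads it.
    """
    litellm_params = kwargs.get("litellm_params") or {}
    return litellm_params.get("metadata") or {}


# Simulated callback kwargs for the two endpoints, mirroring what we observe:
chat_kwargs = {"litellm_params": {"metadata": {"trace_user_id": "user-123"}}}
messages_kwargs = {"litellm_params": {"metadata": {}}}  # metadata lost

print(callback_metadata(chat_kwargs))      # {'trace_user_id': 'user-123'}
print(callback_metadata(messages_kwargs))  # {}
```

Logging the result of such a check inside a custom success callback makes it easy to show exactly which endpoint drops the metadata.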
Reference
A similar issue was reported recently: #15103