Conversation

@navgarcha navgarcha commented Oct 17, 2025

Problem

When using LangChainCallbackHandler (@posthog/ai/langchain) for LLM analytics, token usage is missing from generations that go through OpenAI's responses API.
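
For context, a minimal reproduction looks roughly like the sketch below. The LangChainCallbackHandler constructor options and the useResponsesApi flag are assumptions based on my reading of the PostHog and @langchain/openai docs, not copied from either library's source:

```typescript
import { PostHog } from 'posthog-node'
import { LangChainCallbackHandler } from '@posthog/ai/langchain'
import { ChatOpenAI } from '@langchain/openai'

async function main() {
  const phClient = new PostHog('<ph_project_api_key>', { host: 'https://us.i.posthog.com' })

  // Assumed constructor shape: the handler wraps a posthog-node client.
  const callbackHandler = new LangChainCallbackHandler({ client: phClient })

  // Assumed flag: useResponsesApi routes the generation through OpenAI's
  // responses API instead of chat completions.
  const model = new ChatOpenAI({ model: 'gpt-4o-mini', useResponsesApi: true })

  // The generation is captured, but its token usage comes through empty.
  await model.invoke('Hello!', { callbacks: [callbackHandler] })

  await phClient.shutdown()
}

main()
```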

Changes

LangChain reports token usage for the responses API under the estimatedTokenUsage key on llmOutput (whereas the chat completions API, which is currently supported, uses the tokenUsage key), as seen here. This patch adds the new key to the array of keys that token usage is read from.
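
The handler's real extraction code isn't reproduced here; the sketch below only illustrates the shape of the fix, assuming token usage is looked up on llmOutput by trying a list of known keys (the function and interface names are illustrative, not the library's):

```typescript
interface TokenUsage {
  promptTokens?: number
  completionTokens?: number
  totalTokens?: number
}

// Illustrative helper: try each known usage key on llmOutput until one matches.
// LangChain emits `tokenUsage` for the chat completions API and
// `estimatedTokenUsage` for the responses API.
function extractTokenUsage(llmOutput?: Record<string, any>): TokenUsage | undefined {
  if (!llmOutput) return undefined
  for (const key of ['tokenUsage', 'estimatedTokenUsage']) {
    const usage = llmOutput[key]
    if (usage) {
      return {
        promptTokens: usage.promptTokens,
        completionTokens: usage.completionTokens,
        totalTokens: usage.totalTokens,
      }
    }
  }
  return undefined
}
```

With only 'tokenUsage' in that list, responses-API generations fall through and report no usage; adding 'estimatedTokenUsage' restores it.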

Release info

Sub-libraries affected

  • [ ] All of them
  • [ ] posthog-js (web)
  • [ ] posthog-js-lite (web lite)
  • [ ] posthog-node
  • [ ] posthog-react-native
  • [ ] @posthog/react
  • [x] @posthog/ai
  • [ ] @posthog/nextjs-config

vercel bot commented Oct 17, 2025

@navgarcha is attempting to deploy a commit to the PostHog Team on Vercel.

A member of the Team first needs to authorize it.

Contributor

@greptile-apps greptile-apps bot left a comment

1 file reviewed, no comments

@posthog-bot
Collaborator

This PR hasn't seen activity in a week! Should it be merged, closed, or further worked on? If you want to keep it open, post a comment or remove the stale label – otherwise this will be closed in another week.

@navgarcha
Author

Anyone from posthog able to review? 🙏

@posthog-bot posthog-bot removed the stale label Oct 28, 2025