
Add support for logprobs in Google's Gemini 2.5/ 2.0 models #5418

Open
trulyronak opened this issue Mar 27, 2025 · 3 comments
Labels: ai/provider, enhancement (New feature or request)

Comments

@trulyronak

Feature Description

Hey Vercel team - I'm requesting you add logprobs support for Google's Gemini models in the AI SDK. This is especially important now that Gemini 2.5 has been released.

Looking at the AI SDK docs, I can see logprobs is a standard capability for some providers, but it's not currently implemented for the Google provider. The Google Generative AI API already supports this by letting developers set responseLogprobs: true in requests, plus an optional logprobs parameter to specify how many candidate tokens you get back.

What we need:

  1. A way to pass responseLogprobs: true in our AI SDK requests when using Gemini
  2. Access to the logprobsCandidates that come back in the response
  3. Also expose the avgLogprobs field that Google returns by default

Ideally these would be accessible via provider metadata in both requests and responses, similar to how it's handled for other providers (e.g. OpenAI's GPT-4o mini).
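For context, the fields already exist on Google's Generative AI REST API under generationConfig. A minimal sketch of the request body the provider would need to emit (the model, prompt, and contents shape are placeholders; the responseLogprobs and logprobs field names are from the public API reference):

```typescript
// Sketch of the request body the Google provider would need to build.
// Only generationConfig.responseLogprobs / logprobs are the point here;
// contents is a placeholder prompt.
const requestBody = {
  contents: [{ role: "user", parts: [{ text: "Hello" }] }],
  generationConfig: {
    responseLogprobs: true, // ask the API to return log probabilities
    logprobs: 5, // number of top candidate tokens to return per position
  },
};
```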

Use Cases

At my company, we need logprobs for:

  • Filtering out low-confidence responses (essential when the model is unsure)
  • Better token selection logic for more sensitive applications
  • Benchmarking different models on the same tasks
  • Building pipelines that use confidence scores for downstream decision-making
  • Detecting when to fall back to human review instead of using model output

Additional context

This gap is especially frustrating now that Gemini 2.5 has been released. We're currently either using OpenAI exclusively through your SDK (missing out on Gemini's capabilities) or maintaining two separate code paths: the Vercel SDK for most things, plus Google's SDK specifically for the logprobs functionality.

There's an existing issue (#4969) that's been open since before Gemini 2.0. Now that 2.5 is available with its improved capabilities, having this feature would let smaller teams like mine stay on a single SDK rather than juggling multiple vendor libraries.

It would be awesome if you could prioritize this - it seems like a straightforward addition given that the underlying Google API already supports it!

@trulyronak added the enhancement (New feature or request) label on Mar 27, 2025
@GurneeshBudhiraja

GurneeshBudhiraja commented Apr 2, 2025

Hi @lgrammel, sorry to tag you in this.
I am working on this issue and would like some guidance and clarification on the solution I am planning to implement.

The Google Vertex AI provider uses the Google Generative AI provider under the hood, which does not currently support logprobs. Below are the two solutions I think would fix the issue:

  1. Create separate code files for the Vertex provider and use them instead of the files from the Generative AI provider. This would move the Vertex provider's dependency from the Google Generative AI provider code into its own code files.
  2. Alternatively, add an optional key to the createVertex provider, as below, and then modify the endpoint (or any other code) in the Google Generative AI provider accordingly:
const response = await generateText({
  model: vertex("gemini-1.5-flash", {
    vertexAI: true, // optional key; could also be set automatically whenever logprobs are requested
  }),
  // ...rest of the generateText call
});

Thanks and looking forward to your feedback!😃

@chancharikmitra

Don't mean to divert the discussion here, but are you all sure that Gemini is supporting logprobs in the newer versions? Refer to: google-gemini/deprecated-generative-ai-python#238 (comment)

@GurneeshBudhiraja

> Don't mean to divert the discussion here, but are you all sure that Gemini is supporting logprobs in the newer versions? Refer to: google-gemini/deprecated-generative-ai-python#238 (comment)

As far as I know, Vertex AI does currently support logprobs for two models. Refer to this:

https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/inference#logprobs
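Per that reference, the Vertex AI response carries a logprobsResult with chosen and top candidate tokens. A small sketch of how a consumer might use the chosen-candidate entries once the SDK exposes them (the field names token and logProbability are taken from that doc page and should be verified against a live response; the 0.5 threshold is arbitrary):

```typescript
// Sketch: flag tokens whose chosen-candidate probability is below a
// threshold, using the logprobsResult.chosenCandidates shape described
// in the Vertex AI inference reference (field names assumed from docs).
interface ChosenCandidate {
  token: string;
  logProbability: number;
}

function lowConfidenceTokens(
  chosen: ChosenCandidate[],
  minProb = 0.5, // arbitrary example threshold
): string[] {
  return chosen
    .filter((c) => Math.exp(c.logProbability) < minProb)
    .map((c) => c.token);
}
```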
