[JS] Support seed for Ollama generation #1170

Open
zbarbuto opened this issue Nov 4, 2024 · 0 comments
zbarbuto commented Nov 4, 2024

It can be useful to provide a seed value for reproducible outputs when tweaking prompts and other parameters. The Ollama API supports this via a `seed` option alongside the usual options such as `temperature` and `top_k`:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2:latest",
  "prompt": "Give me the name of a random fruit?",
  "stream": false,
  "options": {
    "seed": 123
  }
}' | jq '.response'

# Output is the same for every call
"The random fruit I've chosen is... Guanabana (also known as Soursop)!"

seed is currently not supported in config:

[screenshot: the Ollama plugin's config type does not include a `seed` field]

and even when the config is cast to `any`, the seed is ignored in the request; the model returns a different result on every call:


  const llmResponse = await generate({
    model: 'ollama/llama3.2:latest',
    prompt: 'Give me the name of a random fruit?',
    config: <any>{
      seed: 123,
    },
  });

  console.log(await llmResponse.text());

// The fruit is: Papaya
// (different with every call)

It would be good to support `seed` in the config parameter when calling `generate()` for Ollama, along with any other plugins that could support it.
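As a rough sketch of what this could look like on the plugin side, a helper could map a Genkit-style config object onto the Ollama request's `options` body, forwarding `seed` alongside the existing fields. The `GenerationConfig` shape and the `toOllamaOptions` name here are hypothetical, not the plugin's actual code; only the snake_case option names (`seed`, `top_k`, `top_p`) come from the Ollama API shown above.

```typescript
// Hypothetical config shape; a subset of what generate() accepts.
interface GenerationConfig {
  temperature?: number;
  topK?: number;
  topP?: number;
  seed?: number; // the field this issue asks for
}

// Sketch: translate camelCase config fields into the snake_case
// `options` object that Ollama's /api/generate endpoint expects,
// omitting any field the caller did not set.
function toOllamaOptions(config: GenerationConfig): Record<string, number> {
  const options: Record<string, number> = {};
  if (config.temperature !== undefined) options.temperature = config.temperature;
  if (config.topK !== undefined) options.top_k = config.topK;
  if (config.topP !== undefined) options.top_p = config.topP;
  // Forwarding seed makes repeated calls with the same prompt reproducible.
  if (config.seed !== undefined) options.seed = config.seed;
  return options;
}
```

With this mapping in place, `config: { seed: 123 }` would produce `{"options": {"seed": 123}}` in the request body, matching the curl example above.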
