It can be useful to provide a `seed` value for reproducible outputs when tweaking prompts and other parameters. The Ollama API supports this via a `seed` option alongside the normal `options` such as `temperature` and `top_k`:
```sh
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2:latest",
  "prompt": "Give me the name of a random fruit?",
  "stream": false,
  "options": { "seed": 123 }
}' | jq '.response'

# Output is the same for every call
"The random fruit I've chosen is... Guanabana (also known as Soursop)!"
```
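For reference, here is the same call expressed in TypeScript (a minimal sketch assuming Node 18+ for the global `fetch`; the endpoint, model name, and `options.seed` field are taken from the curl example above):

```ts
// Minimal sketch: call Ollama's /api/generate endpoint directly with a seed.
// Assumes Node 18+ (global fetch) and a local Ollama server on port 11434.
async function generateWithSeed(): Promise<string> {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    body: JSON.stringify({
      model: 'llama3.2:latest',
      prompt: 'Give me the name of a random fruit?',
      stream: false,
      options: { seed: 123 }, // same seed => same completion on every call
    }),
  });
  const data = await res.json();
  return data.response;
}
```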
`seed` is currently not supported in `config`, and even when typecast to `any` it is ignored in the request, as the result returned by the model differs on every call:
```ts
import { generate } from '@genkit-ai/ai';

const llmResponse = await generate({
  model: 'ollama/llama3.2:latest',
  prompt: 'Give me the name of a random fruit?',
  config: <any>{
    seed: 123,
  },
});
console.log(await llmResponse.text());
// The fruit is: Papaya
// (different with every call)
```
It would be good to support `seed` in the `config` parameter when calling `generate()` for Ollama, along with any other plugins that could support it.
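As a rough illustration of what that could look like, the sketch below maps a `generate()` config onto Ollama's `options` request field. The `GenerateConfig` and `toOllamaOptions` names are hypothetical, not Genkit's actual plugin internals:

```ts
// Hypothetical sketch only; illustrative types, not the real plugin code.
interface OllamaOptions {
  temperature?: number;
  top_k?: number;
  seed?: number; // forwarded alongside the existing sampling options
}

interface GenerateConfig {
  temperature?: number;
  topK?: number;
  seed?: number;
}

// Map the generate() config onto the `options` object sent to Ollama,
// including the seed whenever the caller provides one.
function toOllamaOptions(config: GenerateConfig = {}): OllamaOptions {
  const options: OllamaOptions = {};
  if (config.temperature !== undefined) options.temperature = config.temperature;
  if (config.topK !== undefined) options.top_k = config.topK;
  if (config.seed !== undefined) options.seed = config.seed;
  return options;
}
```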