Hi everyone,
I'm trying to use the streaming functionality with models hosted on Ollama, but none of the models I've tested work. Has anyone managed to get this working, or is streaming only supported for models from the Gemini API?
The error I get is: "Please check server log for full details: Live connection is not supported for the model I use."
Here is my agent definition:
import datetime
from zoneinfo import ZoneInfo

from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm


def get_weather(city: str) -> dict:
    """Retrieves the current weather report for a specified city.

    Args:
        city (str): The name of the city for which to retrieve the weather report.

    Returns:
        dict: status and result or error msg.
    """
    if city.lower() == "new york":
        return {
            "status": "success",
            "report": (
                "The weather in New York is sunny with a temperature of 25 degrees"
                " Celsius (77 degrees Fahrenheit)."
            ),
        }
    else:
        return {
            "status": "error",
            "error_message": f"Weather information for '{city}' is not available.",
        }


def get_current_time(city: str) -> dict:
    """Returns the current time in a specified city.

    Args:
        city (str): The name of the city for which to retrieve the current time.

    Returns:
        dict: status and result or error msg.
    """
    if city.lower() == "new york":
        tz_identifier = "America/New_York"
    else:
        return {
            "status": "error",
            "error_message": f"Sorry, I don't have timezone information for {city}.",
        }

    tz = ZoneInfo(tz_identifier)
    now = datetime.datetime.now(tz)
    report = f"The current time in {city} is {now.strftime('%Y-%m-%d %H:%M:%S %Z%z')}"
    return {"status": "success", "report": report}


root_agent = Agent(
    name="agent",
    model=LiteLlm(model="ollama_chat/qwen3:8b", streaming=True),
    description="Agent to answer questions about the time and weather in a city.",
    instruction="You are a helpful agent who can answer user questions about the time and weather in a city in a human format. Be polite.",
    tools=[get_weather, get_current_time],
)
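For context, here is a minimal sketch of how the streaming call could be driven programmatically instead of through adk web. It assumes ADK's InMemoryRunner, RunConfig, and StreamingMode.SSE; the app_name, user_id, and prompt values are placeholders, and the session-service calls may differ slightly between ADK versions.

import asyncio

from google.adk.agents.run_config import RunConfig, StreamingMode
from google.adk.runners import InMemoryRunner
from google.genai import types


async def main() -> None:
    # Assumption: InMemoryRunner wraps the agent with an in-memory session service.
    runner = InMemoryRunner(agent=root_agent, app_name="weather_time_app")

    # create_session is async in recent ADK versions; adjust if yours differs.
    session = await runner.session_service.create_session(
        app_name="weather_time_app", user_id="user_1"
    )

    message = types.Content(
        role="user", parts=[types.Part(text="What time is it in New York?")]
    )

    # SSE requests incremental (partial) events; BIDI is the bidirectional live mode.
    run_config = RunConfig(streaming_mode=StreamingMode.SSE)

    async for event in runner.run_async(
        user_id="user_1",
        session_id=session.id,
        new_message=message,
        run_config=run_config,
    ):
        # Print whatever text the model emits as the events arrive.
        if event.content and event.content.parts and event.content.parts[0].text:
            print(event.content.parts[0].text, end="", flush=True)


if __name__ == "__main__":
    asyncio.run(main())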