Open
Labels
priority: p3 (Desirable enhancement or fix. May not be included in next release.), status: awaiting user response, status: stale, type: bug (Error or flaw in code with unintended results or allowing sub-optimal usage patterns.)
Description
Hi, I experienced an issue when using google-genai versions 1.39.0 and later; version 1.38.0 and below works fine.
Making a request to my FastAPI backend raises an error; see the traceback below.
\Lib\site-packages\httpx\_client.py", line 901, in send
raise RuntimeError("Cannot send a request, as the client has been closed.")
RuntimeError: Cannot send a request, as the client has been closed.
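The error suggests a lifetime problem: the generator that streams the response runs lazily, after the request handler has already returned, so if the SDK's underlying httpx client is closed when the handler scope ends, the first real send fails. A minimal, self-contained sketch of that failure mode, assuming this is the mechanism (`FakeHttpClient` is an illustrative stand-in, not the real httpx API):

```python
class FakeHttpClient:
    """Illustrative stand-in for the SDK's underlying HTTP client."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

    def send(self, request):
        if self.closed:
            raise RuntimeError("Cannot send a request, as the client has been closed.")
        return f"chunk for {request!r}"


def handler():
    client = FakeHttpClient()

    def generate_chunks():
        # Runs lazily, only when the response body is iterated.
        yield client.send("question")

    stream = generate_chunks()
    # Simulate the client being closed once the handler returns,
    # before the generator has produced anything.
    client.close()
    return stream


stream = handler()
try:
    next(stream)
except RuntimeError as exc:
    print(exc)  # Cannot send a request, as the client has been closed.
```

Whether google-genai >= 1.39 actually closes its transport at this point is an assumption to be confirmed by the maintainers; the sketch only shows why a lazily-consumed generator can outlive the client it depends on.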
Environment details
- Programming language: Python
- OS: Windows/Linux
- Language runtime version: Python 3.12
- Package version: >= 1.39
Steps to reproduce
- Set up a FastAPI server with streaming chat responses; see the snippet below.
from google import genai
from google.genai import types
from google.genai.types import ModelContent, Part, UserContent
from dotenv import load_dotenv
from fastapi import APIRouter, Body, status, HTTPException, Request
from fastapi.responses import JSONResponse, StreamingResponse
import json
import os

ai_router = APIRouter()

@ai_router.post("/ai")
async def aichat(request: Request, payload: dict = Body(...)):
    system_prompt = "xxxx"
    try:
        client = genai.Client(
            api_key=os.getenv("GOOGLE_GENERATIVE_AI_API_KEY"),
        )
        generate_content_config = types.GenerateContentConfig(
            thinking_config=types.ThinkingConfig(
                thinking_budget=0,
            ),
            max_output_tokens=40000,
            system_instruction=[
                types.Part.from_text(text=system_prompt),
            ],
        )
        chat_session = client.chats.create(
            model="gemini-2.5-flash",
            history=[],
            config=generate_content_config,
        )
        response_obj = chat_session.send_message_stream(payload["question"])

        def generate_chunks():
            for chunk in response_obj:
                if chunk.text is not None:
                    yield f"{chunk.text}"

        return StreamingResponse(generate_chunks(), media_type="text/event-stream")
    except Exception as e:
        # logger.info(json.dumps({"status": str(e)}))
        return JSONResponse(content={"success": False}, status_code=500)
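Besides pinning google-genai to 1.38.x (reported above to work), one pattern that avoids the lazy-iteration lifetime gap is to own the client inside the generator itself, so nothing can close it before streaming finishes. A generic sketch under that assumption (`FakeHttpClient` and `make_stream` are illustrative names, not SDK API):

```python
class FakeHttpClient:
    """Illustrative stand-in for the SDK's underlying HTTP client."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

    def send(self, request):
        if self.closed:
            raise RuntimeError("Cannot send a request, as the client has been closed.")
        return f"chunk for {request!r}"


def make_stream(question):
    def generate_chunks():
        # Create the client inside the generator: its lifetime is now
        # tied to the iteration, not to the request handler's scope.
        client = FakeHttpClient()
        try:
            yield client.send(question)
        finally:
            client.close()  # closed only after the last chunk is yielded

    return generate_chunks()


print(next(make_stream("question")))  # chunk for 'question'
```

With the real SDK this would correspond to moving the `genai.Client` construction and the `send_message_stream` call inside `generate_chunks`; whether that sidesteps the 1.39 behavior is an assumption worth verifying against the regression's actual cause.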