What happened?
Model: Claude Sonnet 4.5 (and others)
Problem:
When we send an image in a tool response to the Anthropic API directly, or to Anthropic on Vertex AI, it works fine.
But when we send the same request to the same Anthropic model on AWS Bedrock, LiteLLM Proxy appears to strip the image from the tool result: the model says it cannot see the image, or hallucinates its contents.
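For reference, Bedrock's Converse API does accept image blocks inside a `toolResult` content list (alongside text/json blocks), so the underlying API should be able to carry the image. A minimal sketch of the shape the translated Bedrock request would need — the `toolUseId` and the byte payload here are hypothetical placeholders, not values from our logs:

```python
# Sketch (assumption): the Bedrock Converse "toolResult" message that the
# OpenAI-format tool response should translate into. Per the Converse API,
# an image block is allowed inside toolResult.content.
expected_tool_result_message = {
    "role": "user",
    "content": [{
        "toolResult": {
            "toolUseId": "tooluse_example123",  # hypothetical id
            "content": [{
                "image": {
                    "format": "jpeg",
                    # Converse expects raw image bytes, not a remote URL
                    "source": {"bytes": b"<jpeg bytes here>"},
                }
            }],
        }
    }],
}

# The symptom suggests this image block is dropped (or replaced by text)
# during the OpenAI -> Converse translation for Bedrock.
block = expected_tool_result_message["content"][0]["toolResult"]["content"][0]
print("image" in block)
```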
Example Code:
import openai

client = openai.OpenAI(
    api_key=LITELLM_PROXY_API_KEY,
    base_url=LITELLM_PROXY_ENDPOINT,
)

# WORKS ON ANTHROPIC
# MODEL = "anthropic/claude-sonnet-4-5-20250929"
# DOES NOT WORK ON BEDROCK WITH THE SAME MODEL:
# the model says it does not see the image, or hallucinates its contents
MODEL = "bedrock/global.anthropic.claude-sonnet-4-5-20250929-v1:0"

# test tool
tools = [{
    "type": "function",
    "function": {
        "name": "get_image",
        "description": "Returns an image to you",
        "parameters": {
            "type": "object",
            "properties": {},
        },
    },
}]

# user prompt
messages = [{
    "role": "user",
    "content": "please use tool get_image and describe the image contents",
}]

# send request
response = client.chat.completions.create(
    model=MODEL,
    tools=tools,
    messages=messages,
)

# append the assistant message containing the tool call request
messages.append(response.choices[0].message)

# tool calls made by the LLM
tool_calls = response.choices[0].message.tool_calls

# append the tool response containing the image
messages.append({
    "tool_call_id": tool_calls[0].id,
    "role": "tool",
    "name": tool_calls[0].function.name,
    "content": [{
        "type": "image_url",
        "image_url": {
            "url": "https://upload.wikimedia.org/wikipedia/commons/3/3f/JPEG_example_flower.jpg"
        },
    }],
})

# send the tool response back to the model
response = client.chat.completions.create(
    model=MODEL,
    tools=tools,
    messages=messages,
)

# print the assistant response
print(response.choices[0].message)
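One workaround we have not fully verified against Bedrock: since Bedrock cannot fetch remote URLs itself (the proxy has to download and convert the image), inlining the image as a base64 data URL in the tool response may bypass whatever step loses it. A sketch, assuming the standard OpenAI `image_url` content-part format; the helper name is ours, not a LiteLLM API:

```python
import base64

# Hypothetical helper: build an OpenAI-format image_url content part from
# raw image bytes as a base64 data URL, instead of a remote URL.
def image_url_part_from_bytes(image_bytes: bytes, mime: str = "image/jpeg") -> dict:
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "type": "image_url",
        "image_url": {"url": f"data:{mime};base64,{b64}"},
    }

# usage: read the image file (or requests.get(url).content) and append this
# part as the tool message content instead of the remote URL
part = image_url_part_from_bytes(b"\xff\xd8\xff\xe0fake-jpeg")
print(part["image_url"]["url"][:23])  # prints the data-URL prefix
```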
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.78.5
Twitter / LinkedIn details
No response