Please update gptel first -- errors are often fixed by the time they're reported.
I have updated gptel to the latest commit and tested that the issue still exists
Bug Description
When Claude 3.7 with thinking enabled attempts a tool use after thinking, we receive an error from the API:
{
  "type": "error",
  "error": {
    "type": "invalid_request_error",
    "message": "messages.1.content.0.type: Expected `thinking` or `redacted_thinking`, but found `text`. When `thinking` is enabled, a final `assistant` message must start with a thinking block (preceeding the lastmost set of `tool_use` and `tool_result` blocks). We recommend you include thinking blocks from previous turns. To avoid this requirement, disable `thinking`. Please consult our documentation at https://docs.anthropic.com/en/docs/build-with-claude/extended-thinking"
  }
}
Relates to #514
Backend
Anthropic
Steps to Reproduce
Given this setup:
;; this "Claude-thinking" definition was straight from the gptel readme, iirc
(gptel-make-anthropic "Claude-thinking"  ;Any name you want
  :key "your-API-key"
  :stream t
  :models '(claude-3-7-sonnet-20250219)
  :header (lambda () `(("x-api-key" . ,key)
                       ("anthropic-version" . "2023-06-01")
                       ("anthropic-beta" . "pdfs-2024-09-25")
                       ("anthropic-beta" . "output-128k-2025-02-19")
                       ("anthropic-beta" . "prompt-caching-2024-07-31")))
  :request-params '(:thinking (:type "enabled" :budget_tokens 54000)
                    :max_tokens 64000))
;; the tool that happened to trigger this for me, but I think any tool will do it?
(gptel-make-tool
 :name "query_user"
 :function (lambda (query)
             (read-string query ""))
 :description "Ask the user a question. E.g. ask them for clarification, ask them to make a decision, ask for their judgment, ask them to be more specific. Returns the user's response."
 :args (list '(:name "user_question"
               :type string
               :description "The message to display to the user."))
 :category "interaction")
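As an aside, the :function above is just read-string, so when the model calls the tool you get a minibuffer prompt. A sketch of what happens at call time (my reconstruction, not gptel's literal internals):
;; Sketch: gptel invokes the tool's :function with the model-supplied
;; argument, which here prompts in the minibuffer and returns whatever
;; the user types as the tool result.
(funcall (lambda (query) (read-string query ""))
         "When you refer to MCP, do you mean Multiple Choice Prompting...?")
;; => " Model Context Protocol" (the response typed in the repro below)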
Start a gptel chat session, choose the Claude-thinking model, enable tools and select the query_user tool, set the system message to none, and ask the LLM:
I'm confused about MCP vs tool use for llms; once you have tool use, why do you need MCP?
On a successful reproduction the model will call the tool, you will have to type a response in the minibuffer, and then you will receive the error quoted above.
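Incidentally, the error message's own suggestion (disable thinking) should sidestep the requirement entirely. A minimal workaround sketch, assuming everything else stays the same (the "Claude-no-thinking" name is just for illustration):
;; Workaround sketch, not a fix: the same backend minus the :thinking
;; request param, per the API error's own advice ("To avoid this
;; requirement, disable `thinking`").
(gptel-make-anthropic "Claude-no-thinking"  ;illustrative name
  :key "your-API-key"
  :stream t
  :models '(claude-3-7-sonnet-20250219))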
Additional Context
.
Backtrace
Log Information
{
  "gptel": "request headers",
  "timestamp": "2025-03-25 13:50:00"
}
{
  "Content-Type": "application/json",
  "x-api-key": "APIKEY",
  "anthropic-version": "2023-06-01",
  "anthropic-beta": "pdfs-2024-09-25"
}
{
  "gptel": "request body",
  "timestamp": "2025-03-25 13:50:00"
}
{
  "model": "claude-3-7-sonnet-20250219",
  "stream": true,
  "max_tokens": 64000,
  "messages": [
    {
      "role": "user",
      "content": "I'm confused about MCP vs tool use for llms; once you have tool use, why do you need MCP?"
    }
  ],
  "temperature": 1.0,
  "tools": [
    {
      "name": "query_user",
      "description": "Ask the user a question. E.g. ask them for clarification, ask them to make a decision, ask for their judgment, ask them to be more specific. Returns the user's response.",
      "input_schema": {
        "type": "object",
        "properties": {
          "user_question": {
            "type": "string",
            "description": "The message to display to the user."
          }
        },
        "required": [
          "user_question"
        ]
      }
    }
  ],
  "thinking": {
    "type": "enabled",
    "budget_tokens": 54000
  }
}
{"gptel": "request Curl command", "timestamp": "2025-03-25 13:50:00"}
curl \--disable \--location \--silent \--compressed \-XPOST \-y300 \-Y1 \-D- \-w\(1acaec1249ed0506a2cd2c0b231476f4\ .\ \%\{size_header\}\) \-d\{\"model\"\:\"claude-3-7-sonnet-20250219\"\,\"stream\"\:true\,\"max_tokens\"\:64000\,\"messages\"\:\[\{\"role\"\:\"user\"\,\"content\"\:\"I\'m\ confused\ about\ MCP\ vs\ tool\ use\ for\ llms\;\ once\ you\ have\ tool\ use\,\ why\ do\ you\ need\ MCP\?\"\}\]\,\"temperature\"\:1.0\,\"tools\"\:\[\{\"name\"\:\"query_user\"\,\"description\"\:\"Ask\ the\ user\ a\ question.\ E.g.\ ask\ them\ for\ clarification\,\ ask\ them\ to\ make\ a\ decision\,\ ask\ for\ their\ judgment\,\ ask\ them\ to\ be\ more\ specific.\ Returns\ the\ user\'s\ response.\"\,\"input_schema\"\:\{\"type\"\:\"object\"\,\"properties\"\:\{\"user_question\"\:\{\"type\"\:\"string\"\,\"description\"\:\"The\ message\ to\ display\ to\ the\ user.\"\}\}\,\"required\"\:\[\"user_question\"\]\}\}\]\,\"thinking\"\:\{\"type\"\:\"enabled\"\,\"budget_tokens\"\:54000\}\} \-HContent-Type\:\ application/json \-Hx-api-key\:\ APIKEY \-Hanthropic-version\:\ 2023-06-01 \-Hanthropic-beta\:\ pdfs-2024-09-25 \-Hanthropic-beta\:\ output-128k-2025-02-19 \-Hanthropic-beta\:\ prompt-caching-2024-07-31 \https\://api.anthropic.com/v1/messages
{
  "gptel": "response headers",
  "timestamp": "2025-03-25 13:50:08"
}
"HTTP/2 200 \r\ndate: Tue, 25 Mar 2025 18:50:01 GMT\r\ncontent-type: text/event-stream; charset=utf-8\r\ncache-control: no-cache\r\nanthropic-ratelimit-requests-limit: 1000\r\nanthropic-ratelimit-requests-remaining: 999\r\nanthropic-ratelimit-requests-reset: 2025-03-25T18:50:00Z\r\nanthropic-ratelimit-input-tokens-limit: 40000\r\nanthropic-ratelimit-input-tokens-remaining: 40000\r\nanthropic-ratelimit-input-tokens-reset: 2025-03-25T18:50:01Z\r\nanthropic-ratelimit-output-tokens-limit: 16000\r\nanthropic-ratelimit-output-tokens-remaining: 5000\r\nanthropic-ratelimit-output-tokens-reset: 2025-03-25T18:50:40Z\r\nanthropic-ratelimit-tokens-limit: 56000\r\nanthropic-ratelimit-tokens-remaining: 45000\r\nanthropic-ratelimit-tokens-reset: 2025-03-25T18:50:01Z\r\nrequest-id: req_01HjNJNGM1hTkHW2NsXtt3xL\r\nanthropic-organization-id: 5fbc051b-0b58-4e61-85c4-e3f45401d12b\r\nvia: 1.1 google\r\ncf-cache-status: DYNAMIC\r\nx-robots-tag: none\r\nserver: cloudflare\r\ncf-ray: 92609769897a29c0-ORD\r\n\r"
{
  "gptel": "response body",
  "timestamp": "2025-03-25 13:50:08"
}
event: message_startdata: {"type":"message_start","message":{"id":"msg_017xsuxWHMs9akyVRKNoFroH","type":"message","role":"assistant","model":"claude-3-7-sonnet-20250219","content":[],"stop_reason":null,"stop_sequence":null,"usage":{"input_tokens":479,"cache_creation_input_tokens":0,"cache_read_input_tokens":0,"output_tokens":5}} }event: content_block_startdata: {"type":"content_block_start","index":0,"content_block":{"type":"thinking","thinking":"","signature":""} }event: pingdata: {"type": "ping"}event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":"The user is asking"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" about the difference between Multiple Choice Prompting"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" (MCP) and tool use for large"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" language models (LLMs). They"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" specifically want to understand why MCP woul"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":"d still be needed if tool use is available."} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":"\n\nLet me think about this:\n\n1"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":". Multiple Choice Prompting ("} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":"MCP) is a technique where you provide"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" the LLM with specific answer choices to"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" select from, constraining its response to those"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" choices. This helps in getting more reliable"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" and controllable outputs from the model.\n\n2. 
Tool"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" use (or tool calling) refers"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" to the ability of LLMs to invoke"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" external functions/tools to perform actions outside"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" their context, like fetching information,"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" performing calculations, or taking actions in other"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" systems.\n\nThe user is essentially asking:"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" if an LLM can use tools, why woul"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":"d we still need the multiple-choice format"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" to constrain its answers?\n\nThis"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" is a good question that involves understanding the different"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":" purposes and strengths of these approaches. 
I shoul"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":"d ask the user some clarifying questions to better understan"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":"d their context and provide a more tail"} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"thinking_delta","thinking":"ored response."} }event: content_block_deltadata: {"type":"content_block_delta","index":0,"delta":{"type":"signature_delta","signature":"ErUBCkYIARgCIkAHzNP5VPHaD9lNZUIAVVX0PwVadsMdBrtHL/ekr+9hhXXsWnnhjY1Gz5ULsbfBRBJHAKTRsVhHfxHD9/C4ZmuGEgwAygESMsC6P0WAo34aDBL9bFKUZ4MP4ayWHSIw/zx7ZJo+oXxDC90SCK1YHbdLjNOBH+umzt05kE2F8gX9doeKh+Lglq9f3HeLaBoBKh1R4qNiZTmW/9H1WNlQ18Eg4SpJVsRDKEzlg4cwDg=="} }event: content_block_stopdata: {"type":"content_block_stop","index":0 }event: content_block_startdata: {"type":"content_block_start","index":1,"content_block":{"type":"text","text":""} }event: content_block_deltadata: {"type":"content_block_delta","index":1,"delta":{"type":"text_delta","text":"I'd like to clarify this distinction"}}event: content_block_deltadata: {"type":"content_block_delta","index":1,"delta":{"type":"text_delta","text":" for you, but I want to make"} }event: content_block_deltadata: {"type":"content_block_delta","index":1,"delta":{"type":"text_delta","text":" sure I understand your question correctly first."} }event: content_block_stopdata: {"type":"content_block_stop","index":1 }event: content_block_startdata: {"type":"content_block_start","index":2,"content_block":{"type":"tool_use","id":"toolu_01UhV5vA8J4BU73zqW4uLDM8","name":"query_user","input":{}} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":""} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"{\"user_que"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"stion\": "} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"\"When you"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":" refer "} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"to MCP,"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":" do you"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":" mean"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":" Multip"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"le Choice P"}}event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"rompt"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ing where L"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"LMs are gi"} }event: content_block_deltadata: 
{"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ven "} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"spe"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"cific"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":" answer "} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"option"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"s to choose"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":" f"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"rom? And to"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":" co"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"nfirm, by to"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ol use, "} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"yo"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"u're refer"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ri"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ng "} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"to the a"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"bilit"}}event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"y of LL"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"Ms to "} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"call extern"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"al"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":" fun"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ct"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ions li"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ke search"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ing the web"} }event: content_block_deltadata: 
{"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":", accessi"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"ng databas"} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"es, "} }event: content_block_deltadata: {"type":"content_block_delta","index":2,"delta":{"type":"input_json_delta","partial_json":"etc.?\"}"} }event: content_block_stopdata: {"type":"content_block_stop","index":2 }event: message_deltadata: {"type":"message_delta","delta":{"stop_reason":"tool_use","stop_sequence":null},"usage":{"output_tokens":374} }event: message_stopdata: {"type":"message_stop" }
{
  "gptel": "request headers",
  "timestamp": "2025-03-25 13:50:30"
}
{
  "Content-Type": "application/json",
  "x-api-key": "APIKEY",
  "anthropic-version": "2023-06-01",
  "anthropic-beta": "pdfs-2024-09-25"
}
{
  "gptel": "request body",
  "timestamp": "2025-03-25 13:50:30"
}
{
  "model": "claude-3-7-sonnet-20250219",
  "stream": true,
  "max_tokens": 64000,
  "messages": [
    {
      "role": "user",
      "content": "I'm confused about MCP vs tool use for llms; once you have tool use, why do you need MCP?"
    },
    {
      "role": "assistant",
      "content": [
        {
          "type": "text",
          "text": "I'd like to clarify this distinction for you, but I want to make sure I understand your question correctly first."
        },
        {
          "type": "tool_use",
          "id": "toolu_01UhV5vA8J4BU73zqW4uLDM8",
          "name": "query_user",
          "input": {
            "user_question": "When you refer to MCP, do you mean Multiple Choice Prompting where LLMs are given specific answer options to choose from? And to confirm, by tool use, you're referring to the ability of LLMs to call external functions like searching the web, accessing databases, etc.?"
          }
        }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "type": "tool_result",
          "tool_use_id": "toolu_01UhV5vA8J4BU73zqW4uLDM8",
          "content": " Model Context Protocol"
        }
      ]
    }
  ],
  "temperature": 1.0,
  "tools": [
    {
      "name": "query_user",
      "description": "Ask the user a question. E.g. ask them for clarification, ask them to make a decision, ask for their judgment, ask them to be more specific. Returns the user's response.",
      "input_schema": {
        "type": "object",
        "properties": {
          "user_question": {
            "type": "string",
            "description": "The message to display to the user."
          }
        },
        "required": [
          "user_question"
        ]
      }
    }
  ],
  "thinking": {
    "type": "enabled",
    "budget_tokens": 54000
  }
}
{"gptel": "request Curl command", "timestamp": "2025-03-25 13:50:30"}
curl \--disable \--location \--silent \--compressed \-XPOST \-y300 \-Y1 \-D- \-w\(7fccde0c202a3b602b67249ba99706f2\ .\ \%\{size_header\}\) \
-d\{\"model\"\:\"claude-3-7-sonnet-20250219\"\,\"stream\"\:true\,\"max_tokens\"\:64000\,\"messages\"\:\[\{\"role\"\:\"user\"\,\"content\"\:\"I\'m\ confused\ about\ MCP\ vs\ tool\ use\ for\ llms\;\ once\ you\ have\ tool\ use\,\ why\ do\ you\ need\ MCP\?\"\}\,\{\"role\"\:\"assistant\"\,\"content\"\:\[\{\"type\"\:\"text\"\,\"text\"\:\"I\'d\ like\ to\ clarify\ this\ distinction\ for\ you\,\ but\ I\ want\ to\ make\ sure\ I\ understand\ your\ question\ correctly\ first.\"\}\,\{\"type\"\:\"tool_use\"\,\"id\"\:\"toolu_01UhV5vA8J4BU73zqW4uLDM8\"\,\"name\"\:\"query_user\"\,\"input\"\:\{\"user_question\"\:\"When\ you\ refer\ to\ MCP\,\ do\ you\ mean\ Multiple\ Choice\ Prompting\ where\ LLMs\ are\ given\ specific\ answer\ options\ to\ choose\ from\?\ And\ to\ confirm\,\ by\ tool\ use\,\ you\'re\ referring\ to\ the\ ability\ of\ LLMs\ to\ call\ external\ functions\ like\ searching\ the\ web\,\ accessing\ databases\,\ etc.\?\"\}\}\]\}\,\{\"role\"\:\"user\"\,\"content\"\:\[\{\"type\"\:\"tool_result\"\,\"tool_use_id\"\:\"toolu_01UhV5vA8J4BU73zqW4uLDM8\"\,\"content\"\:\"\ Model\ Context\ Protocol\"\}\]\}\]\,\"temperature\"\:1.0\,\"tools\"\:\[\{\"name\"\:\"query_user\"\,\"description\"\:\"Ask\ the\ user\ a\ question.\ E.g.\ ask\ them\ for\ clarification\,\ ask\ them\ to\ make\ a\ decision\,\ ask\ for\ their\ judgment\,\ ask\ them\ to\ be\ more\ specific.\ Returns\ the\ user\'s\ response.\"\,\"input_schema\"\:\{\"type\"\:\"object\"\,\"properties\"\:\{\"user_question\"\:\{\"type\"\:\"string\"\,\"description\"\:\"The\ message\ to\ display\ to\ the\ user.\"\}\}\,\"required\"\:\[\"user_question\"\]\}\}\]\,\"thinking\"\:\{\"type\"\:\"enabled\"\,\"budget_tokens\"\:54000\}\} \
-HContent-Type\:\ application/json \-Hx-api-key\:\ APIKEY \-Hanthropic-version\:\ 2023-06-01 \-Hanthropic-beta\:\ pdfs-2024-09-25 \-Hanthropic-beta\:\ output-128k-2025-02-19 \-Hanthropic-beta\:\ prompt-caching-2024-07-31 \https\://api.anthropic.com/v1/messages
{
  "gptel": "response headers",
  "timestamp": "2025-03-25 13:50:30"
}
"HTTP/2 400 \r\ndate: Tue, 25 Mar 2025 18:50:30 GMT\r\ncontent-type: application/json\r\ncontent-length: 533\r\nx-should-retry: false\r\nrequest-id: req_01MNarzv5hDNkgzGRoezXTaV\r\nanthropic-organization-id: 5fbc051b-0b58-4e61-85c4-e3f45401d12b\r\nvia: 1.1 google\r\ncf-cache-status: DYNAMIC\r\nx-robots-tag: none\r\nserver: cloudflare\r\ncf-ray: 92609823588b8b98-ORD\r\n\r"
{
  "gptel": "response body",
  "timestamp": "2025-03-25 13:50:30"
}
{
  "type": "error",
  "error": {
    "type": "invalid_request_error",
    "message": "messages.1.content.0.type: Expected `thinking` or `redacted_thinking`, but found `text`. When `thinking` is enabled, a final `assistant` message must start with a thinking block (preceeding the lastmost set of `tool_use` and `tool_result` blocks). We recommend you include thinking blocks from previous turns. To avoid this requirement, disable `thinking`. Please consult our documentation at https://docs.anthropic.com/en/docs/build-with-claude/extended-thinking"
  }
}
* gptel-anthropic.el (gptel-curl--parse-stream): When tool-use is
enabled, we preserve the messages array until the back-and-forth
exchange is complete. The Anthropic API requires the last
assistant message in this growing messages array (sent back and
forth) to contain a thinking section when thinking and tools are
both enabled. Ensure that we pack the thinking block correctly
into the messages array (with a signature field) when using both
thinking and tools. (#743)
This fix is not required for the non-streaming case since the
thinking block is already supplied in the form required to send
back to the model.
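For anyone following along, here is roughly the shape the final assistant message has to take, written as the plist gptel would serialize to JSON. This is hand-assembled from the error message and the streaming log above, and heavily abbreviated ("..." marks elisions); it is not gptel's literal output:
;; Sketch of the required final assistant message: the signed thinking
;; block (accumulated from the stream's thinking_delta and
;; signature_delta events) must come first, before the text and
;; tool_use blocks.
(:role "assistant"
 :content [(:type "thinking"
            :thinking "The user is asking about..."
            :signature "ErUBCkYIARgCIkAHzNP5...")
           (:type "text"
            :text "I'd like to clarify this distinction for you...")
           (:type "tool_use"
            :id "toolu_01UhV5vA8J4BU73zqW4uLDM8"
            :name "query_user"
            :input (:user_question "..."))])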
Thanks! I updated and the streaming case is fixed.
Testing with (setq gptel-stream nil) uncovered a bug that also exists on the previous commit (it was already present at dfd8879; I didn't check further back than that): the reasoning block is reprinted after the tool use.
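In case it helps reproduction: I believe the only configuration change relative to the streaming repro is that one variable, i.e.
;; Disable streaming for subsequent requests; the reasoning block is
;; then printed a second time after the tool call completes.
(setq gptel-stream nil)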
Screenshot of the duplicated reasoning block, with tool results set to always include just to make it easier to follow: [screenshot not reproduced here]