AIMessageChunk init_tool_calls fails when the chunk args is None #30563
Labels

- 🤖:bug - Related to a bug, vulnerability, unexpected error with an existing feature
- Ɑ: core - Related to langchain-core
- investigate - Flagged for investigation.
Example Code

I'm locally deploying the QwQ-32B model with vLLM. When running the langgraph_supervisor example, I encountered a problem: when I use `stream` (`stream_mode="messages"`) to output content, the tool call fails. After tracing the code, I found that the tool_call arguments returned by the LLM are None. However, the `init_tool_calls` function of the `AIMessageChunk` class in `langchain_core/messages/ai.py` does not handle the case where `args` is None, which leads to an error when calling `parse_partial_json`.

Error Message and Stack Trace (if applicable)
My test example code:
code run output:
We can see that the function arguments in the above output are None.
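To illustrate the failure mode, here is a minimal simulation (not the actual langchain_core code: the chunk contents are made up, and `json.loads` stands in for `parse_partial_json`, which ultimately feeds the string to a JSON parser):

```python
import json

# A streamed tool-call chunk whose args is None
# (hypothetical shape; real chunks carry more fields).
chunk = {"name": "transfer_to_agent", "args": None, "id": "call_0"}

# Handing None to the JSON parser raises instead of yielding empty args.
try:
    json.loads(chunk["args"])
except TypeError as e:
    print(f"TypeError: {e}")
```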
The code in `langchain_core/messages/ai.py`: when `chunk["args"]` is None, the `parse_partial_json` function raises an exception, interrupting the call flow.

Description
I'm locally deploying the QwQ-32B model with vLLM. When running the langgraph_supervisor example, I encountered a problem: when I use `stream` (`stream_mode="messages"`) to output content, the tool call fails. After tracing the code, I found that the tool_call arguments returned by the LLM are None. However, the `init_tool_calls` function of the `AIMessageChunk` class in `langchain_core/messages/ai.py` does not handle the case where `args` is None, which leads to an error when calling `parse_partial_json`.

System Info
```toml
dependencies = [
    "beautifulsoup4>=4.13.3",
    "langchain-community>=0.3.19",
    "langchain-weaviate>=0.0.4",
    "langchain[openai]>=0.3.20",
    "langgraph>=0.3.15",
    "langgraph-supervisor>=0.0.14",
    "pypdf>=5.4.0",
    "weaviate-client>=4.11.1",
    "xinference-client>=1.3.1.post1",
]
```