
thinking_block field returned by Claude model ignored by ChatOpenAI().invoke() #30530

Open
vigneshvs opened this issue Mar 27, 2025 · 1 comment
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

vigneshvs commented Mar 27, 2025

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import os
os.environ["OPENAI_BASE_URL"] = <locally_hosted_API_service_that_calls_bedrock_or_openai>
os.environ["OPENAI_API_KEY"] = <auth_key_for_service>

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    temperature=1,
    max_tokens=20000,
    model="us.anthropic.claude-3-7-sonnet-20250219-v1:0",
    extra_body={"thinking": {"type": "enabled", "budget_tokens": 16000}},
)
messages = [
    ("user", "hello!"),
]
ai_msg = llm.invoke(messages)
print(ai_msg)

Error Message and Stack Trace (if applicable)

Response

content="Hello there! It's nice to meet you. I'm an AI assistant ready to help with information, answer questions, or just chat. How are you today, and is there something specific I can help you with?" additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 93, 'prompt_tokens': 37, 'total_tokens': 130, 'completion_tokens_details': None, 'prompt_tokens_details': {'audio_tokens': None, 'cached_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0}, 'model_name': 'us.anthropic.claude-3-7-sonnet-20250219-v1:0', 'system_fingerprint': None, 'id': '<some_uuid>', 'finish_reason': 'stop', 'logprobs': None} id='<some_uuid>' usage_metadata={'input_tokens': 37, 'output_tokens': 93, 'total_tokens': 130, 'input_token_details': {'cache_read': 0}, 'output_token_details': {}}

The following reasoning_content and thinking_blocks fields, present in the raw API response, are missing from the response obtained via LangChain:

"choices":[{"index":0,"message":{"content":"Hi there! Welcome! I'm an AI assistant and I'm happy to help with any questions, provide information, or just chat. What can I do for you today?","role":"assistant","tool_calls":null,"function_call":null,"reasoning_content":"Hello! This is a simple greeting from the user. I should respond in a friendly, welcoming manner that acknowledges their greeting and shows I'm ready to assist them with whatever they need.","thinking_blocks":[{"type":"thinking","thinking":"Hello! This is a simple greeting from the user. I should respond in a friendly, welcoming manner that acknowledges their greeting and shows I'm ready to assist them with whatever they need.","signature":"ErcBCkgIARABGAIiQBqFPuLSq3mX010ufxwlrJY5kdiRQwQF1VZ7izukmsv0ARO43wE/knHBrcL9YmBB4HPu4a9j6gVlcwINVAoFHccSDFV1vU70AIK3wRo00BoMRfVPi41wJw6xJNLAIjA3jG6QLT38KXp7Wp/lI8zJNUGhQkqEI8/eonaq6UMDuGMsnbsSjVyrv6iaiMljnAoqHfcMBcGK+c8ZGgA7gL6P1pyneTDBjh+BORWw2fRu"}]},"finish_reason":"stop"}]

Note: the response text differs between the two examples because they come from separate LLM calls.
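For reference, the missing fields are straightforward to pull out of the raw JSON payload. A minimal sketch with no API call, using a hand-abbreviated version of the response shown above:

```python
# Extract reasoning_content and thinking_blocks from a raw
# OpenAI-compatible chat completion payload (abbreviated example).
raw_response = {
    "choices": [
        {
            "index": 0,
            "message": {
                "content": "Hi there! Welcome!",
                "role": "assistant",
                "reasoning_content": "Hello! This is a simple greeting from the user.",
                "thinking_blocks": [
                    {
                        "type": "thinking",
                        "thinking": "Hello! This is a simple greeting from the user.",
                        "signature": "ErcB...",
                    }
                ],
            },
            "finish_reason": "stop",
        }
    ]
}

message = raw_response["choices"][0]["message"]
reasoning = message.get("reasoning_content")
thinking_blocks = message.get("thinking_blocks", [])

print(reasoning)
for block in thinking_blocks:
    print(block["type"], block["thinking"])
```

These fields are what ChatOpenAI() currently drops when it builds the AIMessage.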

Description

We use ChatOpenAI() as the interface to a locally hosted API service that is based on LiteLLM and calls OpenAI or Bedrock depending on the model name. With Anthropic Claude Sonnet 3.7's extended thinking, the model includes reasoning_content and thinking_blocks fields in the response. However, LangChain swallows these fields and returns only the vanilla ones.

System Info

$ python -m langchain_core.sys_info



System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 23.6.0: Thu Dec 19 20:44:43 PST 2024; root:xnu-10063.141.1.703.2~1/RELEASE_ARM64_T6020
> Python Version:  3.9.13 (main, Sep 27 2023, 15:09:55) 
[Clang 15.0.0 (clang-1500.0.40.1)]

Package Information
-------------------
> langchain_core: 0.3.48
> langchain: 0.3.21
> langsmith: 0.3.18
> langchain_aws: 0.2.17
> langchain_openai: 0.3.10
> langchain_text_splitters: 0.3.7

Optional packages not installed
-------------------------------
> langserve

Other Dependencies
------------------
> async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
> boto3: 1.37.21
> httpx: 0.28.1
> jsonpatch<2.0,>=1.33: Installed. No version info available.
> langchain-anthropic;: Installed. No version info available.
> langchain-aws;: Installed. No version info available.
> langchain-azure-ai;: Installed. No version info available.
> langchain-cohere;: Installed. No version info available.
> langchain-community;: Installed. No version info available.
> langchain-core<1.0.0,>=0.3.45: Installed. No version info available.
> langchain-core<1.0.0,>=0.3.48: Installed. No version info available.
> langchain-deepseek;: Installed. No version info available.
> langchain-fireworks;: Installed. No version info available.
> langchain-google-genai;: Installed. No version info available.
> langchain-google-vertexai;: Installed. No version info available.
> langchain-groq;: Installed. No version info available.
> langchain-huggingface;: Installed. No version info available.
> langchain-mistralai;: Installed. No version info available.
> langchain-ollama;: Installed. No version info available.
> langchain-openai;: Installed. No version info available.
> langchain-text-splitters<1.0.0,>=0.3.7: Installed. No version info available.
> langchain-together;: Installed. No version info available.
> langchain-xai;: Installed. No version info available.
> langsmith-pyo3: Installed. No version info available.
> langsmith<0.4,>=0.1.125: Installed. No version info available.
> langsmith<0.4,>=0.1.17: Installed. No version info available.
> numpy: 1.26.4
> openai-agents: Installed. No version info available.
> openai<2.0.0,>=1.68.2: Installed. No version info available.
> opentelemetry-api: Installed. No version info available.
> opentelemetry-exporter-otlp-proto-http: Installed. No version info available.
> opentelemetry-sdk: Installed. No version info available.
> orjson: 3.10.16
> packaging: 24.2
> packaging<25,>=23.2: Installed. No version info available.
> pydantic: 2.10.6
> pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
> pydantic<3.0.0,>=2.7.4: Installed. No version info available.
> pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
> pytest: Installed. No version info available.
> PyYAML>=5.3: Installed. No version info available.
> requests: 2.32.3
> requests-toolbelt: 1.0.0
> requests<3,>=2: Installed. No version info available.
> rich: Installed. No version info available.
> SQLAlchemy<3,>=1.4: Installed. No version info available.
> tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
> tiktoken<1,>=0.7: Installed. No version info available.
> typing-extensions>=4.7: Installed. No version info available.
> zstandard: 0.23.0
dosubot bot added the 🤖:bug label (Related to a bug, vulnerability, unexpected error with an existing feature) on Mar 27, 2025
andrasfe (Contributor) commented Apr 2, 2025

You should use ChatAnthropic instead of the OpenAI equivalent. The following works for me:

import os
from dotenv import load_dotenv

load_dotenv()

from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-3-7-sonnet-latest", 
    max_tokens=5000,
    thinking={"type": "enabled", "budget_tokens": 2000}
)
messages = [
    ("user", "hello!"),
]
ai_msg = llm.invoke(messages)
print(ai_msg)
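With extended thinking enabled, ChatAnthropic returns ai_msg.content as a list of content blocks rather than a plain string, so the thinking text has to be separated from the final answer. A minimal post-processing sketch (the content list below is a hand-built stand-in for a real response, not actual model output):

```python
from typing import Any


def split_thinking(content: Any) -> tuple[str, str]:
    """Separate thinking text from the final answer in an
    Anthropic-style content-block list; plain strings pass through."""
    if isinstance(content, str):
        return "", content
    thinking_parts, text_parts = [], []
    for block in content:
        if block.get("type") == "thinking":
            thinking_parts.append(block.get("thinking", ""))
        elif block.get("type") == "text":
            text_parts.append(block.get("text", ""))
    return "\n".join(thinking_parts), "\n".join(text_parts)


# Stand-in for ai_msg.content when thinking is enabled:
content = [
    {"type": "thinking", "thinking": "The user greeted me.", "signature": "Erc..."},
    {"type": "text", "text": "Hello! How can I help you today?"},
]
thinking, answer = split_thinking(content)
print(thinking)  # The user greeted me.
print(answer)    # Hello! How can I help you today?
```

This keeps the thinking text available for logging or debugging while downstream code only sees the answer.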
