This simple Model Context Protocol (MCP) client demonstrates the use of MCP server tools by a LangChain ReAct agent.
- Seamlessly connect to any MCP server.
- Use any LangChain-compatible LLM for flexible model selection.
- Interact via a CLI, enabling dynamic conversations.
It leverages the utility function `convert_mcp_to_langchain_tools()`. This function handles parallel initialization of multiple specified MCP servers and converts their available tools into a list of LangChain-compatible tools (`List[BaseTool]`).
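As a rough sketch of how this conversion is typically wired into a ReAct agent (the server entry, model id, and message shape below are illustrative assumptions, not this project's exact code; the `convert_mcp_to_langchain_tools()` call follows the `langchain_mcp_tools` documented API):

```python
import asyncio

# Hypothetical server configuration; any MCP servers can be listed here.
mcp_servers = {
    "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
}

async def main() -> None:
    # Imports are deferred so the sketch reads top to bottom;
    # both packages must be installed for this to actually run.
    from langchain.chat_models import init_chat_model
    from langchain_mcp_tools import convert_mcp_to_langchain_tools
    from langgraph.prebuilt import create_react_agent

    # Initializes all listed servers in parallel and returns their
    # tools as LangChain BaseTool instances, plus a cleanup callback.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        llm = init_chat_model("openai:gpt-4o-mini")  # any LangChain-compatible LLM
        agent = create_react_agent(llm, tools)
        result = await agent.ainvoke(
            {"messages": [("user", "List the files in the current directory")]}
        )
        print(result["messages"][-1].content)
    finally:
        await cleanup()  # terminate the MCP server subprocesses

# To run: asyncio.run(main())
```

The `cleanup()` callback matters: each MCP server runs as a subprocess, and closing them explicitly avoids orphaned processes when the agent session ends.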
Python 3.11 or higher is required.
```sh
pip install langchain_mcp_client
```
Create a `.env` file containing the API keys needed to access your LLM.
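The exact variable names depend on which LLM provider you configure; the entries below are common examples, not a required set:

```
# Only the key for the provider you actually use is needed.
OPENAAI_API_KEY=...
ANTHROPIC_API_KEY=...
```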
Configure the LLM, MCP servers, and example prompts in the `llm_mcp_config.json5` file:
- LLM Configuration: Set up your LLM parameters.
- MCP Servers: Specify the MCP servers to connect to.
- Example Queries: Define example queries that invoke MCP server tools. Press Enter to use these example queries when prompted.
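A configuration covering these three sections might look like the following sketch; the field names (`llm`, `mcp_servers`, `example_queries`) and values are illustrative assumptions, so check the shipped `llm_mcp_config.json5` for the actual schema:

```json5
{
  // LLM configuration: provider and model parameters.
  llm: {
    model_provider: "openai",
    model: "gpt-4o-mini",
    temperature: 0.0,
  },
  // MCP servers to connect to, keyed by a name of your choice.
  mcp_servers: {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
  },
  // Example queries offered when you press Enter at the prompt.
  example_queries: [
    "Read and summarize the file README.md",
  ],
}
```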
Below is an example with a Jupyter MCP server:
Check the `llm_mcp_config.json5` configuration (the commands differ depending on whether you are running Linux or macOS/Windows).
```sh
# Start JupyterLab.
make jupyterlab

# Launch the CLI.
make cli
```
Here is an example prompt:

```
create matplotlib examples with many variants in jupyter
```
The initial code of this repo was taken from hideya/mcp-client-langchain-py (MIT License) and from langchain_mcp_tools (MIT License).