
🦜 🔗 LangChain MCP Client


This simple Model Context Protocol (MCP) client demonstrates how MCP server tools can be used by a LangChain ReAct agent.

  • 🌐 Seamlessly connect to any MCP server.
  • 🤖 Use any LangChain-compatible LLM for flexible model selection.
  • 💬 Interact via the CLI for dynamic conversations.

Conversion to LangChain Tools

The client leverages the utility function convert_mcp_to_langchain_tools(). This function initializes the specified MCP servers in parallel and converts their available tools into a list of LangChain-compatible tools (List[BaseTool]).
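For illustration, here is a minimal sketch of how that conversion is typically wired into an agent. It assumes the langchain_mcp_tools API (a server-config dict in, a (tools, cleanup) tuple out) together with LangGraph's prebuilt ReAct agent; the fetch server entry and the model string are illustrative, not part of this repo.

# Minimal sketch: convert MCP servers to LangChain tools, then run a ReAct agent.
# The server entry and model string below are illustrative assumptions.
import asyncio

from langchain.chat_models import init_chat_model
from langchain_mcp_tools import convert_mcp_to_langchain_tools
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    mcp_servers = {
        "fetch": {"command": "uvx", "args": ["mcp-server-fetch"]},
    }
    # Initialize all servers in parallel and collect their tools.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    llm = init_chat_model("anthropic:claude-3-5-haiku-latest")
    agent = create_react_agent(llm, tools)
    result = await agent.ainvoke(
        {"messages": [("user", "Fetch https://example.com and summarize it")]}
    )
    print(result["messages"][-1].content)
    await cleanup()  # Close the MCP server sessions.

asyncio.run(main())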

Installation

Python 3.11 or higher is required.

pip install langchain_mcp_client

Configuration

Create a .env file containing the API keys needed to access your LLM.
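For example, if you use Anthropic or OpenAI models (the key names below are those providers' standard environment variables; include only the ones your configuration needs):

ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key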

Configure the LLM, the MCP servers, and example queries in the llm_mcp_config.json5 file (a sketch follows the list):

  1. LLM Configuration: Set up your LLM parameters.
  2. MCP Servers: Specify the MCP servers to connect to.
  3. Example Queries: Define example queries that invoke MCP server tools. Press Enter to use these example queries when prompted.
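As a rough sketch, the file can look like the following; the key names here are illustrative assumptions, so refer to the llm_mcp_config.json5 shipped with the repo for the exact schema.

{
  "llm": {
    "model_provider": "anthropic",
    "model": "claude-3-5-haiku-latest",
  },
  "example_queries": [
    "Summarize the content of https://example.com",
  ],
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"],
    },
  },
}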

Usage

Below is an example with a Jupyter MCP Server.

Check the llm_mcp_config.json5 configuration (the commands depend on whether you are running on Linux or macOS/Windows).

# Start jupyterlab.
make jupyterlab
# Launch the CLI.
make cli

Here is an example prompt:

create matplotlib examples with many variants in jupyter

Credits

The initial code of this repo is taken from hideya/mcp-client-langchain-py (MIT License) and from langchain_mcp_tools (MIT License).