This project provides a Streamlit-based web interface for interacting with public Cloudflare MCP (Model Context Protocol) servers. It lets users query Cloudflare's documentation, get internet traffic insights from Cloudflare Radar, and perform browser actions such as fetching web pages or taking screenshots, all powered by CAMEL AI.
## Features

- Interactive UI: A user-friendly interface built with Streamlit.
- Tabbed Navigation: Separate tabs for different Cloudflare services:
  - 📚 Documentation Server: Ask questions and get information from Cloudflare's official documentation.
  - 📊 Radar Server: Access insights on global internet traffic trends, URL analysis, DNS analytics, and HTTP protocol analysis.
  - 🌐 Browser Server: Fetch web page content, take screenshots of URLs, or convert web pages to Markdown.
- Powered by CAMEL AI: Leverages the CAMEL AI library to interact with the MCP servers and process language queries.
- Easy Configuration: Uses an `mcp_config.json` file to define accessible MCP servers.
## Prerequisites

- Python 3.8+
- Node.js and npm (for `npx mcp-remote`, if your `mcp_config.json` uses it)
- A Gemini API key (or another compatible model API key supported by CAMEL AI)
## Installation

1. Clone the repository (if applicable):

   ```bash
   git clone <repository-url>
   cd <repository-name>
   ```
2. Create and activate a virtual environment (recommended):

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows use `venv\Scripts\activate`
   ```
3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

   If `mcp-remote` is used by your `mcp_config.json` (as in the default configuration for this project), ensure it can be executed. You might need to install it globally or make sure `npx` can run it:

   ```bash
   npm install -g mcp-remote
   ```
4. Set up your API key: Create a `.env` file in the root directory of the project and add your Gemini API key:

   ```
   GEMINI_API_KEY="YOUR_GEMINI_API_KEY_HERE"
   ```
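   As a minimal sketch of how the key could be read at startup (the function name here is illustrative, and `python-dotenv` is an assumed optional dependency; `app.py` may load it differently):

   ```python
   import os
   from typing import Optional

   def load_gemini_key() -> Optional[str]:
       """Return the Gemini API key from the environment, or from a .env
       file if python-dotenv happens to be installed. Illustrative sketch."""
       try:
           from dotenv import load_dotenv  # optional: pip install python-dotenv
           load_dotenv()  # copies GEMINI_API_KEY from .env into os.environ
       except ImportError:
           pass  # fall back to variables already exported in the shell

       return os.getenv("GEMINI_API_KEY")

   key = load_gemini_key()
   print("key found" if key else "GEMINI_API_KEY is not set")
   ```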
5. Configure MCP servers: The `mcp_config.json` file lists the MCP servers the application will connect to. The default configuration includes the public Cloudflare servers:

   ```json
   {
     "servers": [
       { "name": "cloudflare_docs", "url": "https://docs.mcp.cloudflare.com/sse" },
       { "name": "cloudflare_radar", "url": "https://radar.mcp.cloudflare.com/sse" },
       { "name": "cloudflare_browser", "url": "https://browser.mcp.cloudflare.com/sse" }
     ]
   }
   ```

   If your `mcp_config.json` instead uses an `mcpServers` block with `command: "npx"` and `args: ["mcp-remote", "..."]`, ensure `npx` and `mcp-remote` are correctly installed and accessible in your system's `PATH`.
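   The `servers` shape above can be parsed with a few lines of standard-library Python, which is handy for sanity-checking a config before launching the app (a sketch; `app.py` may parse the file differently):

   ```python
   import json

   # Parse the default configuration shown above and index servers by name.
   config_text = """
   {
     "servers": [
       { "name": "cloudflare_docs", "url": "https://docs.mcp.cloudflare.com/sse" },
       { "name": "cloudflare_radar", "url": "https://radar.mcp.cloudflare.com/sse" },
       { "name": "cloudflare_browser", "url": "https://browser.mcp.cloudflare.com/sse" }
     ]
   }
   """

   config = json.loads(config_text)
   servers = {s["name"]: s["url"] for s in config["servers"]}
   print(sorted(servers))
   # → ['cloudflare_browser', 'cloudflare_docs', 'cloudflare_radar']
   ```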
## Running the App

Once the setup is complete, run the Streamlit application with:

```bash
streamlit run app.py
```

The application will open in your default web browser, usually at http://localhost:8501.
## Project Structure

- `app.py`: The main Streamlit application script.
- `mcp_config.json`: Configuration file for MCP servers.
- `requirements.txt`: Python dependencies.
- `.env`: (You create this) Stores your API key.
- `README.md`: This file.
## Troubleshooting

- `openai.InternalServerError: Error code: 500`: This usually indicates a temporary issue with the backend LLM service (e.g., Gemini). Try again after a few moments; if it persists, check the status of the LLM provider.
- `AttributeError: 'ChatAgentResponse' object has no attribute 'content'`: Ensure your `app.py` correctly parses the response from `agent.step()`. The message text is typically at `response.msgs[0].content`.
- `mcp-remote` not found: If your `mcp_config.json` uses `mcp-remote`, ensure it is installed (`npm install -g mcp-remote`) and that `npx` is in your system's `PATH`.
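The `AttributeError` item above can be sketched in isolation. The stand-in classes below only mimic the `.msgs[0].content` shape of CAMEL's `ChatAgentResponse` for illustration; the real class and its attributes live in the CAMEL AI library:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Msg:
    content: str

@dataclass
class FakeResponse:
    """Stand-in for CAMEL's ChatAgentResponse: text lives in .msgs, not .content."""
    msgs: List[Msg] = field(default_factory=list)

def extract_text(response) -> str:
    """Return the first message's text, or '' if the response has no messages."""
    if getattr(response, "msgs", None):
        return response.msgs[0].content  # the fix: index into msgs first
    return ""

print(extract_text(FakeResponse(msgs=[Msg("Hello from the agent")])))
# → Hello from the agent
```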
## Contributing

Feel free to open issues or submit pull requests if you have suggestions for improvements or bug fixes.