Build, test, and deploy AI agent workflows with a visual no-code interface
Open Agent Builder is a visual workflow builder for creating AI agent pipelines powered by Firecrawl. Design complex agent workflows with a drag-and-drop interface, then execute them with real-time streaming updates.
Perfect for:
- Web scraping and data extraction workflows
- Multi-step AI agent pipelines
- Automated research and content generation
- Data transformation and analysis
- Web automation with human-in-the-loop approvals
Note: This project is actively under development. Some features are still in progress and we welcome contributions and PRs!
- Drag-and-drop interface for building agent workflows
- Real-time execution with streaming updates
- 8 core node types: Start, Agent, MCP Tools, Transform, If/Else, While Loop, User Approval, End
- Template library with pre-built workflows
- MCP protocol support for extensible tool integration
- Native Firecrawl integration for web scraping and searching
- LangGraph execution engine for reliable state management
- Clerk authentication for secure multi-user access
- Convex database for persistent storage
- API endpoints for programmatic execution
- Human-in-the-loop approvals for sensitive operations
| Technology | Purpose |
|---|---|
| Firecrawl | Web scraping API for converting websites into LLM-ready data |
| Next.js 16 (canary) | React framework with App Router for frontend and API routes |
| TypeScript | Type-safe development across the stack |
| LangGraph | Workflow orchestration engine with state management, conditional routing, and human-in-the-loop support |
| Convex | Real-time database with automatic reactivity for workflows, executions, and user data |
| Clerk | Authentication and user management with JWT integration |
| Tailwind CSS | Utility-first CSS framework for responsive UI |
| React Flow | Visual workflow builder canvas with drag-and-drop nodes |
| Anthropic | Claude AI integration with native MCP support (Claude Haiku 4.5 & Sonnet 4.5) |
| OpenAI | gpt-5 integration (MCP support coming soon) |
| Groq | Fast inference for open models (MCP support coming soon) |
| E2B | Sandboxed code execution for secure transform nodes |
| Vercel | Deployment platform with edge functions |
Before you begin, you'll need:
- Node.js 18+ installed on your machine
- Firecrawl API key (Required for web scraping) - Get one at firecrawl.dev
- Convex account - Sign up free at convex.dev
- Clerk account - Sign up free at clerk.com
Note: LLM API keys can be added directly in the UI via Settings → API Keys after setup. For MCP tool support, Anthropic Claude (Haiku 4.5 or Sonnet 4.5) is currently recommended as the default option.
```bash
git clone https://github.com/firecrawl/open-agent-builder.git
cd open-agent-builder
npm install
```
Convex handles all workflow and execution data persistence.
```bash
# Install Convex CLI globally
npm install -g convex

# Initialize Convex project
npx convex dev
```
This will:
- Open your browser to create/link a Convex project
- Generate a `NEXT_PUBLIC_CONVEX_URL` in your `.env.local` (example below)
- Start the Convex development server
Keep the Convex dev server running in a separate terminal.
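After initialization, the generated entry in `.env.local` looks roughly like this (the deployment URL is a placeholder; yours will differ):

```bash
# Added automatically by `npx convex dev`
NEXT_PUBLIC_CONVEX_URL=https://your-deployment-name.convex.cloud
```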
Clerk provides secure user authentication and management.
- Go to clerk.com and create a new application
- In your Clerk dashboard:
  - Go to API Keys
  - Copy your keys
  - Go to JWT Templates → Convex:
    - Click "Apply"
    - Copy the issuer URL
Add to your `.env.local`:
```bash
# Clerk Authentication
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=pk_test_...
CLERK_SECRET_KEY=sk_test_...

# Clerk + Convex Integration
CLERK_JWT_ISSUER_DOMAIN=https://your-clerk-domain.clerk.accounts.dev
```
Edit `convex/auth.config.ts` and update the domain:
```typescript
export default {
  providers: [
    {
      domain: "https://your-clerk-domain.clerk.accounts.dev", // Your Clerk issuer URL
      applicationID: "convex",
    },
  ],
};
```
Then push the auth config to Convex:
```bash
npx convex dev
```
Firecrawl is the core web scraping engine that powers most workflows.
- Get your API key at firecrawl.dev
- Add to `.env.local`:
```bash
# Firecrawl API (REQUIRED)
FIRECRAWL_API_KEY=fc-...
```
Note: Users can also add their own Firecrawl keys in Settings → API Keys, but having a default key in `.env.local` enables the template workflows.
While users can add their own LLM API keys through the UI (Settings → API Keys), you can optionally set a default provider in `.env.local`:
```bash
# Optional: Choose one as default

# Anthropic Claude (Recommended - Native MCP support with Haiku 4.5 & Sonnet 4.5)
ANTHROPIC_API_KEY=sk-ant-...

# OpenAI GPT-5 (MCP support coming soon)
OPENAI_API_KEY=sk-...

# Groq (MCP support coming soon)
GROQ_API_KEY=gsk_...
```
Important: For workflows using MCP tools (like Firecrawl integration), Anthropic Claude is currently the recommended provider as it has native MCP support. OpenAI and Groq MCP support is coming soon.
For advanced transform nodes with sandboxed code execution:
```bash
# E2B Code Interpreter (Optional)
E2B_API_KEY=e2b_...
```
Get your key at e2b.dev
```bash
# Terminal 1: Convex dev server
npx convex dev

# Terminal 2: Next.js dev server
npm run dev
```
Or run both with one command:
```bash
npm run dev:all
```
Visit http://localhost:3000
```bash
npm run build
npm start
```
- Sign up/log in at http://localhost:3000
- Add your LLM API key in Settings → API Keys
  - For MCP tool support: use Anthropic Claude (Haiku 4.5 or Sonnet 4.5)
  - For basic workflows: OpenAI or Groq also work
- Click "New Workflow" or select a template
- Try the "Simple Web Scraper" template:
- Pre-configured to scrape any website
- Uses Firecrawl for extraction
- AI agent summarizes the content
- Click "Run" and enter a URL
- Watch real-time execution with streaming updates
| Node Type | Purpose | Example Use |
|---|---|---|
| Start | Workflow entry point | Define input variables |
| Agent | AI reasoning with LLMs | Analyze data, make decisions |
| MCP Tool | External tool calls | Firecrawl scraping, APIs |
| Transform | Data manipulation | Parse JSON, filter arrays |
| If/Else | Conditional logic | Route based on conditions |
| While Loop | Iteration | Process multiple pages |
| User Approval | Human-in-the-loop | Review before posting |
| End | Workflow completion | Return final output |
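As a rough mental model only (not the repository's actual schema), the eight node types above can be thought of as a small discriminated union, with a workflow being an ordered set of such nodes:

```typescript
// Hypothetical sketch: how the eight node types could be modeled.
// The actual workflow schema in this repo may differ.
type NodeType =
  | "start"
  | "agent"
  | "mcp-tool"
  | "transform"
  | "if-else"
  | "while-loop"
  | "user-approval"
  | "end";

interface WorkflowNode {
  id: string;
  type: NodeType;
  // Node-specific settings, e.g. a prompt for "agent" or a tool name for "mcp-tool"
  config?: Record<string, unknown>;
}

// Example: a "scrape then summarize" flow expressed with these types
const simpleScraper: WorkflowNode[] = [
  { id: "start", type: "start", config: { inputs: ["url"] } },
  { id: "scrape", type: "mcp-tool", config: { tool: "firecrawl_scrape" } },
  { id: "summarize", type: "agent", config: { prompt: "Summarize the scraped content" } },
  { id: "end", type: "end" },
];
```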
- Anthropic Claude - Full native MCP support
  - Claude Sonnet 4.5 (Recommended)
  - Claude Haiku 4.5

Anthropic currently provides native MCP support; support for the other providers is in progress:

- OpenAI - MCP support in development
- Gemini - MCP support in development
- OpenRouter - coming soon
MCP tools enable agents to interact with external services like Firecrawl:
- Add an Agent node to your workflow
- In the node settings, select MCP Tools
- Choose Firecrawl or add a custom MCP server
- The agent can now call Firecrawl tools like `scrape`, `search`, and `crawl`
Example workflow with MCP:
Start → Agent (with Firecrawl MCP) → End
The agent can intelligently decide when to scrape pages, search the web, or crawl sites based on your instructions.
What it does: Scrape any website and get an AI summary
Nodes: Start → Firecrawl Scrape → Agent Summary → End
Try it:
Input: https://firecrawl.dev
Output: "Firecrawl is a web scraping API that converts websites into LLM-ready markdown..."
What it does: Search web, scrape top results, synthesize findings
Nodes: Start → Firecrawl Search → Loop (Scrape Each) → Agent Synthesis → End
What it does: Research companies, extract structured data, generate report
Nodes: Start → Parse Companies → Loop (Research + Extract) → Approval → Export → End
Features used:
- Firecrawl web search
- Structured JSON extraction
- While loops for iteration
- Human approval gates
- Conditional routing
What it does: Track product prices across multiple sites
Nodes: Start → Loop (Scrape + Extract Price) → Compare → Notify → End
Users can add their own API keys via Settings → API Keys:
- LLM Providers (Required - add at least one): Anthropic (Recommended for MCP), OpenAI, Groq
- Firecrawl: Personal API key (Optional - falls back to the environment variable)
- Custom MCP Servers: Authentication tokens
This allows:
- Each user to use their own API quotas
- Fallback to environment variables if not set
- Easy key rotation and management
Add custom MCP servers in Settings → MCP Registry:
- Click "Add MCP Server"
- Enter server URL and authentication
- Test connection to discover available tools
- Use in Agent nodes by selecting from MCP tools dropdown
Supported MCP Servers:
- Firecrawl (built-in)
- Custom HTTP endpoints
- Environment variable substitution: `{API_KEY}` (see the sketch below)
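For illustration only, a custom registry entry conceptually carries a server URL plus authentication, with `{API_KEY}` substituted from your stored key at request time. The field names below are hypothetical, not the exact form fields in the UI:

```typescript
// Hypothetical shape of a custom MCP Registry entry; the actual
// Settings → MCP Registry form may use different field names.
const customMcpServer = {
  name: "internal-tools",
  url: "https://mcp.example.com/sse",              // custom HTTP endpoint
  headers: { Authorization: "Bearer {API_KEY}" },  // {API_KEY} is replaced with your stored key
};
```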
- Push your code to GitHub
- Deploy to Vercel:
  ```bash
  # Install Vercel CLI
  npm i -g vercel

  # Deploy
  vercel
  ```
- Set environment variables in the Vercel dashboard:
  - `NEXT_PUBLIC_CONVEX_URL` (from Convex)
  - Clerk keys
  - `FIRECRAWL_API_KEY` (Required)
  - Optional: default LLM provider keys
- Deploy Convex to production:
  ```bash
  npx convex deploy
  ```
- Update Clerk settings:
  - Add your Vercel domain to allowed origins
  - Update redirect URLs
Required:
- `NEXT_PUBLIC_CONVEX_URL` - Convex database
- `NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY` - Clerk auth
- `CLERK_SECRET_KEY` - Clerk auth
- `CLERK_JWT_ISSUER_DOMAIN` - Clerk + Convex integration
- `FIRECRAWL_API_KEY` - Web scraping

Optional (can be added in UI instead):
- `ANTHROPIC_API_KEY` - Default Claude provider (Recommended for MCP)
- `OPENAI_API_KEY` - Default gpt-5 provider (MCP coming soon)
- `GROQ_API_KEY` - Default Groq provider (MCP coming soon)
- `E2B_API_KEY` - Sandboxed code execution
Generate an API key in Settings → API Keys, then:
```bash
curl -X POST https://your-domain.com/api/workflows/my-workflow-id/execute-stream \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com"}'
```
Response: Server-Sent Events (SSE) stream with real-time updates
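A minimal sketch of consuming that stream from Node 18+/TypeScript with `fetch` (the endpoint and auth header match the curl example above; the shape of each event payload is illustrative, so inspect the actual stream for the exact format):

```typescript
// Sketch: read the SSE stream from the execute-stream endpoint.
async function runWorkflow(workflowId: string, apiKey: string, input: Record<string, unknown>) {
  const res = await fetch(`https://your-domain.com/api/workflows/${workflowId}/execute-stream`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(input),
  });
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE events are separated by a blank line; "data:" lines carry the payload.
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? "";
    for (const event of events) {
      for (const line of event.split("\n")) {
        if (line.startsWith("data:")) console.log("update:", line.slice(5).trim());
      }
    }
  }
}

// Usage (placeholder IDs/keys):
// runWorkflow("my-workflow-id", "sk_live_...", { url: "https://example.com" });
```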
```mermaid
%%{init: {'flowchart': {'htmlLabels': false}} }%%
flowchart TD
    subgraph Frontend
        A["Next.js Frontend\n(React + Tailwind)"]
        A_desc["Visual workflow builder, real-time execution display, user settings and API key management"]
        A --> A_desc
    end
    subgraph Backend
        B["API Routes (Next.js)"]
        B_desc["Workflow execution, authentication middleware, streaming SSE responses"]
        C["LangGraph Executor (Server-side only)"]
        C_desc["StateGraph orchestration, conditional routing, loop handling, human-in-the-loop interrupts"]
        B --> B_desc
        C --> C_desc
        A_desc --> B
        B_desc --> C
    end
    subgraph Integrations
        D1["Firecrawl API"]
        D2["LLMs (Claude, gpt-5, Groq)"]
        D3["MCP Servers"]
        C_desc --> D1
        C_desc --> D2
        C_desc --> D3
    end
    D_common["Convex Database"]
    D_common_desc["Workflows, executions, user settings, MCP configurations"]
    D1 --> D_common
    D2 --> D_common
    D3 --> D_common
    D_common --> D_common_desc
```
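To make the executor's role concrete, here is a minimal, illustrative LangGraph sketch (not the project's actual executor code) showing the kind of conditional routing and human-in-the-loop interrupt described above:

```typescript
// Illustrative only: a tiny StateGraph with an agent step, a human-approval
// interrupt, and conditional routing — the pattern the executor builds on.
import { StateGraph, Annotation, MemorySaver, interrupt, START, END } from "@langchain/langgraph";

const WorkflowState = Annotation.Root({
  input: Annotation<string>(),
  output: Annotation<string>(),
  approved: Annotation<boolean>(),
});

const graph = new StateGraph(WorkflowState)
  .addNode("agent", async (state) => ({ output: `draft based on: ${state.input}` }))
  .addNode("approval", async (state) => {
    // Pauses the run until a human resumes it with an approval decision
    const decision = interrupt({ question: "Approve this output?", output: state.output });
    return { approved: Boolean(decision) };
  })
  .addEdge(START, "agent")
  .addEdge("agent", "approval")
  // Conditional routing: loop back to the agent if the human rejects the draft
  .addConditionalEdges("approval", (state) => (state.approved ? END : "agent"));

// A checkpointer is required so the graph can pause and resume at the interrupt
export const app = graph.compile({ checkpointer: new MemorySaver() });
```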
- MCP Support for OpenAI & Groq - Adding native MCP protocol support
- OAuth MCP Authentication - Support for OAuth-based MCP servers
- Additional MCP Integrations - More pre-built MCP server connections
- Workflow Sharing - Public template marketplace
- Scheduled Executions - Cron-based workflow triggers
- E2B Code Interpreter - Transform node sandboxing (requires E2B API key)
- Complex Loop Patterns - Nested loops and advanced iteration
- Custom Node Types - Plugin system for community nodes
- Full MCP support for all LLM providers
- OAuth authentication for MCP servers
We welcome contributions and PRs to help build these features!
This project is licensed under the MIT License
Star us on GitHub | Try Firecrawl
Made with love by the Firecrawl team