MVP of an idea using multiple local LLM models to simulate and play D&D
Web app for teams of 20+ members. Built-in connections to major LLMs via API. Share chats, prompts, and agents in team or private folders. Modern, fully responsive stack (Next.js, Node.js, Python). Deploy your own vibe-coded AI apps, agents, or workflows, or use ready-made solutions from the library.
ModelArena: A Competitive Environment for Multi-Agent Training
Agentic framework for dynamic function calling across the latest LLMs (gpt-4o, gemini-2.0-flash, Groq models, and Anthropic models). Converts Python functions into provider-specific schemas for autonomous tool use. Features a unified API, JSON schema generation, and integrated tool execution handling.
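As a rough illustration of this kind of function-to-schema conversion (a hypothetical sketch, not this framework's actual API; the function names below are made up):

```python
import inspect
from typing import get_type_hints

# Hypothetical sketch: map a Python function to an OpenAI-style tool schema.
# Real frameworks, including the one described above, may differ in detail.
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def function_to_schema(func):
    """Build a provider-style JSON schema from a function's signature and docstring."""
    hints = get_type_hints(func)
    properties, required = {}, []
    for name, param in inspect.signature(func).parameters.items():
        properties[name] = {"type": _JSON_TYPES.get(hints.get(name, str), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "type": "function",
        "function": {
            "name": func.__name__,
            "description": (func.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

def get_weather(city: str, units: str = "metric") -> str:
    """Return the current weather for a city."""
    return f"Weather for {city} in {units}"

print(function_to_schema(get_weather))
```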
Plug-and-play Human-in-the-Loop integration for agentic workflows
Continuity for conversations that matter. ChatSeed is your AI Chat Management tool!
LLMSymphony is a Windows application for incognito chats with leading AI models using your own API keys
🐙 Meta-AI Orchestrator unifies multiple LLMs with dynamic routing, RAG search, and DAG pipelines for enterprise AI workloads across providers, with observability and QA.
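A minimal sketch of what dynamic routing across providers can look like (hypothetical rules and model names, not this project's actual logic):

```python
# Hypothetical routing sketch: pick a provider/model per request from simple
# heuristics. Real orchestrators use richer signals (cost, load, eval scores).
ROUTES = [
    (lambda req: req["tokens"] > 8000, ("anthropic", "claude-3-5-sonnet")),
    (lambda req: req["task"] == "code", ("openai", "gpt-4o")),
    (lambda req: True, ("google", "gemini-2.0-flash")),  # default route
]

def route(request: dict) -> tuple[str, str]:
    """Return (provider, model) for the first matching rule."""
    for predicate, target in ROUTES:
        if predicate(request):
            return target
    raise RuntimeError("no route matched")

print(route({"task": "code", "tokens": 1200}))        # ('openai', 'gpt-4o')
print(route({"task": "summarize", "tokens": 20000}))  # ('anthropic', 'claude-3-5-sonnet')
```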
Repo for the COLING 2025 paper "What's Wrong? Refining Meeting Summaries with LLM Feedback"
A unified interface to query multiple LLM providers (OpenAI, Anthropic, Mistral, Gemini) with built-in support for prompt benchmarking, cost estimation, and latency tracking.
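Sketching the general shape of such a unified interface with latency and cost tracking (illustrative only; the provider stub and pricing numbers are placeholders, not this library's API):

```python
import time

# Illustrative sketch of a unified query wrapper that records latency and an
# estimated cost. Prices and `fake_provider_call` are placeholders, not real SDKs.
PRICE_PER_1K_TOKENS = {"openai": 0.005, "anthropic": 0.003, "mistral": 0.002, "gemini": 0.001}

def fake_provider_call(provider: str, prompt: str) -> str:
    """Stand-in for a real provider SDK call."""
    return f"[{provider}] response to: {prompt[:30]}"

def query(provider: str, prompt: str) -> dict:
    """Send a prompt to one provider and record latency and estimated cost."""
    start = time.perf_counter()
    text = fake_provider_call(provider, prompt)
    latency = time.perf_counter() - start
    est_tokens = len(prompt.split()) + len(text.split())
    return {
        "provider": provider,
        "text": text,
        "latency_s": round(latency, 4),
        "est_cost_usd": round(est_tokens / 1000 * PRICE_PER_1K_TOKENS[provider], 6),
    }

# Benchmark the same prompt across providers.
for name in PRICE_PER_1K_TOKENS:
    print(query(name, "Summarize the plot of Hamlet in two sentences."))
```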
Taza Chat: Enhanced ChatGPT Clone
PersonalAgent is a priority-based calendar manager with AI automation and conflict handling across Family, Personal, and Work calendars 🐙. Supports GPT-4.
what-to-cook
Priority-based calendar management with AI-powered task automation. Family events take priority, work adapts to life.
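As a hypothetical sketch of what priority-based conflict handling could look like (the priority order and event fields are assumptions, not taken from either project above):

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch: when two events overlap, the higher-priority calendar wins.
PRIORITY = {"Family": 0, "Personal": 1, "Work": 2}  # lower number = higher priority

@dataclass
class Event:
    title: str
    calendar: str
    start: datetime
    end: datetime

def overlaps(a: Event, b: Event) -> bool:
    return a.start < b.end and b.start < a.end

def resolve(events: list[Event]) -> list[Event]:
    """Keep higher-priority events; drop lower-priority ones that collide."""
    kept: list[Event] = []
    for ev in sorted(events, key=lambda e: PRIORITY[e.calendar]):
        if not any(overlaps(ev, k) for k in kept):
            kept.append(ev)
    return kept

events = [
    Event("Standup", "Work", datetime(2025, 4, 23, 9), datetime(2025, 4, 23, 10)),
    Event("School run", "Family", datetime(2025, 4, 23, 9, 30), datetime(2025, 4, 23, 10)),
]
for ev in resolve(events):
    print(ev.title, ev.calendar)  # only "School run" survives the overlap
```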
multi-llm-agent
🤖 Enterprise-grade AI orchestration platform with multi-LLM support, RAG capabilities, and 99.99% accuracy guarantee