This repository contains the documentation website for Inference Gateway, an open-source API gateway for Large Language Models (LLMs).

Inference Gateway provides a unified API layer for multiple LLM providers, including OpenAI, DeepSeek, Anthropic, Groq, Cohere, Ollama, and more. It exposes a consistent interface for working with different LLMs, abstracting away the differences between each provider's API.
This documentation site is built with Next.js.
```bash
# Install dependencies
npm install

# Start development server
npm run dev
```
You can use the included devcontainer for a consistent development environment; it comes preconfigured with all the tools and extensions needed for development.
Contributions to improve the documentation are welcome! You can:
- Edit existing MDX files in the markdown directory
- Add new documentation pages
- Improve the site's design and functionality
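Adding a page can be as simple as dropping a new MDX file into the markdown directory. The sketch below is illustrative only — the exact frontmatter fields (if any) depend on this site's configuration, so `title` and `description` here are assumptions, not documented requirements:

```mdx
---
# Hypothetical frontmatter — check existing pages in the markdown
# directory for the fields this site actually uses
title: My New Page
description: A short summary of what this page covers
---

# My New Page

Regular Markdown works here, and MDX additionally allows embedding
JSX components directly in the content.
```

Mirroring the structure of an existing page in the markdown directory is the safest way to ensure the new page renders correctly.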
This project is licensed under the MIT License.