A lightweight yet powerful web interface for Ollama and llama.cpp with markdown rendering, syntax highlighting, and intelligent conversation management.
Because why not? Having choices puts you in control.
- Ollama Support - Full integration with Ollama AI platform
- llama.cpp Support - Direct GGUF model support with high-performance inference
- Multiple Conversations - Create, manage, and rename chat sessions
- Persistent History - SQLite database storage with search functionality
- Multi-Backend Support - Automatic fallback between backends
- Model Selection - Choose from downloaded models across backends
- Lightweight - Minimal resource usage for local development
- Full Markdown rendering - GitHub-flavored syntax support
- Response metrics - Time, tokens, and speed tracking
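Persistent history boils down to writing every message into SQLite and querying it back. A minimal sketch of how such storage-plus-search could work — the schema, table, and function names here are illustrative assumptions, not Chat-O-Llama's actual implementation:

```python
import sqlite3

def init_db(path=":memory:"):
    # In-memory DB for illustration; the app would persist to a file on disk.
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               conversation TEXT NOT NULL,
               role TEXT NOT NULL,      -- 'user' or 'assistant'
               content TEXT NOT NULL
           )"""
    )
    return conn

def save_message(conn, conversation, role, content):
    conn.execute(
        "INSERT INTO messages (conversation, role, content) VALUES (?, ?, ?)",
        (conversation, role, content),
    )
    conn.commit()

def search_history(conn, term):
    # Case-insensitive substring search across all conversations.
    rows = conn.execute(
        "SELECT conversation, role, content FROM messages "
        "WHERE content LIKE ? ORDER BY id",
        (f"%{term}%",),
    )
    return rows.fetchall()

conn = init_db()
save_message(conn, "chat-1", "user", "How do I pull a model with ollama?")
save_message(conn, "chat-1", "assistant", "Run `ollama pull tinyllama`.")
print(search_history(conn, "tinyllama"))
```

A real implementation would likely add timestamps and an index (or SQLite's FTS5 extension) for faster search, but the flow — insert on every turn, parameterized query on search — is the same.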
Automatic installation (recommended):
```bash
curl -fsSL https://github.com/ukkit/chat-o-llama/raw/main/install.sh | bash
```
This will:
- Install Python and uv if missing
- Install Ollama if not present
- Download and set up Chat-O-Llama
- Start the service at http://localhost:3113
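Once the installer finishes, you can confirm the service is actually listening before opening the browser. A small polling sketch (port 3113 is the default from the step above):

```python
import socket
import time

def wait_for_port(host, port, timeout=30.0):
    """Poll until a TCP connection to host:port succeeds, or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False

if wait_for_port("localhost", 3113, timeout=2.0):
    print("Chat-O-Llama is up at http://localhost:3113")
else:
    print("Service not reachable yet; check the chat-manager.sh logs")
```

The same check works for any port you pass to `chat-manager.sh start`.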
Manual installation:
```bash
git clone https://github.com/ukkit/chat-o-llama.git
cd chat-o-llama

# Using uv (recommended - faster and more reliable)
uv venv venv
source venv/bin/activate
uv sync

./chat-manager.sh start
```
Installing llama.cpp
```bash
curl -fsSL https://github.com/ukkit/chat-o-llama/raw/main/install-llamacpp.sh | bash
```
For detailed manual installation steps, see install.md
App Screenshots
First screen after installation
Available backends - Ollama and Llama.cpp
Quick switch between Ollama and Llama.cpp
Chat in llama.cpp with visible L indicator
- Collapsible Sidebar with Llama Icon
- Enhanced Chat Selection Identification
- Disable Chatbox for Unavailable Models
- Model Dropdown Validation Bug Fix
Common issues:
- Port in use? Start on a different port: `./chat-manager.sh start 3030`
- No models? Install one: `ollama pull tinyllama`
- Backend issues? Check status: `./chat-manager.sh backend status`
⚠️ Documentation updates in progress
| Document | Description |
|---|---|
| Installation Guide | Detailed installation instructions |
| Features | Complete features overview |
| Process Management | Using `chat-manager.sh` for service control |
| Configuration | Configuration options and settings |
| API Reference | REST API documentation |
| Troubleshooting | Common issues and solutions |
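The API Reference documents the HTTP endpoints. As an illustration only, a chat request might be assembled like this — the endpoint path (`/api/chat`) and payload fields are assumptions, so check the API Reference for the actual schema:

```python
import json
import urllib.request

BASE_URL = "http://localhost:3113"  # default port from the installer

def build_chat_request(prompt, model="tinyllama", conversation_id=1):
    """Build (but do not send) a POST request for a hypothetical /api/chat endpoint."""
    payload = json.dumps({
        "model": model,
        "conversation_id": conversation_id,
        "message": prompt,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Summarize this repo in one sentence.")
print(req.get_full_url(), req.get_method())
# Sending it (urllib.request.urlopen(req)) requires the service to be running.
```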
| Device | CPU | RAM | OS |
|---|---|---|---|
| Raspberry Pi 4 Model B Rev 1.4 | ARM Cortex-A72 | 8 GB | Raspberry Pi OS |
| Dell Optiplex 3070 | i3-9100T | 8 GB | Debian 12 |
| Nokia Purebook X14 | i5-10210U | 16 GB | Windows 11 Home |
There are quite a few known issues that we are working on.
If you find this project helpful, consider:
- Starring the repository on GitHub
- Supporting development
- Reporting bugs and suggesting features
MIT License - see LICENSE file for details.