πŸ¦™ chat-o-llama: A lightweight, modern web interface for AI conversations with support for both Ollama and llama.cpp backends. Features persistent conversation management, real-time backend switching, intelligent context compression, and a clean responsive UI.

Chat-O-Llama πŸ¦™

⚑ Best of Both Worlds: Now supporting Ollama AND llama.cpp - switch backends on the fly!


A lightweight yet powerful web interface for Ollama and llama.cpp with markdown rendering, syntax highlighting, and intelligent conversation management.

⁉️ Why Another App?

Because why not? Having choices puts you in control.

✨ Features

  • Ollama Support - Full integration with Ollama AI platform
  • llama.cpp Support - Direct GGUF model support with high-performance inference
  • Multiple Conversations - Create, manage, and rename chat sessions
  • Persistent History - SQLite database storage with search functionality
  • Multi-Backend Support - Automatic fallback between backends
  • Model Selection - Choose from downloaded models across backends
  • Lightweight - Minimal resource usage for local development
  • Full Markdown rendering - GitHub-flavored syntax support
  • Response metrics - Time, tokens, and speed tracking
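The response-metrics feature reports generation time, token count, and speed. Tokens-per-second is simply tokens divided by elapsed seconds; a quick sketch of that arithmetic (the numbers here are made-up example values, not app output):

```shell
# Illustrative only: how a tokens/sec figure is derived from raw metrics.
TOKENS=128        # tokens generated in the response (example value)
ELAPSED_MS=3200   # generation time in milliseconds (example value)
awk -v t="$TOKENS" -v ms="$ELAPSED_MS" \
  'BEGIN { printf "%.1f tokens/sec\n", t / (ms / 1000) }'
# prints: 40.0 tokens/sec
```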

πŸš€ Quick Start

Automatic installation (recommended):

curl -fsSL https://github.com/ukkit/chat-o-llama/raw/main/install.sh | bash

This will:

  • Install Python and uv if missing
  • Install Ollama if not present
  • Download and set up Chat-O-Llama
  • Start the service at http://localhost:3113
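Once the installer finishes, a minimal way to confirm the service is answering on the default port (3113, per the list above) is a TCP probe with bash's built-in /dev/tcp, so no extra tools are needed. This helper is a sketch, not part of the installer:

```shell
#!/usr/bin/env bash
# Probe the default port; the redirect succeeds only if something
# accepts the TCP connection.
if (echo > /dev/tcp/127.0.0.1/3113) 2>/dev/null; then
  echo "service is up -> http://localhost:3113"
else
  echo "no response on port 3113 (is the service running?)"
fi
```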

Manual installation:

git clone https://github.com/ukkit/chat-o-llama.git
cd chat-o-llama

# Using uv (recommended - faster and more reliable)
uv venv venv
source venv/bin/activate
uv sync

./chat-manager.sh start
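Before a manual install, it can save time to confirm the tools the steps above rely on are already on your PATH. A small hypothetical helper (not part of the repo):

```shell
#!/usr/bin/env bash
# Report which prerequisites for the manual install are available.
for cmd in git python3 uv; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "ok:      $cmd"
  else
    echo "missing: $cmd"
  fi
done
```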

Installing llama.cpp

curl -fsSL https://github.com/ukkit/chat-o-llama/raw/main/install-llamacpp.sh | bash

For detailed manual installation steps, see install.md

πŸ“Έ Screenshots

App Screenshots

chat-o-llama - First Screen First screen after installation

chat-o-llama - Switch between ollama and llama.cpp Available backends - Ollama and Llama.cpp

chat-o-llama - Quick switch Quick switch between Ollama and Llama.cpp

chat-o-llama - Chat in llama.cpp Chat in llama.cpp with visible L indicator

chat-o-llama - Chat in Ollama Chat in Ollama with visible O indicator

chat-o-llama - Quick Switch back Thinking styling

πŸ†• What's new in v2.1.0

  • βœ… Collapsible Sidebar with Llama Icon
  • βœ… Enhanced Chat Selection Identification
  • βœ… Disable Chatbox for Unavailable Models
  • βœ… Model Dropdown Validation Bug Fix

πŸ”§ Troubleshooting

Common issues:

  • Port in use? Run: ./chat-manager.sh start 3030
  • No models? Install one: ollama pull tinyllama
  • Backend issues? Check status: ./chat-manager.sh backend status
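For backend issues, a useful first check alongside `./chat-manager.sh backend status` is whether the Ollama daemon is reachable at all; it listens on port 11434 by default. A quick probe using bash's /dev/tcp (a sketch, independent of chat-manager.sh):

```shell
#!/usr/bin/env bash
# Succeeds only if something accepts a TCP connection on Ollama's
# default API port (11434).
if (echo > /dev/tcp/127.0.0.1/11434) 2>/dev/null; then
  echo "ollama API reachable on port 11434"
else
  echo "ollama not reachable; try: ollama serve"
fi
```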

πŸ“š Documentation

⚠️ Documentation updates in progress
| Document | Description |
|----------|-------------|
| Installation Guide | Detailed installation instructions |
| Features | Complete features overview |
| Process Management | Using chat-manager.sh for service control |
| Configuration | Configuration options and settings |
| API Reference | REST API documentation |
| Troubleshooting | Common issues and solutions |

πŸ–₯ Tested Hardware

| Device | CPU | RAM | OS |
|--------|-----|-----|----|
| Raspberry Pi 4 Model B Rev 1.4 | ARM Cortex-A72 | 8GB | Raspberry Pi OS |
| Dell Optiplex 3070 | i3-9100T | 8GB | Debian 12 |
| Nokia Purebook X14 | i5-10210U | 16GB | Windows 11 Home |

πŸ› Known Issues

We are working through quite a few known issues.


If you find this project helpful, consider:

  • 🌟 Starring the repository on GitHub
  • 🀝 Supporting development
  • πŸ› Reporting bugs and suggesting features

License

MIT License - see LICENSE file for details.
