An AI-powered terminal chat assistant with persistent memory - it actually remembers you.
Memoire (French: memory) is a lightweight CLI chat app that connects Google's Gemini models with Mem0's long-term memory layer. Every conversation is stored and retrieved semantically, so the AI can reference what you told it last week without you having to repeat yourself.
- Streaming responses - text prints word-by-word as the model generates it, no waiting for a full response
- Persistent memory - powered by Mem0, past conversations are stored and injected as context automatically
- Graceful memory fallback - skip the `MEM0_API_KEY` and the assistant works fine, just without cross-session memory
- Minimal commands - `/help`, `/clear`, `/exit` - stays out of the way
- Environment-first config - everything lives in a `.env` file, no flags to memorize
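The graceful fallback can be pictured as a simple startup check: memory features turn on only when the key is present, and the chat loop runs either way. This is an illustrative sketch - `initMemory` is an assumed name, not taken from the source.

```typescript
// Sketch of the graceful-fallback check: memory is enabled only when
// MEM0_API_KEY is present; the assistant keeps working without it.
function initMemory(env: Record<string, string | undefined>): { enabled: boolean } {
  if (!env.MEM0_API_KEY) {
    // No key: skip memory entirely instead of crashing.
    return { enabled: false };
  }
  return { enabled: true };
}

console.log(initMemory({}).enabled); // false
console.log(initMemory({ MEM0_API_KEY: "mk-123" }).enabled); // true
```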
```
You type a message
        │
        ▼
Mem0 searches past memories for relevant context
        │
        ▼
System prompt assembled: base persona + retrieved memories
        │
        ▼
Gemini streams a response token-by-token → printed to terminal
        │
        ▼
Full conversation turn stored back into Mem0
```
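The "system prompt assembled" step above can be sketched as a pure function that folds retrieved memories into the base persona. Names (`buildSystemPrompt`, the `Memory` shape) are assumptions for illustration, not the app's actual API.

```typescript
// Sketch: combine the base persona with memories retrieved from Mem0.
interface Memory {
  memory: string; // a matched memory snippet, as plain text
}

function buildSystemPrompt(basePersona: string, memories: Memory[]): string {
  if (memories.length === 0) return basePersona; // no memory -> persona only
  const context = memories.map((m) => `- ${m.memory}`).join("\n");
  return `${basePersona}\n\nRelevant things you know about the user:\n${context}`;
}

const prompt = buildSystemPrompt("You are Memoire, a helpful assistant.", [
  { memory: "User's name is Rahul" },
  { memory: "Prefers concise answers" },
]);
console.log(prompt);
```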
`USER_ID` defaults to `"myself"` - change it to your name or a unique identifier in `.env` if you plan to run multiple users against the same Mem0 account.
| Layer | Library | Purpose |
|---|---|---|
| LLM inference | `@ai-sdk/google` + `ai` (Vercel AI SDK) | Stream responses from Gemini models |
| Memory | `mem0ai` | Store and semantically search conversation history |
| Config | `dotenv` | Load environment variables from `.env` |
| UX | `ora` | Terminal spinner while waiting for responses |
| Runtime | Node.js + TypeScript | - |
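The streaming print loop at the heart of the app can be sketched like this. In the real app the chunks come from Gemini via the AI SDK's `streamText`; here a mock async iterable stands in so the loop is runnable without an API key - the mock and function names are illustrative assumptions.

```typescript
// Mock stand-in for a model's token stream (the real source is streamText).
async function* mockTextStream(): AsyncGenerator<string> {
  for (const chunk of ["Hello", ", ", "world", "!"]) yield chunk;
}

// Print chunks as they arrive, and collect the full text so the complete
// turn can later be stored back into memory.
async function printStream(stream: AsyncIterable<string>): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    process.stdout.write(chunk); // word-by-word, no waiting for the full reply
    full += chunk;
  }
  process.stdout.write("\n");
  return full;
}

printStream(mockTextStream());
```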
- Node.js 18+
- A Google AI Studio API key
- (Optional) A Mem0 API key for persistent memory
1. Clone the repo

```bash
git clone https://github.com/rrahul-1/Memoire-cli.git
cd Memoire-cli
```

2. Install dependencies

```bash
npm install
```

3. Create a `.env` file

```bash
cp .env.example .env   # or create it manually
```

`.env` contents:
```env
# Required - the Gemini model to use
LLM_MODEL=gemini-2.5-flash-lite

# Required - your Google Generative AI key
GOOGLE_GENERATIVE_AI_API_KEY=your_google_api_key_here

# Optional - enables persistent memory across sessions
MEM0_API_KEY=your_mem0_api_key_here

# Optional - scopes your memories in Mem0 (defaults to "myself")
USER_ID=your_name
```

| Variable | Required | Description |
|---|---|---|
| `LLM_MODEL` | Yes | Gemini model identifier, e.g. `gemini-2.5-flash-lite`, `gemini-2.0-flash` |
| `GOOGLE_GENERATIVE_AI_API_KEY` | Yes | API key from Google AI Studio |
| `MEM0_API_KEY` | No | API key from Mem0. If omitted, memory is disabled but the assistant still works. |
| `USER_ID` | No | Identifier used to scope memories in Mem0. Defaults to `"myself"`. Set this if you share a Mem0 account across multiple people or projects. |
A missing required variable (such as `LLM_MODEL`) causes the app to exit immediately with a clear error message listing which variables are absent.
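That startup check could look something like the sketch below: collect every absent required variable and report them all at once. The function name is assumed, not taken from the source.

```typescript
// Sketch: validate required env vars and report all missing ones together.
const REQUIRED = ["LLM_MODEL", "GOOGLE_GENERATIVE_AI_API_KEY"] as const;

function missingVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED.filter((name) => !env[name]);
}

const missing = missingVars({ LLM_MODEL: "gemini-2.5-flash-lite" });
if (missing.length > 0) {
  console.error(`Missing required env vars: ${missing.join(", ")}`);
  // the real app would exit here, e.g. process.exit(1)
}
```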
Development (TypeScript, no build step):

```bash
npm run dev
```

Production (build first, then run):

```bash
npm run build
npm start
```

Once started, you'll see:
```
Welcome to memoire-cli ✨
Your AI-powered memory + chat assistant.
Memory: ✓ enabled
What's on your mind today?
Use /help to see available commands.

You >
```
Type anything to start chatting. The AI's response streams directly into the terminal.
| Command | Description |
|---|---|
| `/help` | Show the command guide |
| `/clear` | Clear the terminal screen |
| `/memory` | Print all memories stored for your `USER_ID` |
| `/clear-memory` | Permanently wipe all stored memories for your `USER_ID` |
| `/exit` | Quit the application (also works with Ctrl+C) |
Unrecognized `/commands` display an error and remind you to use `/help`.

Note: `/memory` and `/clear-memory` do nothing when `MEM0_API_KEY` is not set.
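Slash-command dispatch of this kind can be sketched as a lookup table with a fallback for unknown commands. Command names match the table above; the handler bodies and function names are illustrative placeholders.

```typescript
// Sketch: route slash commands via a lookup table, fall back to chat.
type Handler = () => string;

const commands: Record<string, Handler> = {
  "/help": () => "Showing command guide",
  "/clear": () => "Clearing screen",
  "/exit": () => "Goodbye",
};

function dispatch(input: string): string {
  if (!input.startsWith("/")) return "chat: " + input; // plain chat message
  const handler = commands[input.trim()];
  if (!handler) {
    // Unrecognized command: error plus the /help reminder.
    return `Unknown command ${input}. Use /help to see available commands.`;
  }
  return handler();
}

console.log(dispatch("/help"));  // Showing command guide
console.log(dispatch("/bogus"));
```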
```
memoire-cli/
├── src/
│   └── index.ts       # Entire application - ~190 lines
├── dist/              # Compiled output (generated by `npm run build`)
├── .env               # Your secrets (never commit this)
├── package.json
└── tsconfig.json
```
ISC


