REALM is a personal knowledge management system that connects your personal notes to a local vector store and lets you query them with a Large Language Model (LLM). It supports automatic syncing, embedding with ChromaDB, and querying with Gemini via LangChain.
- Syncs only new or updated Markdown files from your personal notes
- Stores embeddings locally using ChromaDB
- Queries your notes using Gemini (Google Generative AI)
- Command-line interface for seamless access
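The incremental sync in the first bullet can be sketched as follows: compare each note's modification time against a JSON snapshot from the previous run, and re-embed only what changed. This is a minimal illustration, not the actual implementation in `scripts/create_embeddings.py`; the function and state-file names are hypothetical.

```python
import json
from pathlib import Path

def find_changed_notes(vault_path, state_file):
    """Return Markdown files that are new or modified since the last sync.

    Compares each note's mtime against a JSON snapshot from the previous
    run, then rewrites the snapshot. (Hypothetical helper; the real logic
    lives in scripts/create_embeddings.py.)
    """
    state_file = Path(state_file)
    previous = json.loads(state_file.read_text()) if state_file.exists() else {}
    changed, current = [], {}
    for note in Path(vault_path).rglob("*.md"):
        mtime = note.stat().st_mtime
        current[str(note)] = mtime
        if previous.get(str(note)) != mtime:
            changed.append(note)
    state_file.write_text(json.dumps(current))
    return changed
```

Only the files returned here would need to be re-embedded into ChromaDB, which keeps repeated runs fast on large vaults.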
```
myBrain/
│
├── embeddings/                  # Stores Chroma vector database and metadata
├── scripts/
│   ├── create_embeddings.py     # Sync and embed vault notes
│   └── langchain_integration.py # LLM querying logic
├── .env                         # Stores environment variables (DO NOT COMMIT)
├── brain.sh                     # CLI entrypoint script
├── requirements.txt             # Python dependencies
└── README.md
```
```bash
git clone https://github.com/yourusername/myBrain.git
cd myBrain
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

Create a `.env` file in the project root:

```
VAULT_PATH=/absolute/path/to/your/obsidian/vault
GEMINI_API_KEY=your_google_gemini_api_key
```

Now you can run the script anytime:

```bash
./brain.sh
```

This will:
- Activate the virtual environment
- Sync and embed updated notes from your vault
- Start the LLM-powered QA system
- Prompt you to ask questions about your knowledge base
- Type `exit` to quit
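The question-and-answer loop described above can be sketched as a small REPL. This is an illustration only: the function names are hypothetical, and the retrieval and Gemini call are abstracted behind an `answer` callable, which in the real `scripts/langchain_integration.py` might wrap a LangChain retrieval chain backed by ChromaDB and Gemini.

```python
def qa_loop(ask, answer, show=print):
    """Minimal REPL sketch: prompt for questions until the user types 'exit'.

    `ask` reads a question (e.g. input), `answer` maps a question to a
    response (e.g. a retrieval chain over the vector store), and `show`
    displays it. All names here are hypothetical stand-ins.
    """
    while True:
        question = ask("Ask your notes> ").strip()
        if question.lower() == "exit":
            break
        if question:
            show(answer(question))
```

Injecting `ask` and `answer` as parameters keeps the loop testable without a live API key.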