A tutorial on how to build Python programs with LLMs in the mix.
Made with ❤️ by Eric J. Ma (@ericmjl).
This tutorial requires several components to be installed:
Pixi is a package management tool that we'll use to manage dependencies. Install it using one of these methods:
Linux & macOS (using curl):

```bash
curl -fsSL https://pixi.sh/install.sh | bash
```

Windows (using PowerShell):

```powershell
iwr -useb https://pixi.sh/install.ps1 | iex
```

After installation, you may need to restart your terminal for the changes to take effect.
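To confirm that Pixi is on your PATH after installing, a quick version check (optional) looks like this:

```bash
# Print the installed Pixi version; if this fails, restart your terminal or re-run the installer.
pixi --version
```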
Ollama is required to run the local LLM models used in this tutorial.
- Linux: Visit https://ollama.com/download/linux
- macOS: Visit https://ollama.com/download/mac
- Windows: Visit https://ollama.com/download/windows
Follow the installation instructions for your platform.
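If you want to verify the Ollama installation before continuing, checking the CLI version is a reasonable sanity check (on macOS and Windows the desktop app typically starts the background server for you):

```bash
# Confirm the Ollama CLI is installed and report its version.
ollama --version

# If the background server is not already running, start it manually (keep this terminal open).
ollama serve
```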
After installing Ollama, run the following command to:
- Pull the required LLM models
- Install uv (Python package manager)
- Set up a virtual environment
- Install all required dependencies
```bash
pixi run start
```

The command will:

- Pull the following Ollama models:
  - llama3.2
  - phi4
  - gemma2:2b
- Install uv if not already installed
- Create a Python virtual environment
- Install all required dependencies
Please do this before arriving at the tutorial session, as the model downloads may take some time!
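Once the command finishes, an optional check that the models actually downloaded is to list what Ollama has available locally:

```bash
# llama3.2, phi4, and gemma2:2b should all appear in this list once the pulls complete.
ollama list
```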
After installation is complete, you can run the notebooks using:
```bash
# Run the first notebook
uvx marimo edit --sandbox notebooks/01_simple_bot.py

# Or run the second notebook
uvx marimo edit --sandbox notebooks/02_structured_bot.py
```

If you prefer to install components manually:
- Install Ollama from https://ollama.com/download

- Pull the required models:

  ```bash
  ollama pull llama3.2
  ollama pull phi4
  ollama pull gemma2:2b
  ```

- Install uv:

  ```bash
  curl -fsSL https://astral.sh/uv/install.sh | bash
  ```

  The installation guide is here, and includes Windows installation instructions too: https://docs.astral.sh/uv/getting-started/installation/

- Run a notebook:

  ```bash
  cd notebooks/  # super duper important!
  uvx marimo edit --sandbox 01_simple_bot.py  # or any other notebook
  ```
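After completing the steps above, a quick one-off prompt from the command line is an easy (optional) smoke test that a pulled model responds; any of the three models works here:

```bash
# Send a single prompt to the smallest model, print its reply, then exit.
ollama run gemma2:2b "Say hello in one short sentence."
```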
- Ollama Model Download Issues: If you encounter issues downloading models, ensure you have a stable internet connection and sufficient disk space.
- uv Installation Problems: If uv installation fails, you can try installing it using pip:
  ```bash
  pip install uv
  ```

- Notebook Errors: Ensure all dependencies are correctly installed and that Ollama is running in the background.
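For that last point, Ollama listens on localhost port 11434 by default, so a quick way to check whether the server is running (and to start it if it isn't) is:

```bash
# If the server is up, this returns a short status message; otherwise it fails to connect.
curl http://localhost:11434

# Start the server manually if needed (keep this terminal open).
ollama serve
```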