AgroMetLLM is a web-based agrometeorological advisory tool that combines weather data from Open-Meteo, advanced evapotranspiration (ET) calculations, and localized, farmer-friendly advice generated by local large language models (LLMs) served via Ollama on a Raspberry Pi 4B (or any compatible Linux host) acting as an edge-IoT device. Gradio powers the user interface, making it easy for rural extension agents, scientists, and growers to obtain actionable, context-aware advice for their fields.
Features:
- Location Search & Selection: Search by district/city/village name (global coverage via the Open-Meteo Geocoding API).
- Multi-Model LLM Integration: Switch between multiple local LLMs (e.g., TinyLlama, Qwen) hosted via Ollama; models are managed in memory for optimal performance.
- Custom Forecast Window: Choose a forecast period of 3 to 7 days.
- ET Model Comparison: Calculates and visualizes ET using multiple scientific methods: FAO ET₀, Hargreaves-Samani, Turc, Priestley-Taylor, and Makkink.
- Intuitive, Rural-Friendly Advisory: The local LLM generates simple, pointwise farm advice with examples, covering:
  - Irrigation plan
  - Crop/soil action
  - Livestock/labour management
  - Pest & disease watch
  - Input-saving tip
- Data Logging: All results are logged in a local SQLite database for traceability and research.
- Interactive Outputs:
  - ET metrics table
  - Comparative bar chart
  - Model information
  - Farmer advisory (LLM output)
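As a sketch of how the location search can work, the snippet below queries Open-Meteo's free Geocoding API and flattens the candidate matches. The function names (`geocode`, `parse_results`) are illustrative, not the app's actual identifiers:

```python
import requests

GEOCODE_URL = "https://geocoding-api.open-meteo.com/v1/search"

def parse_results(payload):
    """Flatten the API's JSON into (name, region, lat, lon) tuples."""
    return [
        (r["name"], r.get("admin1", ""), r["latitude"], r["longitude"])
        for r in payload.get("results", [])
    ]

def geocode(name, count=5):
    """Return up to `count` candidate locations for a place name."""
    resp = requests.get(GEOCODE_URL, params={"name": name, "count": count}, timeout=10)
    resp.raise_for_status()
    return parse_results(resp.json())
```

Presenting all candidates (rather than taking the first hit) is what lets the UI ask the user to pick when several villages share a name.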
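Of the ET methods listed, Hargreaves-Samani is the simplest to illustrate because it needs only daily temperatures and extraterrestrial radiation. A minimal sketch (the app's own implementation may differ):

```python
import math

def hargreaves_samani_et0(t_mean, t_max, t_min, ra_mm_day):
    """Reference evapotranspiration (mm/day) via Hargreaves-Samani (1985).

    t_mean, t_max, t_min: daily air temperatures in degrees C
    ra_mm_day: extraterrestrial radiation expressed as mm/day of
               evaporation equivalent (Ra in MJ m-2 day-1 times 0.408)
    """
    return 0.0023 * ra_mm_day * (t_mean + 17.8) * math.sqrt(max(t_max - t_min, 0.0))
```

For example, a warm day with Tmean 24 °C, a 18–30 °C range, and Ra ≈ 15 mm/day gives roughly 5 mm/day, which is why the comparative bar chart is useful: the other methods will land near, but not exactly on, this figure.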
Hardware:
- Raspberry Pi 4B (recommended) or compatible Linux system
- Sufficient RAM for LLMs (ideally >4 GB for sub-1B-parameter models)

Software:
- Python 3.8+
- Ollama (for local LLM serving)
- Gradio
- pandas, numpy, matplotlib, requests (sqlite3 is bundled with Python)
Install requirements:
requirements.txt:
requests
pandas
gradio
matplotlib
numpy
pip install -r requirements.txt

Ollama: follow the [Ollama install docs](https://ollama.com/).
Start Ollama and Load LLMs: make sure your desired LLMs are pulled and available in Ollama (see https://github.com/ollama/ollama).
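For example (the model names below are illustrative; pull whichever small models you plan to expose in the dropdown):

```shell
ollama pull tinyllama        # ~1.1B-parameter model, small enough for a Pi 4B
ollama pull qwen2:0.5b       # example of a sub-1B model
ollama list                  # confirm the models are available
```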
Run the Application:
python agromet_advisory.py
The Gradio UI will start at: http://localhost:7860
How to Use:
- Select the LLM model (from dropdown)
- Enter your location name (city, village, district, etc.)
- Set forecast days (3–7)
- Click "Search Location" and, if multiple matches appear, select the desired location
- Click "Generate ET Forecast & Local LLM Advisory"
- Review the outputs: ET table, ET chart, model info, and actionable farm advice
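Behind the advisory button, the app talks to Ollama's local REST API. The sketch below uses Ollama's documented `/api/generate` endpoint; `build_prompt`, the default model name, and this simplified `ollama_advice` body are all placeholders, not the app's actual implementation:

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_prompt(location, et_summary):
    # Placeholder prompt builder; the real app's few-shot prompt is richer.
    return (
        f"You are an agromet advisor. Location: {location}.\n"
        f"Forecast ET summary: {et_summary}\n"
        "Give simple, pointwise advice on irrigation, crop/soil action, "
        "livestock/labour, pest & disease watch, and one input-saving tip."
    )

def ollama_advice(location, et_summary, model="tinyllama"):
    payload = {"model": model, "prompt": build_prompt(location, et_summary), "stream": False}
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]
```

Setting `"stream": False` returns one JSON object with the full `response` text, which is simpler to drop into a Gradio textbox than handling the streamed chunks.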
- Add/Remove LLM Models: Pull new models via Ollama and restart the app.
- Change Advisory Prompt: edit the ollama_advice() function to tune few-shot examples or the language for your region.
- Database: the app stores all runs in agromet_et.sqlite; analyze or export the data as needed.
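For instance, the log can be inspected with Python's built-in sqlite3 module. Since the table layout is app-defined, this sketch lists whatever tables exist before sampling rows, rather than assuming a schema:

```python
import sqlite3

con = sqlite3.connect("agromet_et.sqlite")
# Discover the tables actually present; the schema is defined by the app.
tables = [row[0] for row in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)
for name in tables:
    # Show a few logged rows from each table.
    for row in con.execute(f"SELECT * FROM {name} LIMIT 5"):
        print(row)
con.close()
```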
- If you see errors about models not loading, ensure Ollama is running and models are pulled.
- For slow responses, use smaller LLMs (<1B parameters) on Raspberry Pi.
- Weather or location not found? Check for typos or try a nearby city/village.
- Open-Meteo API for global weather and geocoding
- Ollama for local LLM inference
- Gradio for rapid web UI prototyping
Apache-2.0 License (see LICENSE for details)