A self-hosted OSINT tool that maps publicly indexed IP cameras on an interactive world map, with optional AI-powered scene analysis and intelligence briefs.
| Feature | Description |
|---|---|
| Interactive map | Every camera plotted as a pin on an OpenStreetMap dark tile layer |
| Device intelligence panel | IP, geolocation, coordinates, timezone, manufacturer, stream URL, all one click away |
| Live stream preview | MJPEG proxied through your server, so the camera host never sees your IP |
| Real-time progress | Scrape updates streamed live over SSE as each country completes |
| Search & filter | Filter the index by city, country, or manufacturer instantly |
| AI enrichment | Scene analysis (vision model) + OSINT intelligence briefs (text model); opt-in and user-triggered |
| Bulk analysis queue | Queue hundreds of cameras, monitor a live system log, start/stop workers |
| Mobile responsive | Full tabbed layout below 1100 px: map, list, and detail as separate tabs |
| Plugin sources | Add a new data source by dropping one Python file into `backend/sources/` |
| One-command deploy | `docker compose up --build` and you're running |
```bash
git clone https://github.com/Dborasik/ARGUS.git
cd argus
cp .env.example .env
docker compose up --build
```

Open http://localhost:5173 and that's it.
### Backend

```bash
cd backend
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
uvicorn main:app --reload --port 8000
```

### Frontend (separate terminal)

```bash
cd frontend
npm install
npm run dev
```

The Vite dev server proxies `/api` to the backend automatically.
ARGUS fetches camera data through a plugin-style sources system. Each source is a self-contained Python module in `backend/sources/`; no other code needs to change when you add or remove one.
| Source | ID | Credentials | Description |
|---|---|---|---|
| Insecam | `insecam` | None | Publicly listed cameras on insecam.org |
1. Copy `backend/sources/_template.py` to `backend/sources/<name>.py`
2. Fill in the metadata and implement `scrape()`
3. Restart the backend; your source appears in the boot screen automatically

See `backend/sources/_template.py` for the full documented contract.
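For illustration only, here is a minimal sketch of the shape a source plugin might take, assuming the contract exposes a couple of metadata constants and a `scrape()` generator yielding one dict per camera. The names `SOURCE_ID`, `SOURCE_NAME`, and the record fields below are assumptions; `_template.py` defines the real contract.

```python
# Hypothetical backend/sources/example.py -- a sketch, not the real contract;
# see backend/sources/_template.py for the documented interface.
from typing import Iterator

SOURCE_ID = "example"            # assumed metadata constant
SOURCE_NAME = "Example Source"   # assumed metadata constant
REQUIRES_CREDENTIALS = False

def scrape() -> Iterator[dict]:
    """Yield one dict per discovered camera (field names assumed)."""
    # A real source would fetch and parse remote pages here.
    sample = [
        {"ip": "203.0.113.7", "country": "NL", "city": "Amsterdam",
         "manufacturer": "Axis",
         "stream_url": "http://203.0.113.7/mjpg/video.mjpg"},
    ]
    for record in sample:
        yield record
```

Because each plugin is autodiscovered, dropping a file like this into `backend/sources/` and restarting the backend is the whole integration.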
AI is off by default. No API key needed, no cost, no UI surface until you opt in.

Set in `.env`:

```env
AI_ENABLED=true
OPENAI_API_KEY=sk-...
```

Any OpenAI-compatible endpoint works (OpenRouter, Ollama, LiteLLM, etc.):

```env
OPENAI_BASE_URL=http://localhost:11434/v1  # Ollama example
```

Once enabled, three features appear:
- **ANALYZE FEED** sends a stream frame to a vision model and stores a scene description on the camera record
- **INTELLIGENCE BRIEF** streams an OSINT assessment using all available metadata
- **BULK ANALYZE** queues many cameras, runs with configurable concurrent workers, and shows a live log
All settings are environment variables. Copy `.env.example` to `.env` and adjust; the backend reads from there automatically.
| Variable | Default | Description |
|---|---|---|
| `DATABASE_URL` | `sqlite:///./argus.db` | SQLite (default) or PostgreSQL |
| `API_PORT` | `8000` | Backend listen port |
| `CORS_ORIGINS` | `["http://localhost:5173"]` | Allowed frontend origins |
| Variable | Default | Description |
|---|---|---|
| `SCRAPER_WORKERS` | `10` | Concurrent HTTP workers per country |
| `SCRAPER_REQUEST_TIMEOUT` | `15` | Per-request timeout (seconds) |
| `SCRAPER_PAGE_DELAY` | `0.2` | Delay between country batches (seconds) |
| `SCRAPER_RETRY_COUNT` | `3` | Retries on transient failures |
| `SCRAPER_RETRY_BACKOFF` | `0.5` | Exponential back-off factor |
| Variable | Default | Description |
|---|---|---|
| `AI_ENABLED` | `false` | Master switch |
| `OPENAI_API_KEY` | `""` | Required when AI is enabled |
| `OPENAI_BASE_URL` | `https://api.openai.com/v1` | Override for local/alternate providers |
| `AI_SCENE_MODEL` | `gpt-4o-mini` | Vision model for scene analysis |
| `AI_INTEL_MODEL` | `gpt-4.1-mini` | Text model for intelligence briefs |
| `AI_QUEUE_WORKERS` | `3` | Concurrent bulk analysis workers |
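Putting the tables together, a complete `.env` for a local, AI-enabled setup might look like this. Non-default values (the Ollama URL, the `llava` model name, the placeholder key) are examples, not recommendations:

```env
# Server
DATABASE_URL=sqlite:///./argus.db
API_PORT=8000
CORS_ORIGINS=["http://localhost:5173"]

# Scraper
SCRAPER_WORKERS=10
SCRAPER_REQUEST_TIMEOUT=15

# AI (opt-in) -- example values for a local Ollama endpoint
AI_ENABLED=true
OPENAI_API_KEY=ollama
OPENAI_BASE_URL=http://localhost:11434/v1
AI_SCENE_MODEL=llava
AI_QUEUE_WORKERS=3
```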
The default is SQLite with zero config. To switch to PostgreSQL:

```env
DATABASE_URL=postgresql+psycopg2://argus:argus@localhost:5432/argus
```

Install the driver: `pip install -r backend/requirements-postgres.txt`

Or with Docker: `docker compose --profile postgres up --build`
```text
Browser (React 18 + Vite + TypeScript + Leaflet)
 |
 +-- BootOverlay        source selection on first load
 +-- TopBar             scrape controls, status, bulk AI
 +-- CameraList         left panel: filterable camera index
 +-- MapView            centre: OSM dark tile map
 +-- DetailPanel        right panel: device intelligence
 |
 | HTTP + SSE (/api/*)
 |
Backend (Python 3.12 + FastAPI)
 |
 +-- GET  /api/sources              available source plugins
 +-- POST /api/scrape/start         kick off background scrape
 +-- GET  /api/scrape/progress      SSE stream of scrape events
 +-- GET  /api/cameras              full camera index
 +-- GET  /api/cameras/{id}         single camera record
 +-- GET  /api/stream/proxy         MJPEG proxy (hides client IP)
 +-- POST /api/cameras/{id}/ai/*    scene analysis + intel brief
 +-- GET  /api/ai/queue/*           bulk analysis queue + SSE
 |
 +-- Sources   autodiscovered plugins in backend/sources/
 +-- Store     CameraStore: in-memory + SQLite/PostgreSQL
```
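The endpoints above can be driven by any HTTP client. As a stdlib-only sketch, fetching the camera index and parsing the `data:` lines of an SSE progress stream might look like this; the JSON field names in the sample are assumptions, not a documented schema:

```python
# Minimal client sketch for the ARGUS API; SSE payload fields are assumed.
import json
import urllib.request

BASE = "http://localhost:8000"

def get_cameras(base: str = BASE) -> list[dict]:
    """Fetch the full camera index from GET /api/cameras."""
    with urllib.request.urlopen(f"{base}/api/cameras") as resp:
        return json.load(resp)

def parse_sse_data(raw: str) -> list[dict]:
    """Extract JSON payloads from the 'data:' lines of an SSE stream,
    e.g. the text received from GET /api/scrape/progress."""
    events = []
    for line in raw.splitlines():
        if line.startswith("data:"):
            events.append(json.loads(line[len("data:"):].strip()))
    return events

if __name__ == "__main__":
    for cam in get_cameras():
        print(cam)
```

In the browser the frontend uses `EventSource` for the same streams; this sketch just shows the wire format a script would see.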
**Full project layout**
```text
argus/
├── backend/
│   ├── config.py         All tuneable settings (pydantic-settings)
│   ├── main.py           uvicorn entry point
│   ├── models/           camera.py, ai.py (Pydantic models)
│   ├── utils/http.py     Generic HTTP session factory with retry
│   ├── db/               SQLAlchemy ORM, schema init, session factory
│   ├── ai/               Optional AI layer (disabled by default)
│   │   ├── client.py     Lazy AsyncOpenAI singleton
│   │   ├── prompts.py    All system prompts as module-level constants
│   │   ├── service.py    AIService: analyze_scene(), brief_camera()
│   │   └── queue.py      AnalysisQueue: all scene analysis jobs
│   ├── sources/          Source plugins (autodiscovered)
│   │   ├── _template.py  Documented plugin contract
│   │   └── insecam/      Insecam source package
│   └── api/
│       ├── app.py        FastAPI factory + CORS + lifespan
│       ├── routes.py     All HTTP endpoints (including AI routes)
│       └── store.py      Thread-safe CameraStore
├── frontend/src/
│   ├── types/            TypeScript mirrors of backend models
│   ├── utils/api.ts      All fetch/SSE calls
│   ├── hooks/            useScrape, useAI, useAnalysisQueue, ...
│   └── components/       BootOverlay, TopBar, CameraList, MapView,
│                         DetailPanel, BulkAnalysisModal, mobile/
└── tests/
    ├── unit/             test_parse.py, test_store.py, test_ai_*
    └── integration/      test_routes.py
```
```bash
# Run all tests
python -m pytest tests/ -v

# Frontend type-check + build
cd frontend && npm run build
```

Contributions are welcome. Please open an issue first for non-trivial changes.
See `CONTRIBUTING.md` for setup, code conventions, and the PR checklist. See `AGENTS.md` for AI coding agent guidelines.
This tool indexes data that is already publicly listed on its sources. It does not bypass authentication, exploit vulnerabilities, or interact with camera devices beyond loading their publicly accessible stream URLs (only when the user explicitly enables the live preview toggle, which proxies through the server).
Use responsibly and in accordance with the laws of your jurisdiction.
ARGUS is licensed under the GNU General Public License v3.0.
