A multi-agent system for analyzing surveillance infrastructure using OpenStreetMap data. The system operates completely locally without external APIs and uses LangChain-based agents that create memories of their actions.
The pipeline consists of two main agents:
- Scraper Agent: Downloads surveillance data from OpenStreetMap via Overpass API
- Analyzer Agent: Enriches data using local LLM analysis and generates visualizations
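The Scraper Agent's Overpass request can be pictured with a minimal sketch like the following. The function name and bounding-box handling are illustrative assumptions, not the project's actual code; `man_made=surveillance` is, however, the standard OpenStreetMap tag for surveillance devices.

```python
# Illustrative sketch: build an Overpass QL query selecting surveillance
# nodes inside a bounding box. The helper name and parameters are
# assumptions, not the project's actual scraper code.

def build_overpass_query(south, west, north, east, timeout=60):
    """Return an Overpass QL string for surveillance nodes in a bbox."""
    bbox = f"{south},{west},{north},{east}"
    return (
        f"[out:json][timeout:{timeout}];"
        f'node["man_made"="surveillance"]({bbox});'
        "out body;"
    )

query = build_overpass_query(52.3, 13.0, 52.7, 13.8)  # roughly Berlin
```

The resulting string would be POSTed to an Overpass API endpoint; the JSON response is what the pipeline caches and later enriches.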
Key Features:
- Local LLM processing (no external APIs)
- Persistent agent memory with SQLite
- Multiple analysis scenarios (basic, full, quick, report, mapping)
- Rich CLI interface with progress tracking
- Automatic caching to avoid re-downloading data
- Comprehensive visualizations (heatmaps, hotspots, charts)
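The "persistent agent memory with SQLite" feature can be illustrated with a minimal `sqlite3` sketch. The table name, columns, and helper functions below are hypothetical, not the project's actual schema.

```python
import json
import sqlite3
from datetime import datetime, timezone

# Hypothetical schema: one row per remembered agent action.
def open_memory(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS memories ("
        " id INTEGER PRIMARY KEY,"
        " agent TEXT NOT NULL,"
        " action TEXT NOT NULL,"
        " payload TEXT,"
        " created_at TEXT NOT NULL)"
    )
    return conn

def remember(conn, agent, action, payload=None):
    """Record one action taken by an agent."""
    conn.execute(
        "INSERT INTO memories (agent, action, payload, created_at)"
        " VALUES (?, ?, ?, ?)",
        (agent, action, json.dumps(payload),
         datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

def recall(conn, agent):
    """Return an agent's remembered actions in insertion order."""
    rows = conn.execute(
        "SELECT action, payload FROM memories WHERE agent = ? ORDER BY id",
        (agent,),
    ).fetchall()
    return [(action, json.loads(payload)) for action, payload in rows]

conn = open_memory()
remember(conn, "scraper", "downloaded", {"city": "Berlin"})
```

A file path instead of `":memory:"` would make the memory persist across runs, which is the point of the feature.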
Prerequisites:
- Python 3.11
- uv package manager

For macOS:

Use the Homebrew package manager. Install Homebrew following these instructions, then install Python:

brew install python@3.11
For Ubuntu:

You can use the Deadsnakes PPA.

Add the PPA:
sudo add-apt-repository ppa:deadsnakes/ppa

Update the package list:
sudo apt update

Install Python 3.11:
sudo apt install python3.11

Verify the installation:
python3.11 --version
Install uv:
curl -LsSf https://astral.sh/uv/install.sh | sh

Create a virtual environment with Python 3.11 and activate it:
uv venv --python 3.11
source .venv/bin/activate

Add a new dependency:
uv add name-of-dependency

Install the project's dependencies:
uv sync
To run the tests from the project root, run:
bash ./local_test_pipeline.sh
This project uses pre-commit hooks to ensure consistent code formatting.
To install them, run:
pre-commit install
The application uses Ollama for interacting with LLMs locally.
For this to work, follow these steps:

1. Create a .env file at the root of the project. See .env-sample for the exact naming and properties.
2. Download and install Ollama.
3. Open your terminal and execute the following commands:
   - Download the model: ollama pull llama3:latest
   - Start Ollama: ollama serve
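A minimal sketch of how an agent might talk to the local Ollama server over its HTTP API. The helper names are assumptions; `http://localhost:11434/api/generate` is Ollama's default generate endpoint, and the request only succeeds while `ollama serve` is running.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(prompt, model="llama3:latest"):
    """Build a non-streaming generate request for the Ollama HTTP API."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

def generate(prompt, model="llama3:latest"):
    """Send the prompt to the local server; requires `ollama serve`."""
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.loads(resp.read())["response"]
```

In the real pipeline the LangChain Ollama integration would typically wrap this; the sketch only shows what travels over the wire.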
The system provides a rich CLI interface for running surveillance analysis:
# Analyze a city with basic settings
python main.py Berlin
# Specify country for disambiguation
python main.py Athens --country GR
# Use different analysis scenarios
python main.py Hamburg --scenario full
python main.py Munich --scenario quick

Available scenarios:
- basic (default): Essential analysis producing key files
- full: Complete analysis with all visualizations and reports
- quick: Fast analysis with minimal processing
- report: Focus on statistical summaries and charts
- mapping: Emphasis on geospatial visualizations
# Skip scraping (use existing data)
python main.py Berlin --data-path overpass_data/berlin/berlin.json --skip-scrape
# Skip analysis (scraping only)
python main.py Hamburg --skip-analyze
# Custom output directory
python main.py Paris --output-dir /custom/path
# Verbose logging
python main.py London --verbose

The analysis generates several files in the output directory:
- Enriched JSON: Original data enhanced with LLM analysis
- GeoJSON: Geographic data for mapping applications
- Heatmap: Spatial density visualization
- Hotspots: DBSCAN clustering results and plots
- Statistics: Summary charts and metrics
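The GeoJSON output can be pictured as a FeatureCollection of camera points. The record field names below are illustrative assumptions, not the project's exact schema; the FeatureCollection structure itself follows the GeoJSON specification.

```python
# Illustrative conversion of enriched camera records to GeoJSON.
# The input keys ("lat", "lon", extra properties) are assumptions.

def to_geojson(records):
    features = [
        {
            "type": "Feature",
            # GeoJSON coordinates are [longitude, latitude]
            "geometry": {"type": "Point",
                         "coordinates": [r["lon"], r["lat"]]},
            "properties": {k: v for k, v in r.items()
                           if k not in ("lat", "lon")},
        }
        for r in records
    ]
    return {"type": "FeatureCollection", "features": features}

doc = to_geojson([{"lat": 52.52, "lon": 13.40, "camera_type": "dome"}])
```

A file in this shape loads directly into mapping tools such as QGIS or Leaflet.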
For the full list of options, run:
python main.py --help
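The flags shown above could be declared with an `argparse` sketch like this. It mirrors the documented options but is an assumption about how `main.py` actually builds its parser.

```python
import argparse

# Scenario names taken from the documented options above.
SCENARIOS = ["basic", "full", "quick", "report", "mapping"]

def build_parser():
    p = argparse.ArgumentParser(description="Surveillance analysis pipeline")
    p.add_argument("city", help="City name to analyze")
    p.add_argument("--country", help="ISO country code for disambiguation")
    p.add_argument("--scenario", choices=SCENARIOS, default="basic")
    p.add_argument("--data-path", help="Existing Overpass JSON to reuse")
    p.add_argument("--skip-scrape", action="store_true")
    p.add_argument("--skip-analyze", action="store_true")
    p.add_argument("--output-dir", help="Custom output directory")
    p.add_argument("--verbose", action="store_true")
    return p

args = build_parser().parse_args(["Berlin", "--scenario", "full"])
```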