
Commit 32b2425

Merge pull request #71 from jethronap/57_orchestration_pipeline
Added pipeline config (#57)
2 parents 394f763 + 3a1aac7 commit 32b2425

22 files changed: +1516 −2859 lines changed

README.md

Lines changed: 73 additions & 20 deletions
@@ -1,26 +1,21 @@
-# UNDO - Agentic counter-surveillance
+# Agentic Counter-Surveillance Analysis
 
-The application is agentic system that downloads data from various online sources, analyzes it and presents the results.
-Everything is executed locally and no-external APIs are needed. The agents used in this software create and store
-memories of their actions.
+A multi-agent system for analyzing surveillance infrastructure using OpenStreetMap data. The system operates completely locally without external APIs and uses LangChain-based agents that create memories of their actions.
 
-# Scraper agent
+## Overview
 
-The Scraper agent's goal is to get all available data that exist on OpenStreet maps concerning surveillance for a given
-city or municipality. The agent downloads data in json format and stores them locally on the filesystem.
+The pipeline consists of two main agents:
 
-# Analyzer agent
+- **Scraper Agent**: Downloads surveillance data from OpenStreetMap via Overpass API
+- **Analyzer Agent**: Enriches data using local LLM analysis and generates visualizations
 
-The Analyzer agent's goal is to get scraped data from the Scraper agent and:
-
-- Enrich them by doing analysis using and LLM. The agent creates new file with structured information.
-- Produces a `.geojson` file to be used with mapping.
-- Creates a `heatmat` of the given area showing the spatial density of the surveillance infrastructure
-- Computes simple summary statistics from the available data.
-- Computes surveillance hotspots using DBSCAN and plots them overlaid on an OSM map
-- Plots sensitivity reasons distributions coming from a LLM
-- Plots camera counts for sensitive zones
-- Plots the distribution of private and public cameras using a donut-chart
+**Key Features:**
+- Local LLM processing (no external APIs)
+- Persistent agent memory with SQLite
+- Multiple analysis scenarios (basic, full, quick, report, mapping)
+- Rich CLI interface with progress tracking
+- Automatic caching to avoid re-downloading data
+- Comprehensive visualizations (heatmaps, hotspots, charts)
 
 # Installation
 
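The Scraper Agent described in the new overview pulls its data from the Overpass API. As a rough illustration of what such a request could look like (the `man_made=surveillance` tag filter, the city-name area lookup, and the output file here are assumptions for this sketch, not necessarily the repository's exact query):

```python
# Illustrative only: a minimal Overpass QL request for surveillance nodes in a
# named area. The repository's Scraper Agent may build its query, handle
# disambiguation, and cache results differently.
import json
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"  # public Overpass endpoint

def fetch_surveillance_nodes(city: str) -> dict:
    """Download OSM nodes tagged man_made=surveillance for the given city."""
    query = f"""
    [out:json][timeout:60];
    area["name"="{city}"]->.searchArea;
    node["man_made"="surveillance"](area.searchArea);
    out body;
    """
    response = requests.post(OVERPASS_URL, data={"data": query}, timeout=120)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    data = fetch_surveillance_nodes("Berlin")
    # Store the raw payload locally, mirroring the pipeline's file-based caching idea.
    with open("berlin_surveillance.json", "w", encoding="utf-8") as fh:
        json.dump(data, fh, ensure_ascii=False, indent=2)
```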

@@ -117,10 +112,68 @@ In order for this to work follow these steps:
 - Download the model:
 
 ```commandline
-ollama pull gemma2
+ollama pull llama3:latest
 ```
 - Start Ollama:
 
 ```commandline
 ollama serve
-```
+```
+
+## Usage
+
+The system provides a rich CLI interface for running surveillance analysis:
+
+### Basic Usage
+
+```bash
+# Analyze a city with basic settings
+python main.py Berlin
+
+# Specify country for disambiguation
+python main.py Athens --country GR
+
+# Use different analysis scenarios
+python main.py Hamburg --scenario full
+python main.py Munich --scenario quick
+```
+
+### Analysis Scenarios
+
+- `basic` (default): Essential analysis producing key files
+- `full`: Complete analysis with all visualizations and reports
+- `quick`: Fast analysis with minimal processing
+- `report`: Focus on statistical summaries and charts
+- `mapping`: Emphasis on geospatial visualizations
+
+### Advanced Options
+
+```bash
+# Skip scraping (use existing data)
+python main.py Berlin --data-path overpass_data/berlin/berlin.json --skip-scrape
+
+# Skip analysis (scraping only)
+python main.py Hamburg --skip-analyze
+
+# Custom output directory
+python main.py Paris --output-dir /custom/path
+
+# Verbose logging
+python main.py London --verbose
+```
+
+### Output Files
+
+The analysis generates several files in the output directory:
+
+- **Enriched JSON**: Original data enhanced with LLM analysis
+- **GeoJSON**: Geographic data for mapping applications
+- **Heatmap**: Spatial density visualization
+- **Hotspots**: DBSCAN clustering results and plots
+- **Statistics**: Summary charts and metrics
+
+### Help
+
+```bash
+python main.py --help
+```
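The new Output Files section mentions DBSCAN-based hotspots. A minimal sketch of that kind of clustering on camera coordinates, assuming a plain (lat, lon) array and metre-based parameters chosen only for illustration, not taken from the repository:

```python
# Illustrative only: grouping camera coordinates into hotspots with DBSCAN,
# roughly matching the "Hotspots" output described above.
import numpy as np
from sklearn.cluster import DBSCAN

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def find_hotspots(latlon: np.ndarray, radius_m: float = 100.0, min_cameras: int = 5) -> np.ndarray:
    """Label each (lat, lon) point with a cluster id; -1 means noise (no hotspot)."""
    # scikit-learn's haversine metric expects coordinates in radians.
    coords_rad = np.radians(latlon)
    eps = radius_m / EARTH_RADIUS_M  # convert the metre radius to radians
    return DBSCAN(eps=eps, min_samples=min_cameras, metric="haversine").fit_predict(coords_rad)

if __name__ == "__main__":
    # Three tightly spaced points form one hotspot; the last point is isolated noise.
    points = np.array([
        [52.5200, 13.4050],
        [52.5201, 13.4052],
        [52.5199, 13.4049],
        [52.5300, 13.5000],
    ])
    print(find_hotspots(points, radius_m=100.0, min_cameras=3))  # e.g. [0 0 0 -1]
```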

local_test_pipeline.sh

Lines changed: 0 additions & 10 deletions
@@ -69,13 +69,3 @@ echo "Running LangChain configuration tests"
 pytest tests/config/test_langchain_config.py
 echo "Done..."
 echo "==============================================="
-
-echo "Running LangChain wrapper tools tests"
-pytest tests/tools/test_langchain_tools.py
-echo "Done..."
-echo "==============================================="
-
-echo "Running LangChain memory adapter tests"
-pytest tests/memory/test_langchain_adapter.py
-echo "Done..."
-echo "==============================================="
