The iot_fetcher project collects and processes data from various IoT devices. It provides a framework for fetching data from sensors, storing it in InfluxDB, and analyzing it to gain insights, with the goal of simplifying the integration and management of IoT devices.
- Aqualink - Connects to the Aqualink pool pump to fetch temperature and settings
- Airquality - Connects to the Google Maps Air Quality API
- Balboa - Connects to a Balboa spa to fetch temperature and settings
- Elpris - Fetches energy prices for the Swedish price areas SE1-SE4
- Ngenic - Fetches temperature and settings from Ngenic
- Tapo - Connects to TP-Link Tapo smart devices to fetch power state, energy usage, and device metrics
- InfluxDB Backup - Automated backup system that exports all buckets to Google Cloud Storage every 12 hours
- `INFLUX_HOST` - InfluxDB server hostname/IP
- `INFLUX_TOKEN` - InfluxDB authentication token
- `INFLUX_ORG` - InfluxDB organization name
- `INFLUX_BUCKET` - InfluxDB bucket name for data storage
- `TAPO_EMAIL` - Your TP-Link/Tapo account email address
- `TAPO_PASSWORD` - Your TP-Link/Tapo account password
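For example, a minimal `.env` might look like this (all values are placeholders):

```
INFLUX_HOST=your-influxdb-host
INFLUX_TOKEN=your-influxdb-token
INFLUX_ORG=your-org
INFLUX_BUCKET=your-bucket
[email protected]
TAPO_PASSWORD=your-tp-link-password
```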
Run the Tapo module individually for testing:
```bash
docker run --rm --env-file .env iot-fetcher:latest -- tapo
```
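The same pattern should work for the other modules, assuming the module argument matches the lowercase module name, e.g. for Elpris:

```bash
docker run --rm --env-file .env iot-fetcher:latest -- elpris
```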
The InfluxDB backup job works as follows:

- Schedule: Runs automatically every 12 hours
- Storage: Google Cloud Storage with date-based folders (`backup/YYYY-MM-DD/`)
- Process: Exports all buckets sequentially, compresses with gzip, uploads to GCS
- Memory optimized: Processes one bucket at a time to minimize memory usage
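To inspect a day's backups in GCS, you can list the corresponding date folder (bucket name and date are placeholders, matching the layout above):

```bash
gsutil ls gs://your-bucket/backup/2025-08-20/
```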
```
GOOGLE_BACKUP_URI=gcs://your-bucket/backup/%Y-%m-%d/
GOOGLE_SERVICE_ACCOUNT='{"type":"service_account",...}'
```

```bash
# Run backup manually inside the container
docker exec iot-fetcher python python/src/main.py backup_influx

# Download backup from GCS (on host machine)
gsutil cp gs://your-bucket/backup/2025-08-20/bucket_name_timestamp.tar.gz ./backup.tar.gz

# Extract backup locally
tar -xzf backup.tar.gz

# Copy extracted backup into container
docker cp ./backup_bucket_name iot-fetcher:/tmp/backup_bucket_name

# Restore inside the container using the influx CLI into a new bucket
docker exec iot-fetcher influx restore /tmp/backup_bucket_name -o your-org -b your-bucket-old

# Migrate data from the restored bucket into the original bucket
docker exec iot-fetcher influx query -o your-org -q 'from(bucket: "your-bucket-old") |> range(start: -5y, stop: now()) |> to(bucket: "your-bucket")'
```
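Optionally, you can sanity-check the restored bucket before migrating by counting the points it contains, mirroring the query invocation above (org and bucket names are placeholders):

```bash
# Count points per series in the restored bucket
docker exec iot-fetcher influx query -o your-org -q 'from(bucket: "your-bucket-old") |> range(start: -5y, stop: now()) |> count()'
```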
The Node.js service automatically discovers and monitors TP-Link Tapo smart plugs using the TP-Link cloud API.

Add your TP-Link credentials to your .env file:
[email protected]
TAPO_PASSWORD=your-tp-link-password- Device Status: Power on/off state, uptime
- Signal Quality: WiFi signal level and RSSI
- Energy Usage: Current power consumption, daily and monthly energy totals
- Device Information: Device ID, MAC address, alias, model
- Runs every 10 minutes
- Automatically discovers new devices
- Stores data in InfluxDB under the `tapo_device` measurement
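As a quick sanity check, you can query this measurement with the influx CLI inside the container, mirroring the query invocation used in the backup/restore example above (org and bucket names are placeholders):

```bash
# List tapo_device points written in the last hour
docker exec iot-fetcher influx query -o your-org -q 'from(bucket: "your-bucket") |> range(start: -1h) |> filter(fn: (r) => r._measurement == "tapo_device")'
```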
| Command | Description |
|---------|-------------|
| `make build` | Build the container and tag it |
| `make run` | Start the application inside the Docker container |
| `make run MODULE=<module-name>` | Start a specific module only |
| `make push` | Build and push the container |
| `make dev` | Rebuild the container with pydebugger installed and start the application with the debugger on port 5678 |
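For example, to run only the Tapo module (assuming the module name matches the one used with docker run above):

```bash
make run MODULE=tapo
```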