This component copies traffic data from Redis into DAOS (Distributed Asynchronous Object Storage) for persistent storage.
🚧 To be implemented by other team members
- Read traffic data from Redis
- Process and transform data as needed
- Store data persistently in DAOS
- Handle data lifecycle management
- Connect to same Redis instance as backend
- Read from Redis hashes (key format: packet:{dest_ip}:{source_ip}:{timestamp})
Each packet is stored in Redis as a hash under the key packet:{dest_ip}:{source_ip}:{timestamp}, with these fields:
- timestamp: Unix timestamp
- source_ip: Source IP address
- dest_ip: Destination IP address
- total_bytes: Total bytes (string)
- udp_packets: JSON array
- udp_bytes: JSON array
- tcp_packets: JSON array
- tcp_bytes: JSON array
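The hash layout above can be decoded with a small helper. This is a sketch, not part of the implemented component: `parse_packet_hash` is a hypothetical name, and it assumes the hash was read with string decoding enabled (e.g. redis-py with `decode_responses=True`), so every field arrives as a string.

```python
import json

def parse_packet_hash(fields: dict) -> dict:
    """Decode one packet hash as stored in Redis (all values arrive as strings)."""
    return {
        "timestamp": int(fields["timestamp"]),
        "source_ip": fields["source_ip"],
        "dest_ip": fields["dest_ip"],
        "total_bytes": int(fields["total_bytes"]),  # stored as a string in Redis
        # The four array fields are JSON-encoded lists
        "udp_packets": json.loads(fields["udp_packets"]),
        "udp_bytes": json.loads(fields["udp_bytes"]),
        "tcp_packets": json.loads(fields["tcp_packets"]),
        "tcp_bytes": json.loads(fields["tcp_bytes"]),
    }
```

With redis-py this would be fed from `r.hgetall(key)` on a client created with `decode_responses=True`.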
Use the shared traffic simulator for testing:
cd ../traffic-simulator
./setup.sh
source venv/bin/activate
python simulator.py --redis-host localhost

See ../traffic-simulator/README.md for details.
- DAOS client libraries
- Redis connection (localhost:6379 or remote)
- Access to traffic simulator
- Start Redis:
  docker run -d -p 6379:6379 redis/redis-stack-server:latest
- Generate test data: run the traffic simulator
- Develop DAOS client to read and store data
- Verify data persistence in DAOS
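The read-and-store loop described above might be sketched as follows. This is only a sketch under stated assumptions: `copy_packets` is a hypothetical name, `store` stands in for the not-yet-specified DAOS write path (its `(key, record)` signature is an assumption, not a DAOS API), and the Redis client is assumed to be redis-py created with `decode_responses=True`.

```python
import json

# JSON-encoded array fields in each packet hash (see the schema above)
ARRAY_FIELDS = ("udp_packets", "udp_bytes", "tcp_packets", "tcp_bytes")

def copy_packets(r, store, delete_after_copy=False) -> int:
    """Copy every packet hash from Redis to persistent storage.

    r:     a redis-py client (decode_responses=True assumed)
    store: callable(key, record) that writes one record to DAOS;
           this interface is a placeholder, not a fixed DAOS API.
    """
    copied = 0
    for key in r.scan_iter(match="packet:*"):  # non-blocking iteration via SCAN
        record = dict(r.hgetall(key))
        for field in ARRAY_FIELDS:
            record[field] = json.loads(record[field])
        store(key, record)
        if delete_after_copy:  # simple lifecycle policy: drop once persisted
            r.delete(key)
        copied += 1
    return copied
```

The `delete_after_copy` flag illustrates one possible lifecycle policy (remove from Redis once persisted); the real retention rules are still to be decided.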
- Main project: ../README.md
- Backend API: ../backend/README.md
- Traffic simulator: ../traffic-simulator/README.md