diff --git a/echo/README.md b/echo/README.md
new file mode 100644
index 000000000..927483ee9
--- /dev/null
+++ b/echo/README.md
@@ -0,0 +1,261 @@
+# Echo Protocol Interoperability Tests
+
+Pure-bash implementation of Echo protocol interoperability tests for libp2p implementations.
+
+## Overview
+
+The echo test suite validates that different libp2p implementations can successfully communicate using the Echo protocol (`/echo/1.0.0`). This suite focuses specifically on **js-libp2p** and **py-libp2p** interoperability, addressing the Universal Connectivity initiative requirements for 2026.
+
+**Implementations tested**:
+- **js-libp2p**: v1.x (Node.js Echo Server)
+- **py-libp2p**: v0.5.0 (Python Test Harness)
+
+**Transport protocols**:
+- **tcp**: TCP transport
+
+**Security and multiplexing**:
+- **Secure Channels**: noise
+- **Muxers**: yamux, mplex
+
+## What It Measures
+
+The Echo protocol test validates **full bidirectional stream read/write capabilities**, including:
+
+- **Stream Handling**: Proper stream setup, read, and write operations
+- **Muxing**: Stream multiplexing over a single connection
+- **Window Management**: Flow control and backpressure handling
+- **Payload Integrity**: Exact echo verification across different data types
+- **Connection Lifecycle**: Proper setup and teardown
+
+Each test runs a **JavaScript Echo Server** against a **Python Test Client** using Docker containers and measures:
+
+- **Connectivity**: Can the client connect to the server successfully?
+- **Protocol Compliance**: Does the server properly handle the `/echo/1.0.0` protocol?
+- **Data Integrity**: Does the echoed data exactly match the original payload?
+- **Payload Types**: Text, binary, and large data handling
+- **Error Handling**: Timeout and failure scenario management
+
+## Why Echo vs Ping?
+
+Echo protocol testing is critical because it exercises **full stream capabilities** that simple Ping tests often miss:
+
+- **Stream Muxing**: Multiple concurrent streams over one connection
+- **Window Management**: Flow control and backpressure scenarios
+- **Payload Integrity**: Large data transfers with verification
+- **Edge Cases**: Yamux hangs, stream reset handling, partial reads/writes
+
+Ping tests only verify basic connectivity, while Echo tests validate the complete stream handling stack.
+
+## Test Cases
+
+The test harness validates multiple payload scenarios:
+
+```python
+test_cases = [
+    b"Hello, Echo!",          # Text payload
+    b"\x00\x01\x02\x03\x04",  # Binary data
+    b"A" * 1024,              # Large payload (1KB)
+]
+```
+
+Each test case verifies:
+1. **Connection establishment** between the py-libp2p client and the js-libp2p server
+2. **Protocol negotiation** for `/echo/1.0.0`
+3. **Data transmission** from client to server
+4. **Echo response** from server back to client
+5. **Payload verification** ensuring an exact match
+6. **Connection cleanup** and resource management
+
+## How to Run Tests
+
+### Prerequisites
+
+Check dependencies:
+```bash
+./run.sh --check-deps
+```
+
+Required: bash 4.0+, docker 20.10+, yq 4.0+, redis (for coordination)
+
+### Basic Usage
+
+```bash
+# Run all echo interop tests
+./run.sh
+
+# Run with debug output
+./run.sh --debug
+
+# Run with custom timeout
+./run.sh --timeout 300
+
+# Run specific test combinations
+./run.sh --test-select "js-libp2p-echo-v1.x"
+```
+
+### Docker-based Testing
+
+For development and debugging:
+
+```bash
+# Run complete Docker orchestration test
+./test-echo.sh
+
+# This will:
+# 1. Start Redis coordination service
+# 2. Build and start JS echo server container
+# 3. Run Python test client container
+# 4. Verify echo protocol functionality
+# 5. Clean up all containers and networks
+```
+
+### Manual Testing
+
+For step-by-step debugging:
+
+```bash
+# Build Docker images
+docker build -t js-libp2p-echo:v1.x images/js-libp2p/v1.x/
+docker build -t py-libp2p-echo:v0.x images/py-libp2p/v0.x/
+
+# Create a shared network so containers can reach Redis by name
+docker network create echo-test
+
+# Start Redis
+docker run -d --name redis-test --network echo-test -p 6379:6379 redis:alpine
+
+# Start JS echo server
+docker run -d --name js-server \
+  --network echo-test \
+  -e REDIS_ADDR=redis://redis-test:6379 \
+  js-libp2p-echo:v1.x
+
+# Run Python test client
+docker run --rm --name py-client \
+  --network echo-test \
+  -e REDIS_ADDR=redis://redis-test:6379 \
+  py-libp2p-echo:v0.x
+```
+
+## Architecture
+
+```
+echo-interop/
+├── run.sh                   # Framework integration entry point
+├── test-echo.sh             # Docker orchestration script
+├── images.yaml              # Implementation definitions
+├── images/
+│   ├── js-libp2p/v1.x/      # JavaScript Echo Server
+│   │   ├── Dockerfile       # Node.js 18 Alpine container
+│   │   ├── package.json     # libp2p + redis dependencies
+│   │   └── src/index.js     # Echo server implementation
+│   └── py-libp2p/v0.x/      # Python Test Harness
+│       ├── Dockerfile       # Python 3.11 container
+│       ├── requirements.txt # libp2p + trio + redis deps
+│       └── main.py          # Trio-based test client
+└── README.md                # This file
+```
+
+### JavaScript Echo Server (`/echo/1.0.0`)
+
+The server implementation:
+- **Listens** on a configurable TCP port
+- **Handles** `/echo/1.0.0` protocol requests
+- **Reads** incoming stream data
+- **Echoes** the exact data back to the client
+- **Publishes** its multiaddr to Redis for client discovery
+- **Logs** all operations for debugging
+
+### Python Test Harness
+
+The client implementation:
+- **Discovers** the server multiaddr from Redis
+- **Connects** to the js-libp2p server
+- **Negotiates** the `/echo/1.0.0` protocol
+- **Sends** multiple test payloads
+- **Verifies** echo responses match exactly
+- **Reports** results in structured JSON format
+
+### Coordination
+
+Redis is used for container coordination:
+- **Server** publishes its multiaddr to the `js-echo-server-multiaddr` key
+- **Client** polls Redis until the server multiaddr is available
+- **Timeout** handling prevents indefinite waiting
+- **Cleanup** removes coordination data after tests
+
+## Test Results
+
+Successful test output:
+```json
+{
+  "test": "echo-protocol",
+  "transport": "tcp",
+  "security": "noise",
+  "muxer": "yamux",
+  "duration": 5.234,
+  "results": [
+    {"status": "passed", "data_length": 13},
+    {"status": "passed", "data_length": 5},
+    {"status": "passed", "data_length": 1024}
+  ],
+  "passed": true
+}
+```
+
+## Troubleshooting
+
+### Common Issues
+
+**Container build failures**:
+```bash
+# Clean Docker cache and rebuild
+docker system prune -f
+./test-echo.sh
+```
+
+**Redis connection errors**:
+```bash
+# Check that Redis is running
+docker ps | grep redis
+# Check network connectivity
+docker network ls
+```
+
+**libp2p version conflicts**:
+```bash
+# Check dependency versions
+docker run --rm js-libp2p-echo:v1.x npm list
+docker run --rm py-libp2p-echo:v0.x pip list
+```
+
+### Debug Mode
+
+Enable verbose logging:
+```bash
+# Framework debug
+./run.sh --debug
+
+# Container debug
+docker run --rm -e DEBUG=true js-libp2p-echo:v1.x
+```
+
+## Contributing
+
+This implementation follows test-plans conventions:
+
+1. **Framework Integration**: Uses `lib-test-execution.sh`
+2. **Docker Containers**: Isolated, reproducible environments
+3. **Configuration**: Structured `images.yaml` definitions
+4. **Error Handling**: Proper cleanup and timeout management
+5. **Documentation**: Comprehensive inline comments
+
+For modifications:
+1. Update implementation code in `images/*/src/`
+2. Rebuild Docker images
+3. Test locally with `./test-echo.sh`
+4. Verify framework integration with `./run.sh`
+
+## Related Work
+
+This implementation complements existing libp2p interop tests:
+- **nim-libp2p**: Existing Echo protocol tests
+- **go-libp2p**: Active Echo/Ping work (#1136)
+- **rust-libp2p**: Active interop development (#1142)
+
+Part of the **Universal Connectivity 2026 initiative** to ensure comprehensive cross-implementation compatibility.
\ No newline at end of file
diff --git a/echo/images.yaml b/echo/images.yaml
new file mode 100644
index 000000000..adc38ea65
--- /dev/null
+++ b/echo/images.yaml
@@ -0,0 +1,22 @@
+# Echo Protocol Interop - Implementation Definitions
+
+implementations:
+  # JavaScript Echo Server
+  - id: js-libp2p-echo-v1.x
+    source:
+      type: local
+      path: images/js-libp2p/v1.x
+      dockerfile: Dockerfile
+    transports: [tcp]
+    secureChannels: [noise]
+    muxers: [mplex, yamux]
+
+  # Python Echo Client
+  - id: py-libp2p-echo-v0.x
+    source:
+      type: local
+      path: images/py-libp2p/v0.x
+      dockerfile: Dockerfile
+    transports: [tcp]
+    secureChannels: [noise]
+    muxers: [mplex, yamux]
\ No newline at end of file
diff --git a/echo/images/js-libp2p/v1.x/Dockerfile b/echo/images/js-libp2p/v1.x/Dockerfile
new file mode 100644
index 000000000..d82d72763
--- /dev/null
+++ b/echo/images/js-libp2p/v1.x/Dockerfile
@@ -0,0 +1,31 @@
+FROM node:18-alpine
+
+# Install dumb-init for proper signal handling
+RUN apk add --no-cache dumb-init
+
+# Create app directory
+WORKDIR /app
+
+# Copy package files
+COPY package*.json ./
+
+# Install dependencies
+RUN npm ci --omit=dev
+
+# Copy source code
+COPY src/ ./src/
+
+# Create non-root user
+RUN addgroup -g 1001 -S nodejs && \
+    adduser -S nodejs -u 1001
+
+# Change ownership
+RUN chown -R nodejs:nodejs /app
+USER nodejs
+
+# The listen port is assigned at runtime, so no
+# static EXPOSE declaration is made here
+
+# Use dumb-init to handle signals properly
+ENTRYPOINT ["dumb-init", "--"]
+CMD ["node", "src/index.js"]
\ No newline at end of file
diff --git a/echo/images/js-libp2p/v1.x/package-lock.json
b/echo/images/js-libp2p/v1.x/package-lock.json new file mode 100644 index 000000000..d2d0f59dd --- /dev/null +++ b/echo/images/js-libp2p/v1.x/package-lock.json @@ -0,0 +1,888 @@ +{ + "name": "js-libp2p-echo-server", + "version": "1.0.0", + "lockfileVersion": 3, + "requires": true, + "packages": { + "": { + "name": "js-libp2p-echo-server", + "version": "1.0.0", + "license": "MIT", + "dependencies": { + "libp2p": "^3.1.3", + "redis": "^5.10.0" + } + }, + "node_modules/@chainsafe/is-ip": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/@chainsafe/is-ip/-/is-ip-2.1.0.tgz", + "integrity": "sha512-KIjt+6IfysQ4GCv66xihEitBjvhU/bixbbbFxdJ1sqCp4uJ0wuZiYBPhksZoy4lfaF0k9cwNzY5upEW/VWdw3w==", + "license": "MIT" + }, + "node_modules/@chainsafe/netmask": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/@chainsafe/netmask/-/netmask-2.0.0.tgz", + "integrity": "sha512-I3Z+6SWUoaljh3TBzCnCxjlUyN8tA+NAk5L6m9IxvCf1BENQTePzPMis97CoN/iMW1St3WN+AWCCRp+TTBRiDg==", + "license": "MIT", + "dependencies": { + "@chainsafe/is-ip": "^2.0.1" + } + }, + "node_modules/@dnsquery/dns-packet": { + "version": "6.1.1", + "resolved": "https://registry.npmjs.org/@dnsquery/dns-packet/-/dns-packet-6.1.1.tgz", + "integrity": "sha512-WXTuFvL3G+74SchFAtz3FgIYVOe196ycvGsMgvSH/8Goptb1qpIQtIuM4SOK9G9lhMWYpHxnXyy544ZhluFOew==", + "license": "MIT", + "dependencies": { + "@leichtgewicht/ip-codec": "^2.0.4", + "utf8-codec": "^1.0.0" + }, + "engines": { + "node": ">=6" + } + }, + "node_modules/@leichtgewicht/ip-codec": { + "version": "2.0.5", + "resolved": "https://registry.npmjs.org/@leichtgewicht/ip-codec/-/ip-codec-2.0.5.tgz", + "integrity": "sha512-Vo+PSpZG2/fmgmiNzYK9qWRh8h/CHrwD0mo1h1DzL4yzHNSfWYujGTYsWGreD000gcgmZ7K4Ys6Tx9TxtsKdDw==", + "license": "MIT" + }, + "node_modules/@libp2p/crypto": { + "version": "5.1.13", + "resolved": "https://registry.npmjs.org/@libp2p/crypto/-/crypto-5.1.13.tgz", + "integrity": 
"sha512-8NN9cQP3jDn+p9+QE9ByiEoZ2lemDFf/unTgiKmS3JF93ph240EUVdbCyyEgOMfykzb0okTM4gzvwfx9osJebQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/interface": "^3.1.0", + "@noble/curves": "^2.0.1", + "@noble/hashes": "^2.0.1", + "multiformats": "^13.4.0", + "protons-runtime": "^5.6.0", + "uint8arraylist": "^2.4.8", + "uint8arrays": "^5.1.0" + } + }, + "node_modules/@libp2p/interface": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/@libp2p/interface/-/interface-3.1.0.tgz", + "integrity": "sha512-RE7/XyvC47fQBe1cHxhMvepYKa5bFCUyFrrpj8PuM0E7JtzxU7F+Du5j4VXbg2yLDcToe0+j8mB7jvwE2AThYw==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@multiformats/dns": "^1.0.6", + "@multiformats/multiaddr": "^13.0.1", + "main-event": "^1.0.1", + "multiformats": "^13.4.0", + "progress-events": "^1.0.1", + "uint8arraylist": "^2.4.8" + } + }, + "node_modules/@libp2p/interface-internal": { + "version": "3.0.10", + "resolved": "https://registry.npmjs.org/@libp2p/interface-internal/-/interface-internal-3.0.10.tgz", + "integrity": "sha512-Gd/eQAoAlXqeCRJ6wOwcnTQ/SDe95bQow8osY8zq0nbfFBu26aChQHjAd+CjcCADJRh+Sd+7+dYG7BrhpxGt1A==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/interface": "^3.1.0", + "@libp2p/peer-collections": "^7.0.10", + "@multiformats/multiaddr": "^13.0.1", + "progress-events": "^1.0.1" + } + }, + "node_modules/@libp2p/logger": { + "version": "6.2.2", + "resolved": "https://registry.npmjs.org/@libp2p/logger/-/logger-6.2.2.tgz", + "integrity": "sha512-XtanXDT+TuMuZoCK760HGV1AmJsZbwAw5AiRUxWDbsZPwAroYq64nb41AHRu9Gyc0TK9YD+p72+5+FIxbw0hzw==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/interface": "^3.1.0", + "@multiformats/multiaddr": "^13.0.1", + "interface-datastore": "^9.0.1", + "multiformats": "^13.4.0", + "weald": "^1.1.0" + } + }, + "node_modules/@libp2p/multistream-select": { + "version": "7.0.10", + "resolved": 
"https://registry.npmjs.org/@libp2p/multistream-select/-/multistream-select-7.0.10.tgz", + "integrity": "sha512-6RAFctqWzwQ/qPaN3CxoueSs1b7pBVMZ+0n6G0kcsqVBj0wc4eB+dcJyUNrTV1NGgMCAl6tVAGztZaE8XZc9lw==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/interface": "^3.1.0", + "@libp2p/utils": "^7.0.10", + "it-length-prefixed": "^10.0.1", + "uint8arraylist": "^2.4.8", + "uint8arrays": "^5.1.0" + } + }, + "node_modules/@libp2p/peer-collections": { + "version": "7.0.10", + "resolved": "https://registry.npmjs.org/@libp2p/peer-collections/-/peer-collections-7.0.10.tgz", + "integrity": "sha512-OvlSY5N3J6q8U+EbTrQGbW8zdyOa3y7nz9Y3IbuE55tIiMd7pwm1U3Lknfb6IPkOWkHNfQDfCGGfGVQcMRodvQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/interface": "^3.1.0", + "@libp2p/peer-id": "^6.0.4", + "@libp2p/utils": "^7.0.10", + "multiformats": "^13.4.0" + } + }, + "node_modules/@libp2p/peer-id": { + "version": "6.0.4", + "resolved": "https://registry.npmjs.org/@libp2p/peer-id/-/peer-id-6.0.4.tgz", + "integrity": "sha512-Z3xK0lwwKn4bPg3ozEpPr1HxsRi2CxZdghOL+MXoFah/8uhJJHxHFA8A/jxtKn4BB8xkk6F8R5vKNIS05yaCYw==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/crypto": "^5.1.13", + "@libp2p/interface": "^3.1.0", + "multiformats": "^13.4.0", + "uint8arrays": "^5.1.0" + } + }, + "node_modules/@libp2p/peer-record": { + "version": "9.0.5", + "resolved": "https://registry.npmjs.org/@libp2p/peer-record/-/peer-record-9.0.5.tgz", + "integrity": "sha512-disk23OO00yD52O4VmItbDkjJZ/YZJsKbMsqNgVhr+D3PcM+KRpu9VVbiCnN5Tzn9XvFEHhrMJY7BPE+rvT5MQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/crypto": "^5.1.13", + "@libp2p/interface": "^3.1.0", + "@libp2p/peer-id": "^6.0.4", + "@multiformats/multiaddr": "^13.0.1", + "multiformats": "^13.4.0", + "protons-runtime": "^5.6.0", + "uint8-varint": "^2.0.4", + "uint8arraylist": "^2.4.8", + "uint8arrays": "^5.1.0" + } + }, + "node_modules/@libp2p/peer-store": { + "version": "12.0.10", + 
"resolved": "https://registry.npmjs.org/@libp2p/peer-store/-/peer-store-12.0.10.tgz", + "integrity": "sha512-fe/6m0vXny9pvCyaSjg2GisdSVgxtHYZtp6op1WNm8dBvYqRXLuqSYi0QGEbLtSDSL4SeE8BKZyadyk/tYAqfg==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/crypto": "^5.1.13", + "@libp2p/interface": "^3.1.0", + "@libp2p/peer-collections": "^7.0.10", + "@libp2p/peer-id": "^6.0.4", + "@libp2p/peer-record": "^9.0.5", + "@multiformats/multiaddr": "^13.0.1", + "interface-datastore": "^9.0.1", + "it-all": "^3.0.9", + "main-event": "^1.0.1", + "mortice": "^3.3.1", + "multiformats": "^13.4.0", + "protons-runtime": "^5.6.0", + "uint8arraylist": "^2.4.8", + "uint8arrays": "^5.1.0" + } + }, + "node_modules/@libp2p/utils": { + "version": "7.0.10", + "resolved": "https://registry.npmjs.org/@libp2p/utils/-/utils-7.0.10.tgz", + "integrity": "sha512-+mzD+7yLMoZ8+34y/iS9d1CnwHjJJ/qEsao9FckHf9T9tnVXEyLLu9TpzBCcGRm4fUK/QCSHK2AcZH50kkAFkw==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@chainsafe/is-ip": "^2.1.0", + "@chainsafe/netmask": "^2.0.0", + "@libp2p/crypto": "^5.1.13", + "@libp2p/interface": "^3.1.0", + "@libp2p/logger": "^6.2.2", + "@multiformats/multiaddr": "^13.0.1", + "@sindresorhus/fnv1a": "^3.1.0", + "any-signal": "^4.1.1", + "cborg": "^4.2.14", + "delay": "^7.0.0", + "is-loopback-addr": "^2.0.2", + "it-length-prefixed": "^10.0.1", + "it-pipe": "^3.0.1", + "it-pushable": "^3.2.3", + "it-stream-types": "^2.0.2", + "main-event": "^1.0.1", + "netmask": "^2.0.2", + "p-defer": "^4.0.1", + "p-event": "^7.0.0", + "race-signal": "^2.0.0", + "uint8-varint": "^2.0.4", + "uint8arraylist": "^2.4.8", + "uint8arrays": "^5.1.0" + } + }, + "node_modules/@multiformats/dns": { + "version": "1.0.13", + "resolved": "https://registry.npmjs.org/@multiformats/dns/-/dns-1.0.13.tgz", + "integrity": "sha512-yr4bxtA3MbvJ+2461kYIYMsiiZj/FIqKI64hE4SdvWJUdWF9EtZLar38juf20Sf5tguXKFUruluswAO6JsjS2w==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@dnsquery/dns-packet": 
"^6.1.1", + "@libp2p/interface": "^3.1.0", + "hashlru": "^2.3.0", + "p-queue": "^9.0.0", + "progress-events": "^1.0.0", + "uint8arrays": "^5.0.2" + } + }, + "node_modules/@multiformats/multiaddr": { + "version": "13.0.1", + "resolved": "https://registry.npmjs.org/@multiformats/multiaddr/-/multiaddr-13.0.1.tgz", + "integrity": "sha512-XToN915cnfr6Lr9EdGWakGJbPT0ghpg/850HvdC+zFX8XvpLZElwa8synCiwa8TuvKNnny6m8j8NVBNCxhIO3g==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@chainsafe/is-ip": "^2.0.1", + "multiformats": "^13.0.0", + "uint8-varint": "^2.0.1", + "uint8arrays": "^5.0.0" + } + }, + "node_modules/@multiformats/multiaddr-matcher": { + "version": "3.0.1", + "resolved": "https://registry.npmjs.org/@multiformats/multiaddr-matcher/-/multiaddr-matcher-3.0.1.tgz", + "integrity": "sha512-jvjwzCPysVTQ53F4KqwmcqZw73BqHMk0UUZrMP9P4OtJ/YHrfs122ikTqhVA2upe0P/Qz9l8HVlhEifVYB2q9A==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@multiformats/multiaddr": "^13.0.0" + } + }, + "node_modules/@noble/curves": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/@noble/curves/-/curves-2.0.1.tgz", + "integrity": "sha512-vs1Az2OOTBiP4q0pwjW5aF0xp9n4MxVrmkFBxc6EKZc6ddYx5gaZiAsZoq0uRRXWbi3AT/sBqn05eRPtn1JCPw==", + "license": "MIT", + "dependencies": { + "@noble/hashes": "2.0.1" + }, + "engines": { + "node": ">= 20.19.0" + }, + "funding": { + "url": "https://paulmillr.com/funding/" + } + }, + "node_modules/@noble/hashes": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/@noble/hashes/-/hashes-2.0.1.tgz", + "integrity": "sha512-XlOlEbQcE9fmuXxrVTXCTlG2nlRXa9Rj3rr5Ue/+tX+nmkgbX720YHh0VR3hBF9xDvwnb8D2shVGOwNx+ulArw==", + "license": "MIT", + "engines": { + "node": ">= 20.19.0" + }, + "funding": { + "url": "https://paulmillr.com/funding/" + } + }, + "node_modules/@redis/bloom": { + "version": "5.10.0", + "resolved": "https://registry.npmjs.org/@redis/bloom/-/bloom-5.10.0.tgz", + "integrity": 
"sha512-doIF37ob+l47n0rkpRNgU8n4iacBlKM9xLiP1LtTZTvz8TloJB8qx/MgvhMhKdYG+CvCY2aPBnN2706izFn/4A==", + "license": "MIT", + "engines": { + "node": ">= 18" + }, + "peerDependencies": { + "@redis/client": "^5.10.0" + } + }, + "node_modules/@redis/client": { + "version": "5.10.0", + "resolved": "https://registry.npmjs.org/@redis/client/-/client-5.10.0.tgz", + "integrity": "sha512-JXmM4XCoso6C75Mr3lhKA3eNxSzkYi3nCzxDIKY+YOszYsJjuKbFgVtguVPbLMOttN4iu2fXoc2BGhdnYhIOxA==", + "license": "MIT", + "dependencies": { + "cluster-key-slot": "1.1.2" + }, + "engines": { + "node": ">= 18" + } + }, + "node_modules/@redis/json": { + "version": "5.10.0", + "resolved": "https://registry.npmjs.org/@redis/json/-/json-5.10.0.tgz", + "integrity": "sha512-B2G8XlOmTPUuZtD44EMGbtoepQG34RCDXLZbjrtON1Djet0t5Ri7/YPXvL9aomXqP8lLTreaprtyLKF4tmXEEA==", + "license": "MIT", + "engines": { + "node": ">= 18" + }, + "peerDependencies": { + "@redis/client": "^5.10.0" + } + }, + "node_modules/@redis/search": { + "version": "5.10.0", + "resolved": "https://registry.npmjs.org/@redis/search/-/search-5.10.0.tgz", + "integrity": "sha512-3SVcPswoSfp2HnmWbAGUzlbUPn7fOohVu2weUQ0S+EMiQi8jwjL+aN2p6V3TI65eNfVsJ8vyPvqWklm6H6esmg==", + "license": "MIT", + "engines": { + "node": ">= 18" + }, + "peerDependencies": { + "@redis/client": "^5.10.0" + } + }, + "node_modules/@redis/time-series": { + "version": "5.10.0", + "resolved": "https://registry.npmjs.org/@redis/time-series/-/time-series-5.10.0.tgz", + "integrity": "sha512-cPkpddXH5kc/SdRhF0YG0qtjL+noqFT0AcHbQ6axhsPsO7iqPi1cjxgdkE9TNeKiBUUdCaU1DbqkR/LzbzPBhg==", + "license": "MIT", + "engines": { + "node": ">= 18" + }, + "peerDependencies": { + "@redis/client": "^5.10.0" + } + }, + "node_modules/@sindresorhus/fnv1a": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/@sindresorhus/fnv1a/-/fnv1a-3.1.0.tgz", + "integrity": "sha512-KV321z5m/0nuAg83W1dPLy85HpHDk7Sdi4fJbwvacWsEhAh+rZUW4ZfGcXmUIvjZg4ss2bcwNlRhJ7GBEUG08w==", + "license": "MIT", + "engines": { + 
"node": "^12.20.0 || ^14.13.1 || >=16.0.0" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/abort-error": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/abort-error/-/abort-error-1.0.1.tgz", + "integrity": "sha512-fxqCblJiIPdSXIUrxI0PL+eJG49QdP9SQ70qtB65MVAoMr2rASlOyAbJFOylfB467F/f+5BCLJJq58RYi7mGfg==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/any-signal": { + "version": "4.2.0", + "resolved": "https://registry.npmjs.org/any-signal/-/any-signal-4.2.0.tgz", + "integrity": "sha512-LndMvYuAPf4rC195lk7oSFuHOYFpOszIYrNYv0gHAvz+aEhE9qPZLhmrIz5pXP2BSsPOXvsuHDXEGaiQhIh9wA==", + "license": "Apache-2.0 OR MIT", + "engines": { + "node": ">=16.0.0", + "npm": ">=7.0.0" + } + }, + "node_modules/cborg": { + "version": "4.5.8", + "resolved": "https://registry.npmjs.org/cborg/-/cborg-4.5.8.tgz", + "integrity": "sha512-6/viltD51JklRhq4L7jC3zgy6gryuG5xfZ3kzpE+PravtyeQLeQmCYLREhQH7pWENg5pY4Yu/XCd6a7dKScVlw==", + "license": "Apache-2.0", + "bin": { + "cborg": "lib/bin.js" + } + }, + "node_modules/cluster-key-slot": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/cluster-key-slot/-/cluster-key-slot-1.1.2.tgz", + "integrity": "sha512-RMr0FhtfXemyinomL4hrWcYJxmX6deFdCxpJzhDttxgO1+bcCnkk+9drydLVDmAMG7NE6aN/fl4F7ucU/90gAA==", + "license": "Apache-2.0", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/datastore-core": { + "version": "11.0.2", + "resolved": "https://registry.npmjs.org/datastore-core/-/datastore-core-11.0.2.tgz", + "integrity": "sha512-0pN4hMcaCWcnUBo5OL/8j14Lt1l/p1v2VvzryRYeJAKRLqnFrzy2FhAQ7y0yTA63ki760ImQHfm2XlZrfIdFpQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@libp2p/logger": "^6.0.0", + "interface-datastore": "^9.0.0", + "interface-store": "^7.0.0", + "it-drain": "^3.0.9", + "it-filter": "^3.1.3", + "it-map": "^3.1.3", + "it-merge": "^3.0.11", + "it-pipe": "^3.0.1", + "it-sort": "^3.0.8", + "it-take": "^3.0.8" + } + }, + "node_modules/delay": { + 
"version": "7.0.0", + "resolved": "https://registry.npmjs.org/delay/-/delay-7.0.0.tgz", + "integrity": "sha512-C3vaGs818qzZjCvVJ98GQUMVyWeg7dr5w2Nwwb2t5K8G98jOyyVO2ti2bKYk5yoYElqH3F2yA53ykuEnwD6MCg==", + "license": "MIT", + "dependencies": { + "random-int": "^3.1.0", + "unlimited-timeout": "^0.1.0" + }, + "engines": { + "node": ">=20" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/eventemitter3": { + "version": "5.0.4", + "resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-5.0.4.tgz", + "integrity": "sha512-mlsTRyGaPBjPedk6Bvw+aqbsXDtoAyAzm5MO7JgU+yVRyMQ5O8bD4Kcci7BS85f93veegeCPkL8R4GLClnjLFw==", + "license": "MIT" + }, + "node_modules/hashlru": { + "version": "2.3.0", + "resolved": "https://registry.npmjs.org/hashlru/-/hashlru-2.3.0.tgz", + "integrity": "sha512-0cMsjjIC8I+D3M44pOQdsy0OHXGLVz6Z0beRuufhKa0KfaD2wGwAev6jILzXsd3/vpnNQJmWyZtIILqM1N+n5A==", + "license": "MIT" + }, + "node_modules/interface-datastore": { + "version": "9.0.2", + "resolved": "https://registry.npmjs.org/interface-datastore/-/interface-datastore-9.0.2.tgz", + "integrity": "sha512-jebn+GV/5LTDDoyicNIB4D9O0QszpPqT09Z/MpEWvf3RekjVKpXJCDguM5Au2fwIFxFDAQMZe5bSla0jMamCNg==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "interface-store": "^7.0.0", + "uint8arrays": "^5.1.0" + } + }, + "node_modules/interface-store": { + "version": "7.0.1", + "resolved": "https://registry.npmjs.org/interface-store/-/interface-store-7.0.1.tgz", + "integrity": "sha512-OPRRUO3Cs6Jr/t98BrJLQp1jUTPgrRH0PqFfuNoPAqd+J7ABN1tjFVjQdaOBiybYJTS/AyBSZnZVWLPvp3dW3w==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/is-loopback-addr": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/is-loopback-addr/-/is-loopback-addr-2.0.2.tgz", + "integrity": "sha512-26POf2KRCno/KTNL5Q0b/9TYnL00xEsSaLfiFRmjM7m7Lw7ZMmFybzzuX4CcsLAluZGd+niLUiMRxEooVE3aqg==", + "license": "MIT" + }, + "node_modules/is-network-error": { + "version": "1.3.0", + 
"resolved": "https://registry.npmjs.org/is-network-error/-/is-network-error-1.3.0.tgz", + "integrity": "sha512-6oIwpsgRfnDiyEDLMay/GqCl3HoAtH5+RUKW29gYkL0QA+ipzpDLA16yQs7/RHCSu+BwgbJaOUqa4A99qNVQVw==", + "license": "MIT", + "engines": { + "node": ">=16" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/it-all": { + "version": "3.0.9", + "resolved": "https://registry.npmjs.org/it-all/-/it-all-3.0.9.tgz", + "integrity": "sha512-fz1oJJ36ciGnu2LntAlE6SA97bFZpW7Rnt0uEc1yazzR2nKokZLr8lIRtgnpex4NsmaBcvHF+Z9krljWFy/mmg==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/it-drain": { + "version": "3.0.10", + "resolved": "https://registry.npmjs.org/it-drain/-/it-drain-3.0.10.tgz", + "integrity": "sha512-0w/bXzudlyKIyD1+rl0xUKTI7k4cshcS43LTlBiGFxI8K1eyLydNPxGcsVLsFVtKh1/ieS8AnVWt6KwmozxyEA==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/it-filter": { + "version": "3.1.4", + "resolved": "https://registry.npmjs.org/it-filter/-/it-filter-3.1.4.tgz", + "integrity": "sha512-80kWEKgiFEa4fEYD3mwf2uygo1dTQ5Y5midKtL89iXyjinruA/sNXl6iFkTcdNedydjvIsFhWLiqRPQP4fAwWQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "it-peekable": "^3.0.0" + } + }, + "node_modules/it-length-prefixed": { + "version": "10.0.1", + "resolved": "https://registry.npmjs.org/it-length-prefixed/-/it-length-prefixed-10.0.1.tgz", + "integrity": "sha512-BhyluvGps26u9a7eQIpOI1YN7mFgi8lFwmiPi07whewbBARKAG9LE09Odc8s1Wtbt2MB6rNUrl7j9vvfXTJwdQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "it-reader": "^6.0.1", + "it-stream-types": "^2.0.1", + "uint8-varint": "^2.0.1", + "uint8arraylist": "^2.0.0", + "uint8arrays": "^5.0.1" + }, + "engines": { + "node": ">=16.0.0", + "npm": ">=7.0.0" + } + }, + "node_modules/it-map": { + "version": "3.1.4", + "resolved": "https://registry.npmjs.org/it-map/-/it-map-3.1.4.tgz", + "integrity": "sha512-QB9PYQdE9fUfpVFYfSxBIyvKynUCgblb143c+ktTK6ZuKSKkp7iH58uYFzagqcJ5HcqIfn1xbfaralHWam+3fg==", + "license": 
"Apache-2.0 OR MIT", + "dependencies": { + "it-peekable": "^3.0.0" + } + }, + "node_modules/it-merge": { + "version": "3.0.12", + "resolved": "https://registry.npmjs.org/it-merge/-/it-merge-3.0.12.tgz", + "integrity": "sha512-nnnFSUxKlkZVZD7c0jYw6rDxCcAQYcMsFj27thf7KkDhpj0EA0g9KHPxbFzHuDoc6US2EPS/MtplkNj8sbCx4Q==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "it-queueless-pushable": "^2.0.0" + } + }, + "node_modules/it-parallel": { + "version": "3.0.13", + "resolved": "https://registry.npmjs.org/it-parallel/-/it-parallel-3.0.13.tgz", + "integrity": "sha512-85PPJ/O8q97Vj9wmDTSBBXEkattwfQGruXitIzrh0RLPso6RHfiVqkuTqBNufYYtB1x6PSkh0cwvjmMIkFEPHA==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "p-defer": "^4.0.1" + } + }, + "node_modules/it-peekable": { + "version": "3.0.8", + "resolved": "https://registry.npmjs.org/it-peekable/-/it-peekable-3.0.8.tgz", + "integrity": "sha512-7IDBQKSp/dtBxXV3Fj0v3qM1jftJ9y9XrWLRIuU1X6RdKqWiN60syNwP0fiDxZD97b8SYM58dD3uklIk1TTQAw==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/it-pipe": { + "version": "3.0.1", + "resolved": "https://registry.npmjs.org/it-pipe/-/it-pipe-3.0.1.tgz", + "integrity": "sha512-sIoNrQl1qSRg2seYSBH/3QxWhJFn9PKYvOf/bHdtCBF0bnghey44VyASsWzn5dAx0DCDDABq1hZIuzKmtBZmKA==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "it-merge": "^3.0.0", + "it-pushable": "^3.1.2", + "it-stream-types": "^2.0.1" + }, + "engines": { + "node": ">=16.0.0", + "npm": ">=7.0.0" + } + }, + "node_modules/it-pushable": { + "version": "3.2.3", + "resolved": "https://registry.npmjs.org/it-pushable/-/it-pushable-3.2.3.tgz", + "integrity": "sha512-gzYnXYK8Y5t5b/BnJUr7glfQLO4U5vyb05gPx/TyTw+4Bv1zM9gFk4YsOrnulWefMewlphCjKkakFvj1y99Tcg==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "p-defer": "^4.0.0" + } + }, + "node_modules/it-queue": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/it-queue/-/it-queue-1.1.1.tgz", + "integrity": 
"sha512-yeYCV22WF1QDyb3ylw+g3TGEdkmnoHUH2mc12QoGOQuxW4XP1V7Zd3BfsEF1iq2IFBwIK7wCPUcRLTAQVeZ3SQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "abort-error": "^1.0.1", + "it-pushable": "^3.2.3", + "main-event": "^1.0.0", + "race-event": "^1.3.0", + "race-signal": "^2.0.0" + } + }, + "node_modules/it-queueless-pushable": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/it-queueless-pushable/-/it-queueless-pushable-2.0.3.tgz", + "integrity": "sha512-USa5EzTvmQswOcVE7+o6qsj2o2G+6KHCxSogPOs23sGYkDWFidhqVO7dAvv6ve/Z+Q+nvxpEa9rrRo6VEK7w4Q==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "abort-error": "^1.0.1", + "p-defer": "^4.0.1", + "race-signal": "^2.0.0" + } + }, + "node_modules/it-reader": { + "version": "6.0.4", + "resolved": "https://registry.npmjs.org/it-reader/-/it-reader-6.0.4.tgz", + "integrity": "sha512-XCWifEcNFFjjBHtor4Sfaj8rcpt+FkY0L6WdhD578SCDhV4VUm7fCkF3dv5a+fTcfQqvN9BsxBTvWbYO6iCjTg==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "it-stream-types": "^2.0.1", + "uint8arraylist": "^2.0.0" + }, + "engines": { + "node": ">=16.0.0", + "npm": ">=7.0.0" + } + }, + "node_modules/it-sort": { + "version": "3.0.9", + "resolved": "https://registry.npmjs.org/it-sort/-/it-sort-3.0.9.tgz", + "integrity": "sha512-jsM6alGaPiQbcAJdzMsuMh00uJcI+kD9TBoScB8TR75zUFOmHvhSsPi+Dmh2zfVkcoca+14EbfeIZZXTUGH63w==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "it-all": "^3.0.0" + } + }, + "node_modules/it-stream-types": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/it-stream-types/-/it-stream-types-2.0.2.tgz", + "integrity": "sha512-Rz/DEZ6Byn/r9+/SBCuJhpPATDF9D+dz5pbgSUyBsCDtza6wtNATrz/jz1gDyNanC3XdLboriHnOC925bZRBww==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/it-take": { + "version": "3.0.9", + "resolved": "https://registry.npmjs.org/it-take/-/it-take-3.0.9.tgz", + "integrity": "sha512-XMeUbnjOcgrhFXPUqa7H0VIjYSV/BvyxxjCp76QHVAFDJw2LmR1SHxUFiqyGeobgzJr7P2ZwSRRJQGn4D2BVlA==", + 
"license": "Apache-2.0 OR MIT" + }, + "node_modules/libp2p": { + "version": "3.1.3", + "resolved": "https://registry.npmjs.org/libp2p/-/libp2p-3.1.3.tgz", + "integrity": "sha512-Jgl6Km1PfFTKR7krDNDxuuxQ6ya3D6VHFOi/XYJA539F62PmbxOQLd+nqbqozwB9BgJVTxaXRVmGTKo7dyrdQw==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "@chainsafe/is-ip": "^2.1.0", + "@chainsafe/netmask": "^2.0.0", + "@libp2p/crypto": "^5.1.13", + "@libp2p/interface": "^3.1.0", + "@libp2p/interface-internal": "^3.0.10", + "@libp2p/logger": "^6.2.2", + "@libp2p/multistream-select": "^7.0.10", + "@libp2p/peer-collections": "^7.0.10", + "@libp2p/peer-id": "^6.0.4", + "@libp2p/peer-store": "^12.0.10", + "@libp2p/utils": "^7.0.10", + "@multiformats/dns": "^1.0.6", + "@multiformats/multiaddr": "^13.0.1", + "@multiformats/multiaddr-matcher": "^3.0.1", + "any-signal": "^4.1.1", + "datastore-core": "^11.0.1", + "interface-datastore": "^9.0.1", + "it-merge": "^3.0.12", + "it-parallel": "^3.0.13", + "main-event": "^1.0.1", + "multiformats": "^13.4.0", + "p-defer": "^4.0.1", + "p-event": "^7.0.0", + "p-retry": "^7.0.0", + "progress-events": "^1.0.1", + "race-signal": "^2.0.0", + "uint8arrays": "^5.1.0" + } + }, + "node_modules/main-event": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/main-event/-/main-event-1.0.1.tgz", + "integrity": "sha512-NWtdGrAca/69fm6DIVd8T9rtfDII4Q8NQbIbsKQq2VzS9eqOGYs8uaNQjcuaCq/d9H/o625aOTJX2Qoxzqw0Pw==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/mortice": { + "version": "3.3.1", + "resolved": "https://registry.npmjs.org/mortice/-/mortice-3.3.1.tgz", + "integrity": "sha512-t3oESfijIPGsmsdLEKjF+grHfrbnKSXflJtgb1wY14cjxZpS6GnhHRXTxxzCAoCCnq1YYfpEPwY3gjiCPhOufQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "abort-error": "^1.0.0", + "it-queue": "^1.1.0", + "main-event": "^1.0.0" + } + }, + "node_modules/ms": { + "version": "3.0.0-canary.202508261828", + "resolved": "https://registry.npmjs.org/ms/-/ms-3.0.0-canary.202508261828.tgz", + 
"integrity": "sha512-NotsCoUCIUkojWCzQff4ttdCfIPoA1UGZsyQbi7KmqkNRfKCrvga8JJi2PknHymHOuor0cJSn/ylj52Cbt2IrQ==", + "license": "MIT", + "engines": { + "node": ">=18" + } + }, + "node_modules/multiformats": { + "version": "13.4.2", + "resolved": "https://registry.npmjs.org/multiformats/-/multiformats-13.4.2.tgz", + "integrity": "sha512-eh6eHCrRi1+POZ3dA+Dq1C6jhP1GNtr9CRINMb67OKzqW9I5DUuZM/3jLPlzhgpGeiNUlEGEbkCYChXMCc/8DQ==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/netmask": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/netmask/-/netmask-2.0.2.tgz", + "integrity": "sha512-dBpDMdxv9Irdq66304OLfEmQ9tbNRFnFTuZiLo+bD+r332bBmMJ8GBLXklIXXgxd3+v9+KUnZaUR5PJMa75Gsg==", + "license": "MIT", + "engines": { + "node": ">= 0.4.0" + } + }, + "node_modules/p-defer": { + "version": "4.0.1", + "resolved": "https://registry.npmjs.org/p-defer/-/p-defer-4.0.1.tgz", + "integrity": "sha512-Mr5KC5efvAK5VUptYEIopP1bakB85k2IWXaRC0rsh1uwn1L6M0LVml8OIQ4Gudg4oyZakf7FmeRLkMMtZW1i5A==", + "license": "MIT", + "engines": { + "node": ">=12" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/p-event": { + "version": "7.1.0", + "resolved": "https://registry.npmjs.org/p-event/-/p-event-7.1.0.tgz", + "integrity": "sha512-/lkPs5W1aC3cp6vqZefpdosOn65J571sWodyfOQiF0+tmDCpU+H8Atwpu0vQROCVUlZuToDN5eyTLsMLLc54mg==", + "license": "MIT", + "dependencies": { + "p-timeout": "^7.0.1" + }, + "engines": { + "node": ">=20" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/p-queue": { + "version": "9.1.0", + "resolved": "https://registry.npmjs.org/p-queue/-/p-queue-9.1.0.tgz", + "integrity": "sha512-O/ZPaXuQV29uSLbxWBGGZO1mCQXV2BLIwUr59JUU9SoH76mnYvtms7aafH/isNSNGwuEfP6W/4xD0/TJXxrizw==", + "license": "MIT", + "dependencies": { + "eventemitter3": "^5.0.1", + "p-timeout": "^7.0.0" + }, + "engines": { + "node": ">=20" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, 
+ "node_modules/p-retry": { + "version": "7.1.1", + "resolved": "https://registry.npmjs.org/p-retry/-/p-retry-7.1.1.tgz", + "integrity": "sha512-J5ApzjyRkkf601HpEeykoiCvzHQjWxPAHhyjFcEUP2SWq0+35NKh8TLhpLw+Dkq5TZBFvUM6UigdE9hIVYTl5w==", + "license": "MIT", + "dependencies": { + "is-network-error": "^1.1.0" + }, + "engines": { + "node": ">=20" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/p-timeout": { + "version": "7.0.1", + "resolved": "https://registry.npmjs.org/p-timeout/-/p-timeout-7.0.1.tgz", + "integrity": "sha512-AxTM2wDGORHGEkPCt8yqxOTMgpfbEHqF51f/5fJCmwFC3C/zNcGT63SymH2ttOAaiIws2zVg4+izQCjrakcwHg==", + "license": "MIT", + "engines": { + "node": ">=20" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/progress-events": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/progress-events/-/progress-events-1.0.1.tgz", + "integrity": "sha512-MOzLIwhpt64KIVN64h1MwdKWiyKFNc/S6BoYKPIVUHFg0/eIEyBulhWCgn678v/4c0ri3FdGuzXymNCv02MUIw==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/protons-runtime": { + "version": "5.6.0", + "resolved": "https://registry.npmjs.org/protons-runtime/-/protons-runtime-5.6.0.tgz", + "integrity": "sha512-/Kde+sB9DsMFrddJT/UZWe6XqvL7SL5dbag/DBCElFKhkwDj7XKt53S+mzLyaDP5OqS0wXjV5SA572uWDaT0Hg==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "uint8-varint": "^2.0.2", + "uint8arraylist": "^2.4.3", + "uint8arrays": "^5.0.1" + } + }, + "node_modules/race-event": { + "version": "1.6.1", + "resolved": "https://registry.npmjs.org/race-event/-/race-event-1.6.1.tgz", + "integrity": "sha512-vi7WH5g5KoTFpu2mme/HqZiWH14XSOtg5rfp6raBskBHl7wnmy3F/biAIyY5MsK+BHWhoPhxtZ1Y2R7OHHaWyQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "abort-error": "^1.0.1" + } + }, + "node_modules/race-signal": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/race-signal/-/race-signal-2.0.0.tgz", + "integrity": 
"sha512-P31bLhE4ByBX/70QDXMutxnqgwrF1WUXea1O8DXuviAgkdbQ1iQMQotNgzJIBC9yUSn08u/acZrMUhgw7w6GpA==", + "license": "Apache-2.0 OR MIT" + }, + "node_modules/random-int": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/random-int/-/random-int-3.1.0.tgz", + "integrity": "sha512-h8CRz8cpvzj0hC/iH/1Gapgcl2TQ6xtnCpyOI5WvWfXf/yrDx2DOU+tD9rX23j36IF11xg1KqB9W11Z18JPMdw==", + "license": "MIT", + "engines": { + "node": ">=12" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/redis": { + "version": "5.10.0", + "resolved": "https://registry.npmjs.org/redis/-/redis-5.10.0.tgz", + "integrity": "sha512-0/Y+7IEiTgVGPrLFKy8oAEArSyEJkU0zvgV5xyi9NzNQ+SLZmyFbUsWIbgPcd4UdUh00opXGKlXJwMmsis5Byw==", + "license": "MIT", + "dependencies": { + "@redis/bloom": "5.10.0", + "@redis/client": "5.10.0", + "@redis/json": "5.10.0", + "@redis/search": "5.10.0", + "@redis/time-series": "5.10.0" + }, + "engines": { + "node": ">= 18" + } + }, + "node_modules/supports-color": { + "version": "10.2.2", + "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-10.2.2.tgz", + "integrity": "sha512-SS+jx45GF1QjgEXQx4NJZV9ImqmO2NPz5FNsIHrsDjh2YsHnawpan7SNQ1o8NuhrbHZy9AZhIoCUiCeaW/C80g==", + "license": "MIT", + "engines": { + "node": ">=18" + }, + "funding": { + "url": "https://github.com/chalk/supports-color?sponsor=1" + } + }, + "node_modules/uint8-varint": { + "version": "2.0.4", + "resolved": "https://registry.npmjs.org/uint8-varint/-/uint8-varint-2.0.4.tgz", + "integrity": "sha512-FwpTa7ZGA/f/EssWAb5/YV6pHgVF1fViKdW8cWaEarjB8t7NyofSWBdOTyFPaGuUG4gx3v1O3PQ8etsiOs3lcw==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "uint8arraylist": "^2.0.0", + "uint8arrays": "^5.0.0" + } + }, + "node_modules/uint8arraylist": { + "version": "2.4.8", + "resolved": "https://registry.npmjs.org/uint8arraylist/-/uint8arraylist-2.4.8.tgz", + "integrity": 
"sha512-vc1PlGOzglLF0eae1M8mLRTBivsvrGsdmJ5RbK3e+QRvRLOZfZhQROTwH/OfyF3+ZVUg9/8hE8bmKP2CvP9quQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "uint8arrays": "^5.0.1" + } + }, + "node_modules/uint8arrays": { + "version": "5.1.0", + "resolved": "https://registry.npmjs.org/uint8arrays/-/uint8arrays-5.1.0.tgz", + "integrity": "sha512-vA6nFepEmlSKkMBnLBaUMVvAC4G3CTmO58C12y4sq6WPDOR7mOFYOi7GlrQ4djeSbP6JG9Pv9tJDM97PedRSww==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "multiformats": "^13.0.0" + } + }, + "node_modules/unlimited-timeout": { + "version": "0.1.0", + "resolved": "https://registry.npmjs.org/unlimited-timeout/-/unlimited-timeout-0.1.0.tgz", + "integrity": "sha512-D4g+mxFeQGQHzCfnvij+R35ukJ0658Zzudw7j16p4tBBbNasKkKM4SocYxqhwT5xA7a9JYWDzKkEFyMlRi5sng==", + "license": "MIT", + "engines": { + "node": ">=20" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/utf8-codec": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/utf8-codec/-/utf8-codec-1.0.0.tgz", + "integrity": "sha512-S/QSLezp3qvG4ld5PUfXiH7mCFxLKjSVZRFkB3DOjgwHuJPFDkInAXc/anf7BAbHt/D38ozDzL+QMZ6/7gsI6w==", + "license": "MIT" + }, + "node_modules/weald": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/weald/-/weald-1.1.1.tgz", + "integrity": "sha512-PaEQShzMCz8J/AD2N3dJMc1hTZWkJeLKS2NMeiVkV5KDHwgZe7qXLEzyodsT/SODxWDdXJJqocuwf3kHzcXhSQ==", + "license": "Apache-2.0 OR MIT", + "dependencies": { + "ms": "^3.0.0-canary.1", + "supports-color": "^10.0.0" + } + } + } +} diff --git a/echo/images/js-libp2p/v1.x/package.json b/echo/images/js-libp2p/v1.x/package.json new file mode 100644 index 000000000..39bc6210f --- /dev/null +++ b/echo/images/js-libp2p/v1.x/package.json @@ -0,0 +1,21 @@ +{ + "name": "js-libp2p-echo-server", + "version": "1.0.0", + "description": "JS-libp2p Echo Server for interoperability tests", + "main": "src/index.js", + "type": "module", + "scripts": { + "start": "node src/index.js" + }, + 
"keywords": [ + "libp2p", + "interop", + "echo" + ], + "author": "libp2p team", + "license": "MIT", + "dependencies": { + "libp2p": "^3.1.3", + "redis": "^5.10.0" + } +} diff --git a/echo/images/js-libp2p/v1.x/src/index.js b/echo/images/js-libp2p/v1.x/src/index.js new file mode 100644 index 000000000..cadae3ae9 --- /dev/null +++ b/echo/images/js-libp2p/v1.x/src/index.js @@ -0,0 +1,133 @@ +#!/usr/bin/env node + +/** + * Simple JS-libp2p Echo Server for interoperability tests + */ + +import { createLibp2p } from 'libp2p' +import { createClient } from 'redis' + +// Echo protocol ID +const ECHO_PROTOCOL = '/echo/1.0.0' + +// Environment configuration +const config = { + redisAddr: process.env.REDIS_ADDR || 'redis://localhost:6379', + port: parseInt(process.env.PORT || '0', 10), + host: process.env.HOST || '0.0.0.0' +} + +/** + * Echo protocol handler - pipes the stream back to the source + */ +async function handleEchoProtocol({ stream }) { + try { + console.error('Handling echo protocol request') + + // Read data from stream + const chunks = [] + for await (const chunk of stream.source) { + chunks.push(chunk) + } + + // Echo back the data + const data = new Uint8Array(chunks.reduce((acc, chunk) => acc + chunk.length, 0)) + let offset = 0 + for (const chunk of chunks) { + data.set(chunk, offset) + offset += chunk.length + } + + // Write back to stream + await stream.sink([data]) + + console.error(`Echoed ${data.length} bytes`) + } catch (error) { + console.error(`Echo protocol error: ${error.message}`) + } +} + +/** + * Publish multiaddr to Redis for coordination + */ +async function publishMultiaddr(multiaddr) { + let redisClient = null + + try { + redisClient = createClient({ url: config.redisAddr }) + await redisClient.connect() + + const key = 'js-echo-server-multiaddr' + await redisClient.rPush(key, multiaddr) + await redisClient.expire(key, 300) + + console.error(`Published multiaddr to Redis: ${multiaddr}`) + + } catch (error) { + console.error(`Redis error: 
${error.message}`) + } finally { + if (redisClient) { + try { + await redisClient.quit() + } catch (error) { + console.error(`Redis cleanup error: ${error.message}`) + } + } + } +} + +/** + * Main server function + */ +async function main() { + try { + // Create libp2p node with default configuration + const node = await createLibp2p({ + addresses: { + listen: [`/ip4/${config.host}/tcp/${config.port}`] + } + }) + + // Handle echo protocol + await node.handle(ECHO_PROTOCOL, handleEchoProtocol) + + // Start the node + await node.start() + + const multiaddrs = node.getMultiaddrs() + if (multiaddrs.length === 0) { + throw new Error('No listening addresses found') + } + + const multiaddr = multiaddrs[0].toString() + console.log(multiaddr) // Output to stdout for test coordination + + await publishMultiaddr(multiaddr) + + console.error('Echo server started successfully') + + // Graceful shutdown + const handleShutdown = async (signal) => { + console.error(`Received ${signal}, shutting down...`) + await node.stop() + process.exit(0) + } + + process.on('SIGINT', () => handleShutdown('SIGINT')) + process.on('SIGTERM', () => handleShutdown('SIGTERM')) + + // Keep running + await new Promise(() => {}) + + } catch (error) { + console.error(`Server error: ${error.message}`) + console.error(error.stack) + process.exit(1) + } +} + +main().catch((error) => { + console.error(`Fatal error: ${error.message}`) + console.error(error.stack) + process.exit(1) +}) \ No newline at end of file diff --git a/echo/images/py-libp2p/v0.x/Dockerfile b/echo/images/py-libp2p/v0.x/Dockerfile new file mode 100644 index 000000000..a52ca3ebd --- /dev/null +++ b/echo/images/py-libp2p/v0.x/Dockerfile @@ -0,0 +1,26 @@ +FROM python:3.11-slim + +# Install system dependencies +RUN apt-get update && apt-get install -y \ + gcc \ + libgmp-dev \ + && rm -rf /var/lib/apt/lists/* + +# Create app directory +WORKDIR /app + +# Copy requirements +COPY requirements.txt . 
+ +# Install Python dependencies +RUN pip install --no-cache-dir -r requirements.txt + +# Copy source code +COPY main.py . + +# Create non-root user +RUN useradd -m -u 1001 testuser +USER testuser + +# Run the test +CMD ["python", "main.py"] \ No newline at end of file diff --git a/echo/images/py-libp2p/v0.x/main.py b/echo/images/py-libp2p/v0.x/main.py new file mode 100644 index 000000000..109508d67 --- /dev/null +++ b/echo/images/py-libp2p/v0.x/main.py @@ -0,0 +1,124 @@ +#!/usr/bin/env python3 + +""" +Python test harness for js-libp2p Echo protocol interoperability tests +""" + +import os +import sys +import json +import time +import trio +import redis +from libp2p import new_host + +# Configuration from environment +TRANSPORT = os.getenv('TRANSPORT', 'tcp') +SECURITY = os.getenv('SECURITY', 'noise') +MUXER = os.getenv('MUXER', 'yamux') +REDIS_ADDR = os.getenv('REDIS_ADDR', 'redis://localhost:6379') +ECHO_PROTOCOL = '/echo/1.0.0' + +async def get_server_multiaddr(): + """Get JS echo server multiaddr from Redis""" + try: + r = redis.from_url(REDIS_ADDR) + + # Wait for server to publish multiaddr + for _ in range(30): # 30 second timeout + multiaddrs = r.lrange('js-echo-server-multiaddr', 0, -1) + if multiaddrs: + return multiaddrs[-1].decode('utf-8') + await trio.sleep(1) + + raise Exception("Timeout waiting for server multiaddr") + + except Exception as e: + print(f"Failed to get server multiaddr: {e}", file=sys.stderr) + raise + +async def echo_test(multiaddr: str, test_data: bytes): + """Perform echo test with the JS server""" + try: + # Create libp2p host + host = new_host() + await host.get_network().listen([]) + + # Parse multiaddr and connect to server + info = host.get_network().multiaddr_to_peer_info(multiaddr) + await host.connect(info) + + # Open echo protocol stream + stream = await host.new_stream(info.peer_id, [ECHO_PROTOCOL]) + + # Send test data + await stream.write(test_data) + await stream.close() + + # Read response + response = await 
stream.read() + + # Verify echo + if response == test_data: + return {"status": "passed", "data_length": len(test_data)} + else: + return {"status": "failed", "error": "Echo mismatch"} + + except Exception as e: + print(f"Echo test failed: {e}", file=sys.stderr) + return {"status": "failed", "error": str(e)} + +async def main(): + """Main test function""" + start_time = time.time() + + try: + # Get server multiaddr + multiaddr = await get_server_multiaddr() + print(f"Got server multiaddr: {multiaddr}", file=sys.stderr) + + # Test cases + test_cases = [ + b"Hello, Echo!", + b"\x00\x01\x02\x03\x04", # Binary data + b"A" * 1024, # Larger payload + ] + + results = [] + for i, test_data in enumerate(test_cases): + print(f"Running test case {i+1}", file=sys.stderr) + result = await echo_test(multiaddr, test_data) + results.append(result) + + # Output results as JSON to stdout + output = { + "test": "echo-protocol", + "transport": TRANSPORT, + "security": SECURITY, + "muxer": MUXER, + "duration": time.time() - start_time, + "results": results, + "passed": all(r["status"] == "passed" for r in results) + } + + print(json.dumps(output)) + + # Exit with appropriate code + sys.exit(0 if output["passed"] else 1) + + except Exception as e: + print(f"Test failed: {e}", file=sys.stderr) + output = { + "test": "echo-protocol", + "transport": TRANSPORT, + "security": SECURITY, + "muxer": MUXER, + "duration": time.time() - start_time, + "error": str(e), + "passed": False + } + print(json.dumps(output)) + sys.exit(1) + +if __name__ == "__main__": + trio.run(main) \ No newline at end of file diff --git a/echo/images/py-libp2p/v0.x/requirements.txt b/echo/images/py-libp2p/v0.x/requirements.txt new file mode 100644 index 000000000..751618940 --- /dev/null +++ b/echo/images/py-libp2p/v0.x/requirements.txt @@ -0,0 +1,3 @@ +libp2p==0.5.0 +trio>=0.26.0 +redis==4.5.4 \ No newline at end of file diff --git a/echo/lib/generate-dashboard.sh b/echo/lib/generate-dashboard.sh new file mode 100755 
index 000000000..91b4df22b
--- /dev/null
+++ b/echo/lib/generate-dashboard.sh
@@ -0,0 +1,234 @@
+#!/bin/bash
+# Generate HTML dashboard for Echo protocol test results
+# Usage: generate-dashboard.sh [results-dir]
+
+set -euo pipefail
+
+##### 1. SETUP
+
+RESULTS_DIR="${1:-results}"
+DASHBOARD_FILE="${RESULTS_DIR}/echo-dashboard.html"
+
+if [[ ! -d "${RESULTS_DIR}" ]]; then
+    echo "ERROR: Results directory not found: ${RESULTS_DIR}" >&2
+    exit 1
+fi
+
+# Source common libraries
+source "${SCRIPT_LIB_DIR}/lib-output-formatting.sh"
+
+##### 2. COLLECT RESULTS
+
+print_message "Generating Echo protocol dashboard..."
+
+# Find all result files (readarray is safe for paths containing spaces)
+readarray -t RESULT_FILES < <(find "${RESULTS_DIR}" -name "*.json" -type f)
+
+if [[ ${#RESULT_FILES[@]} -eq 0 ]]; then
+    echo "ERROR: No result files found in ${RESULTS_DIR}" >&2
+    exit 1
+fi
+
+# Parse results
+TOTAL_TESTS=0
+PASSED_TESTS=0
+FAILED_TESTS=0
+declare -A RESULTS_BY_COMBO
+
+for result_file in "${RESULT_FILES[@]}"; do
+    if [[ -f "${result_file}" ]]; then
+        TEST_NAME=$(jq -r '.test' "${result_file}" 2>/dev/null || echo "unknown")
+        RESULT=$(jq -r '.result' "${result_file}" 2>/dev/null || echo "FAIL")
+        SERVER=$(jq -r '.server' "${result_file}" 2>/dev/null || echo "unknown")
+        CLIENT=$(jq -r '.client' "${result_file}" 2>/dev/null || echo "unknown")
+        TRANSPORT=$(jq -r '.transport' "${result_file}" 2>/dev/null || echo "unknown")
+        SECURE=$(jq -r '.secureChannel' "${result_file}" 2>/dev/null || echo "unknown")
+        MUXER=$(jq -r '.muxer' "${result_file}" 2>/dev/null || echo "unknown")
+
+        COMBO_KEY="${SERVER}|${CLIENT}|${TRANSPORT}|${SECURE}|${MUXER}"
+        RESULTS_BY_COMBO["${COMBO_KEY}"]="${RESULT}"
+
+        # ((TOTAL_TESTS++)) returns the pre-increment value, which would trip
+        # `set -e` on the first file, so use plain arithmetic assignments.
+        TOTAL_TESTS=$((TOTAL_TESTS + 1))
+        if [[ "${RESULT}" == "PASS" ]]; then
+            PASSED_TESTS=$((PASSED_TESTS + 1))
+        else
+            FAILED_TESTS=$((FAILED_TESTS + 1))
+        fi
+    fi
+done
+
+##### 3. GENERATE HTML DASHBOARD
+
+cat > "${DASHBOARD_FILE}" << 'EOF'
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="utf-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1">
+    <title>Echo Protocol Interoperability Test Results</title>
+    <style>
+        body { font-family: sans-serif; margin: 2em; }
+        .stats { display: flex; gap: 2em; margin-bottom: 2em; }
+        .stat-value { font-size: 2em; font-weight: bold; }
+        table { border-collapse: collapse; }
+        th, td { border: 1px solid #ccc; padding: 0.4em 0.8em; text-align: left; }
+        .result-pass { color: green; }
+        .result-fail { color: red; }
+    </style>
+</head>
+<body>
+    <h1>🔄 Echo Protocol Interoperability Test Results</h1>
+
+    <div class="stats">
+        <div class="stat">
+            <div class="stat-value">TOTAL_TESTS_PLACEHOLDER</div>
+            <div class="stat-label">Total Tests</div>
+        </div>
+        <div class="stat">
+            <div class="stat-value">PASSED_TESTS_PLACEHOLDER</div>
+            <div class="stat-label">Passed</div>
+        </div>
+        <div class="stat">
+            <div class="stat-value">FAILED_TESTS_PLACEHOLDER</div>
+            <div class="stat-label">Failed</div>
+        </div>
+    </div>
+
+    <table>
+        <thead>
+            <tr>
+                <th>Server</th>
+                <th>Client</th>
+                <th>Transport</th>
+                <th>Security</th>
+                <th>Muxer</th>
+                <th>Protocol</th>
+                <th>Result</th>
+            </tr>
+        </thead>
+        <tbody>
+            RESULTS_TABLE_PLACEHOLDER
+        </tbody>
+    </table>
+
+    <p>Generated: TIMESTAMP_PLACEHOLDER</p>
+</body>
+</html>
+EOF
+
+# Replace placeholders
+sed -i.bak "s/TOTAL_TESTS_PLACEHOLDER/${TOTAL_TESTS}/g" "${DASHBOARD_FILE}"
+sed -i.bak "s/PASSED_TESTS_PLACEHOLDER/${PASSED_TESTS}/g" "${DASHBOARD_FILE}"
+sed -i.bak "s/FAILED_TESTS_PLACEHOLDER/${FAILED_TESTS}/g" "${DASHBOARD_FILE}"
+sed -i.bak "s/TIMESTAMP_PLACEHOLDER/$(date -u +"%Y-%m-%d %H:%M:%S UTC")/g" "${DASHBOARD_FILE}"
+
+# Generate results table
+TABLE_ROWS=""
+for combo_key in "${!RESULTS_BY_COMBO[@]}"; do
+    IFS='|' read -r server client transport secure muxer <<< "${combo_key}"
+    result="${RESULTS_BY_COMBO[${combo_key}]}"
+
+    if [[ "${result}" == "PASS" ]]; then
+        result_class="result-pass"
+        result_symbol="✅ PASS"
+    else
+        result_class="result-fail"
+        result_symbol="❌ FAIL"
+    fi
+
+    TABLE_ROWS+="<tr>"
+    TABLE_ROWS+="<td>${server}</td>"
+    TABLE_ROWS+="<td>${client}</td>"
+    TABLE_ROWS+="<td>${transport}</td>"
+    TABLE_ROWS+="<td>${secure}</td>"
+    TABLE_ROWS+="<td>${muxer}</td>"
+    TABLE_ROWS+="<td>/echo/1.0.0</td>"
+    TABLE_ROWS+="<td class=\"${result_class}\">${result_symbol}</td>"
+    TABLE_ROWS+="</tr>"
+done
+
+sed -i.bak "s|RESULTS_TABLE_PLACEHOLDER|${TABLE_ROWS}|g" "${DASHBOARD_FILE}"
+
+# Clean up backup file
+rm -f "${DASHBOARD_FILE}.bak"
+
+##### 4. RESULTS
+
+print_message "Dashboard generated: ${DASHBOARD_FILE}"
+print_message "Summary: ${PASSED_TESTS}/${TOTAL_TESTS} tests passed"
+
+if [[ "${FAILED_TESTS}" -gt 0 ]]; then
+    print_message "⚠️ ${FAILED_TESTS} tests failed - check dashboard for details"
+fi
\ No newline at end of file
diff --git a/echo/lib/generate-tests.sh b/echo/lib/generate-tests.sh
new file mode 100755
index 000000000..f51ce332c
--- /dev/null
+++ b/echo/lib/generate-tests.sh
@@ -0,0 +1,210 @@
+#!/bin/bash
+# Generate test matrix for Echo protocol interoperability tests
+# Outputs test-matrix.yaml with content-addressed caching
+# Permutations: js-server × py-client × transport × secureChannel × muxer
+
+##### 1. SETUP
+
+set -euo pipefail
+
+trap 'echo "ERROR in generate-tests.sh at line $LINENO: Command exited with status $?"
>&2' ERR + +# Source common libraries +source "${SCRIPT_LIB_DIR}/lib-filter-engine.sh" +source "${SCRIPT_LIB_DIR}/lib-generate-tests.sh" +source "${SCRIPT_LIB_DIR}/lib-image-building.sh" +source "${SCRIPT_LIB_DIR}/lib-image-naming.sh" +source "${SCRIPT_LIB_DIR}/lib-output-formatting.sh" +source "${SCRIPT_LIB_DIR}/lib-test-caching.sh" +source "${SCRIPT_LIB_DIR}/lib-test-filtering.sh" +source "${SCRIPT_LIB_DIR}/lib-test-images.sh" + +##### 2. FILTER EXPANSION + +# Load test aliases +load_aliases + +# Get common entity IDs for negation expansion and ignored test generation +readarray -t all_image_ids < <(get_entity_ids "implementations") + +# All transport names +readarray -t all_transport_names < <(get_transport_names "implementations") + +# All secure channel names +readarray -t all_secure_names < <(get_secure_names "implementations") + +# All muxer names +readarray -t all_muxer_names < <(get_muxer_names "implementations") + +# Save original filters for display +ORIGINAL_IMPL_SELECT="${IMPL_SELECT}" +ORIGINAL_IMPL_IGNORE="${IMPL_IGNORE}" +ORIGINAL_TRANSPORT_SELECT="${TRANSPORT_SELECT}" +ORIGINAL_TRANSPORT_IGNORE="${TRANSPORT_IGNORE}" +ORIGINAL_SECURE_SELECT="${SECURE_SELECT}" +ORIGINAL_SECURE_IGNORE="${SECURE_IGNORE}" +ORIGINAL_MUXER_SELECT="${MUXER_SELECT}" +ORIGINAL_MUXER_IGNORE="${MUXER_IGNORE}" +ORIGINAL_TEST_SELECT="${TEST_SELECT}" +ORIGINAL_TEST_IGNORE="${TEST_IGNORE}" + +# Expand filter strings +IMPL_SELECT=$(expand_filter_string "${IMPL_SELECT}" all_image_ids) +IMPL_IGNORE=$(expand_filter_string "${IMPL_IGNORE}" all_image_ids) +TRANSPORT_SELECT=$(expand_filter_string "${TRANSPORT_SELECT}" all_transport_names) +TRANSPORT_IGNORE=$(expand_filter_string "${TRANSPORT_IGNORE}" all_transport_names) +SECURE_SELECT=$(expand_filter_string "${SECURE_SELECT}" all_secure_names) +SECURE_IGNORE=$(expand_filter_string "${SECURE_IGNORE}" all_secure_names) +MUXER_SELECT=$(expand_filter_string "${MUXER_SELECT}" all_muxer_names) +MUXER_IGNORE=$(expand_filter_string 
"${MUXER_IGNORE}" all_muxer_names)
+
+##### 3. CACHE MANAGEMENT
+
+# Compute cache key for test matrix
+CACHE_KEY=$(compute_test_cache_key)
+TEST_MATRIX_FILE="${CACHE_DIR}/test-matrix/echo-${CACHE_KEY}.yaml"
+
+# Check if a cached test matrix exists
+if [ -f "${TEST_MATRIX_FILE}" ]; then
+    print_message "Using cached test matrix: ${TEST_MATRIX_FILE}"
+    exit 0
+fi
+
+##### 4. HELPER FUNCTIONS
+
+# These must be defined before the generation block below, which calls them
+# at the top level of the script.
+
+get_implementation_transports() {
+    local impl_id="$1"
+    yq eval ".implementations[] | select(.id == \"${impl_id}\") | .transports[]" "${IMAGES_YAML}"
+}
+
+get_implementation_secures() {
+    local impl_id="$1"
+    yq eval ".implementations[] | select(.id == \"${impl_id}\") | .secureChannels[]" "${IMAGES_YAML}"
+}
+
+get_implementation_muxers() {
+    local impl_id="$1"
+    yq eval ".implementations[] | select(.id == \"${impl_id}\") | .muxers[]" "${IMAGES_YAML}"
+}
+
+get_common_elements() {
+    local -n arr1=$1
+    local -n arr2=$2
+    local common=()
+
+    for elem1 in "${arr1[@]}"; do
+        for elem2 in "${arr2[@]}"; do
+            if [[ "${elem1}" == "${elem2}" ]]; then
+                common+=("${elem1}")
+                break
+            fi
+        done
+    done
+
+    printf '%s\n' "${common[@]}"
+}
+
+should_include_test() {
+    local test_name="$1"
+    local select_filter="$2"
+    local ignore_filter="$3"
+
+    # Apply test selection filter
+    if [[ -n "${select_filter}" ]]; then
+        if ! [[ "${test_name}" =~ ${select_filter} ]]; then
+            return 1
+        fi
+    fi
+
+    # Apply test ignore filter
+    if [[ -n "${ignore_filter}" ]]; then
+        if [[ "${test_name}" =~ ${ignore_filter} ]]; then
+            return 1
+        fi
+    fi
+
+    return 0
+}
+
+##### 5. GENERATE TEST MATRIX
+
+print_message "Generating Echo protocol test matrix..."
+
+# Create cache directory
+mkdir -p "$(dirname "${TEST_MATRIX_FILE}")"
+
+# Generate test combinations.  This group command runs at the top level of
+# the script (not inside a function), so `local` cannot be used here.
+{
+    echo "# Echo Protocol Interoperability Test Matrix"
+    echo "# Generated: $(date -u +"%Y-%m-%d %H:%M:%S UTC")"
+    echo "# Cache Key: ${CACHE_KEY}"
+    echo ""
+    echo "tests:"
+
+    # Get filtered implementations
+    js_servers=()
+    py_clients=()
+
+    while IFS= read -r impl_id; do
+        if [[ "${impl_id}" == *"js-libp2p"* ]]; then
+            js_servers+=("${impl_id}")
+        elif [[ "${impl_id}" == *"py-libp2p"* ]]; then
+            py_clients+=("${impl_id}")
+        fi
+    done < <(filter_names "${IMPL_SELECT}" "${IMPL_IGNORE}" all_image_ids)
+
+    # Generate test combinations
+    test_count=0
+    for server in "${js_servers[@]}"; do
+        for client in "${py_clients[@]}"; do
+            # Get supported protocols for this combination
+            server_transports=($(get_implementation_transports "${server}"))
+            client_transports=($(get_implementation_transports "${client}"))
+            common_transports=($(get_common_elements server_transports client_transports))
+
+            server_secures=($(get_implementation_secures "${server}"))
+            client_secures=($(get_implementation_secures "${client}"))
+            common_secures=($(get_common_elements server_secures client_secures))
+
+            server_muxers=($(get_implementation_muxers "${server}"))
+            client_muxers=($(get_implementation_muxers "${client}"))
+            common_muxers=($(get_common_elements server_muxers client_muxers))
+
+            # Filter protocols
+            filtered_transports=($(filter_names "${TRANSPORT_SELECT}" "${TRANSPORT_IGNORE}" common_transports))
+            filtered_secures=($(filter_names "${SECURE_SELECT}" "${SECURE_IGNORE}" common_secures))
+            filtered_muxers=($(filter_names "${MUXER_SELECT}" "${MUXER_IGNORE}" common_muxers))
+
+            # Generate a test entry for each combination
+            for transport in "${filtered_transports[@]}"; do
+                for secure in "${filtered_secures[@]}"; do
+                    for muxer in "${filtered_muxers[@]}"; do
+                        test_name="echo-${server}-${client}-${transport}-${secure}-${muxer}"
+
+                        # Apply test-level filtering
+                        if should_include_test "${test_name}" "${TEST_SELECT}" "${TEST_IGNORE}"; then
+                            cat << EOF
+  - name: "${test_name}"
+    server: "${server}"
+    client: "${client}"
+    transport: "${transport}"
+    secureChannel: "${secure}"
+    muxer: "${muxer}"
+    protocol: "/echo/1.0.0"
+    timeout: 300
+EOF
+                            # ((test_count++)) would abort under `set -e` when
+                            # the counter is 0, so use a plain assignment.
+                            test_count=$((test_count + 1))
+                        fi
+                    done
+                done
+            done
+        done
+    done
+
+    echo ""
+    echo "# Total tests: ${test_count}"
+
+} > "${TEST_MATRIX_FILE}"
+
+print_message "Generated ${test_count} Echo protocol tests"
+print_message "Test matrix saved: ${TEST_MATRIX_FILE}"
\ No newline at end of file
diff --git a/echo/lib/run-single-test.sh b/echo/lib/run-single-test.sh
new file mode 100755
index 000000000..a3c904d5b
--- /dev/null
+++ b/echo/lib/run-single-test.sh
@@ -0,0 +1,156 @@
+#!/bin/bash
+# Run a single Echo protocol interoperability test
+# Usage: run-single-test.sh <test-name>
+
+set -euo pipefail
+
+##### 1. SETUP
+
+TEST_NAME="${1:-}"
+if [[ -z "${TEST_NAME}" ]]; then
+    echo "ERROR: Test name required" >&2
+    echo "Usage: $0 <test-name>" >&2
+    exit 1
+fi
+
+# Source common libraries
+source "${SCRIPT_LIB_DIR}/lib-output-formatting.sh"
+source "${SCRIPT_LIB_DIR}/lib-test-caching.sh"
+
+##### 2. LOAD TEST DEFINITION
+
+CACHE_KEY=$(compute_test_cache_key)
+TEST_MATRIX_FILE="${CACHE_DIR}/test-matrix/echo-${CACHE_KEY}.yaml"
+
+if [[ !
-f "${TEST_MATRIX_FILE}" ]]; then + echo "ERROR: Test matrix not found: ${TEST_MATRIX_FILE}" >&2 + echo "Run generate-tests.sh first" >&2 + exit 1 +fi + +# Extract test definition +TEST_DEF=$(yq eval ".tests[] | select(.name == \"${TEST_NAME}\")" "${TEST_MATRIX_FILE}") +if [[ -z "${TEST_DEF}" ]]; then + echo "ERROR: Test not found: ${TEST_NAME}" >&2 + exit 1 +fi + +# Parse test parameters +SERVER_ID=$(echo "${TEST_DEF}" | yq eval '.server' -) +CLIENT_ID=$(echo "${TEST_DEF}" | yq eval '.client' -) +TRANSPORT=$(echo "${TEST_DEF}" | yq eval '.transport' -) +SECURE_CHANNEL=$(echo "${TEST_DEF}" | yq eval '.secureChannel' -) +MUXER=$(echo "${TEST_DEF}" | yq eval '.muxer' -) +PROTOCOL=$(echo "${TEST_DEF}" | yq eval '.protocol' -) +TIMEOUT=$(echo "${TEST_DEF}" | yq eval '.timeout' -) + +##### 3. DOCKER COMPOSE SETUP + +COMPOSE_FILE="${CACHE_DIR}/test-docker-compose/echo-${CACHE_KEY}-${TEST_NAME}.yaml" + +if [[ ! -f "${COMPOSE_FILE}" ]]; then + echo "ERROR: Docker compose file not found: ${COMPOSE_FILE}" >&2 + echo "Run generate-tests.sh first" >&2 + exit 1 +fi + +##### 4. RUN TEST + +print_message "Running Echo test: ${TEST_NAME}" +print_message " Server: ${SERVER_ID}" +print_message " Client: ${CLIENT_ID}" +print_message " Transport: ${TRANSPORT}" +print_message " Security: ${SECURE_CHANNEL}" +print_message " Muxer: ${MUXER}" +print_message " Protocol: ${PROTOCOL}" + +# Create test network +NETWORK_NAME="echo-test-${TEST_NAME}" +docker network create "${NETWORK_NAME}" 2>/dev/null || true + +# Start Redis coordination service +print_message "Starting Redis coordination service..." +REDIS_CONTAINER="redis-${TEST_NAME}" +docker run -d --name "${REDIS_CONTAINER}" \ + --network "${NETWORK_NAME}" \ + redis:alpine >/dev/null + +# Wait for Redis to be ready +sleep 2 + +# Start Echo server +print_message "Starting Echo server (${SERVER_ID})..." 
+SERVER_CONTAINER="server-${TEST_NAME}" +docker run -d --name "${SERVER_CONTAINER}" \ + --network "${NETWORK_NAME}" \ + -e REDIS_ADDR="redis://${REDIS_CONTAINER}:6379" \ + -e TRANSPORT="${TRANSPORT}" \ + -e SECURITY="${SECURE_CHANNEL}" \ + -e MUXER="${MUXER}" \ + "${SERVER_ID}" >/dev/null + +# Wait for server to start and publish multiaddr +print_message "Waiting for server to start..." +sleep 5 + +# Run Echo client test +print_message "Running Echo client test (${CLIENT_ID})..." +CLIENT_CONTAINER="client-${TEST_NAME}" + +# Run client and capture output +if docker run --name "${CLIENT_CONTAINER}" \ + --network "${NETWORK_NAME}" \ + -e REDIS_ADDR="redis://${REDIS_CONTAINER}:6379" \ + -e TRANSPORT="${TRANSPORT}" \ + -e SECURITY="${SECURE_CHANNEL}" \ + -e MUXER="${MUXER}" \ + -e TIMEOUT="${TIMEOUT}" \ + "${CLIENT_ID}" 2>/dev/null; then + + TEST_RESULT="PASS" + print_message "✅ Test PASSED: ${TEST_NAME}" +else + TEST_RESULT="FAIL" + print_message "❌ Test FAILED: ${TEST_NAME}" + + # Show container logs for debugging + echo "=== Server logs ===" >&2 + docker logs "${SERVER_CONTAINER}" 2>&1 | tail -20 >&2 + echo "=== Client logs ===" >&2 + docker logs "${CLIENT_CONTAINER}" 2>&1 | tail -20 >&2 +fi + +##### 5. CLEANUP + +print_message "Cleaning up test containers..." + +# Stop and remove containers +docker stop "${REDIS_CONTAINER}" "${SERVER_CONTAINER}" 2>/dev/null || true +docker rm "${REDIS_CONTAINER}" "${SERVER_CONTAINER}" "${CLIENT_CONTAINER}" 2>/dev/null || true + +# Remove network +docker network rm "${NETWORK_NAME}" 2>/dev/null || true + +##### 6. 
RESULTS + +# Output test result in structured format +cat << EOF +{ + "test": "${TEST_NAME}", + "server": "${SERVER_ID}", + "client": "${CLIENT_ID}", + "transport": "${TRANSPORT}", + "secureChannel": "${SECURE_CHANNEL}", + "muxer": "${MUXER}", + "protocol": "${PROTOCOL}", + "result": "${TEST_RESULT}", + "timestamp": "$(date -u +"%Y-%m-%dT%H:%M:%SZ")" +} +EOF + +# Exit with appropriate code +if [[ "${TEST_RESULT}" == "PASS" ]]; then + exit 0 +else + exit 1 +fi \ No newline at end of file diff --git a/echo/run.sh b/echo/run.sh new file mode 100755 index 000000000..6107d57aa --- /dev/null +++ b/echo/run.sh @@ -0,0 +1,338 @@ +#!/bin/bash + +# run in strict failure mode +set -euo pipefail + +# ╔═══╗ ╔═══╗ ╔╗ ╔╗ ╔═══╗ +# ▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ ║╔══╝ ║╔═╗║ ║║ ║║ ║╔═╗║ ▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ +# ═══════════════════════════════ ║╚══╗ ║║ ║║ ║╚══╝║ ║║ ║║ ═════════════════════════════════ +# ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔ ╚═══╝ ╚╝ ╚╝ ╚════╝ ╚╝ ╚╝ ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔ + +# ============================================================================= +# STEP 1: BOOTSTRAP: Load inputs.yaml BEFORE setting SCRIPT_LIB_DIR +# ----------------------------------------------------------------------------- +# This allows for the re-creation of the environment and command line arguments +# from a previous test run. If the inputs.yaml file doesn't exist, then this +# script will run from a default environment. Any command line arguments passed +# will override any command line arguments loaded from inputs.yaml, however the +# environment variables in inputs.yaml will override the ones initialized in +# the shell executing this script. 
+# ============================================================================= + +# Capture original arguments for inputs.yaml generation +ORIGINAL_ARGS=("$@") + +# Change to script directory +cd "$(dirname "$0")" + +# Loads and exports the environment variables from the inputs yaml file +load_inputs_yaml_inline() { + local inputs_file="${1:-inputs.yaml}" + + # Look for the inputs file if it exists + if [ ! -f "${inputs_file}" ]; then + return 1 + fi + + echo "→ Loading configuration from ${inputs_file}" + + # Load and export the environment variables from the inputs file + while IFS='=' read -r key value; do + if [ -n "${key}" ] && [ -n "${value}" ]; then + export "${key}"="${value}" + fi + done < <(yq eval '.environmentVariables | to_entries | .[] | .key + "=" + .value' "${inputs_file}" 2>/dev/null) + + return 0 +} + +# Loads the command line arguments from the inputs yaml file +load_inputs_yaml_args_inline() { + local inputs_file="${1:-inputs.yaml}" + + # Look for the inputs file if it exists + if [ ! 
-f "${inputs_file}" ]; then
+        return 1
+    fi
+
+    # Load the command line arguments from the inputs file
+    readarray -t LOADED_ARGS < <(yq eval '.commandLineArguments[]' "${inputs_file}" 2>/dev/null)
+
+    return 0
+}
+
+# Try to load inputs.yaml if it exists
+LOADED_ARGS=()
+if load_inputs_yaml_inline "inputs.yaml"; then
+    load_inputs_yaml_args_inline "inputs.yaml"
+fi
+
+# =============================================================================
+# STEP 2: ENVIRONMENT SETUP
+# =============================================================================
+
+# Set up paths
+export TEST_ROOT="$(pwd)"
+export SCRIPT_LIB_DIR="${TEST_ROOT}/../lib"
+export CACHE_DIR="${CACHE_DIR:-/srv/cache}"
+export IMAGES_YAML="${TEST_ROOT}/images.yaml"
+
+# Test configuration
+export TEST_TYPE="echo"
+export WORKERS="${WORKERS:-$(nproc)}"
+export DEBUG="${DEBUG:-false}"
+
+# Filter configuration
+export IMPL_SELECT="${IMPL_SELECT:-}"
+export IMPL_IGNORE="${IMPL_IGNORE:-}"
+export TRANSPORT_SELECT="${TRANSPORT_SELECT:-}"
+export TRANSPORT_IGNORE="${TRANSPORT_IGNORE:-}"
+export SECURE_SELECT="${SECURE_SELECT:-}"
+export SECURE_IGNORE="${SECURE_IGNORE:-}"
+export MUXER_SELECT="${MUXER_SELECT:-}"
+export MUXER_IGNORE="${MUXER_IGNORE:-}"
+export TEST_SELECT="${TEST_SELECT:-}"
+export TEST_IGNORE="${TEST_IGNORE:-}"
+
+# =============================================================================
+# STEP 3: ARGUMENT PARSING
+# =============================================================================
+
+# Merge loaded args with command line args (command line takes precedence,
+# because it is parsed last)
+ALL_ARGS=("${LOADED_ARGS[@]}" "$@")
+
+# Reset the positional parameters to the merged list; without this, the parser
+# below only ever sees the arguments typed on the command line and silently
+# ignores anything replayed from inputs.yaml
+if [ ${#ALL_ARGS[@]} -gt 0 ]; then
+    set -- "${ALL_ARGS[@]}"
+fi
+
+# Parse arguments
+while [[ $# -gt 0 ]]; do
+    case $1 in
+        --impl-select)
+            IMPL_SELECT="$2"
+            shift 2
+            ;;
+        --impl-ignore)
+            IMPL_IGNORE="$2"
+            shift 2
+            ;;
+        --transport-select)
+            TRANSPORT_SELECT="$2"
+            shift 2
+            ;;
+        --transport-ignore)
+            TRANSPORT_IGNORE="$2"
+            shift 2
+            ;;
+        --secure-select)
+            SECURE_SELECT="$2"
+            shift 2
+            ;;
+        --secure-ignore)
+            SECURE_IGNORE="$2"
shift 2 + ;; + --muxer-select) + MUXER_SELECT="$2" + shift 2 + ;; + --muxer-ignore) + MUXER_IGNORE="$2" + shift 2 + ;; + --test-select) + TEST_SELECT="$2" + shift 2 + ;; + --test-ignore) + TEST_IGNORE="$2" + shift 2 + ;; + --workers) + WORKERS="$2" + shift 2 + ;; + --debug) + DEBUG="true" + shift + ;; + --check-deps) + exec "${SCRIPT_LIB_DIR}/check-dependencies.sh" + ;; + --help|-h) + echo "Echo Protocol Interoperability Tests" + echo "" + echo "Usage: $0 [options]" + echo "" + echo "Filtering Options:" + echo " --impl-select FILTER Select implementations (e.g., 'js-libp2p')" + echo " --impl-ignore FILTER Ignore implementations (e.g., '!py-libp2p')" + echo " --transport-select FILTER Select transports (e.g., 'tcp')" + echo " --transport-ignore FILTER Ignore transports" + echo " --secure-select FILTER Select secure channels (e.g., 'noise')" + echo " --secure-ignore FILTER Ignore secure channels" + echo " --muxer-select FILTER Select muxers (e.g., 'yamux')" + echo " --muxer-ignore FILTER Ignore muxers" + echo " --test-select FILTER Select specific tests" + echo " --test-ignore FILTER Ignore specific tests" + echo "" + echo "Execution Options:" + echo " --workers N Number of parallel workers (default: $(nproc))" + echo " --debug Enable debug output" + echo "" + echo "Utility Options:" + echo " --check-deps Check system dependencies" + echo " --help, -h Show this help" + echo "" + echo "Examples:" + echo " $0 # Run all tests" + echo " $0 --impl-select js-libp2p # Test only js-libp2p" + echo " $0 --transport-ignore '!tcp' # Test only TCP transport" + echo " $0 --debug --workers 1 # Debug mode, single worker" + exit 0 + ;; + *) + echo "Unknown option: $1" >&2 + echo "Use --help for usage information" >&2 + exit 1 + ;; + esac +done + +# ============================================================================= +# STEP 4: DEPENDENCY CHECKS +# ============================================================================= + +# Source common libraries +source 
"${SCRIPT_LIB_DIR}/lib-common-init.sh" +source "${SCRIPT_LIB_DIR}/lib-output-formatting.sh" + +# Check dependencies +"${SCRIPT_LIB_DIR}/check-dependencies.sh" || exit 1 + +# ============================================================================= +# STEP 5: GENERATE INPUTS.YAML +# ============================================================================= + +source "${SCRIPT_LIB_DIR}/lib-inputs-yaml.sh" +generate_inputs_yaml "${TEST_ROOT}/inputs.yaml" "${TEST_TYPE}" "${ORIGINAL_ARGS[@]}" + +# ============================================================================= +# STEP 6: GENERATE TEST MATRIX +# ============================================================================= + +print_message "Generating Echo protocol test matrix..." +"${TEST_ROOT}/lib/generate-tests.sh" + +# ============================================================================= +# STEP 7: BUILD IMAGES +# ============================================================================= + +print_message "Building Docker images..." +source "${SCRIPT_LIB_DIR}/lib-image-building.sh" +build_all_images + +# ============================================================================= +# STEP 8: GENERATE DOCKER COMPOSE FILES +# ============================================================================= + +print_message "Generating Docker Compose files..." 
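The replay file written by `generate_inputs_yaml` in STEP 5 can be sketched in pure bash. This is a hypothetical emitter, not the real `lib-inputs-yaml.sh`; only the top-level key names (`environmentVariables`, `commandLineArguments`) are taken from the yq queries used by the inline loaders in STEP 1, everything else is an assumption:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of what generate_inputs_yaml might write so that a
# later run can replay the same environment and arguments. The key names
# follow the yq queries in the STEP 1 loaders; the rest is an assumption.
set -euo pipefail

sketch_generate_inputs_yaml() {
    local out="$1" test_type="$2"
    shift 2
    {
        echo "testType: ${test_type}"
        echo "environmentVariables:"
        echo "  WORKERS: \"${WORKERS:-1}\""
        echo "  DEBUG: \"${DEBUG:-false}\""
        echo "commandLineArguments:"
        local arg
        for arg in "$@"; do
            echo "  - \"${arg}\""
        done
    } > "${out}"
}

# Emit a replay file for one hypothetical invocation.
WORKERS=4 DEBUG=true \
    sketch_generate_inputs_yaml /tmp/inputs-demo.yaml echo --impl-select js-libp2p
cat /tmp/inputs-demo.yaml
```

On the next run, the STEP 1 loaders read `environmentVariables` back into the environment and `commandLineArguments` back into `LOADED_ARGS`, which is why the two yq expressions and the emitted keys must stay in sync.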
+source "${SCRIPT_LIB_DIR}/lib-test-caching.sh"
+
+CACHE_KEY=$(compute_test_cache_key)
+TEST_MATRIX_FILE="${CACHE_DIR}/test-matrix/echo-${CACHE_KEY}.yaml"
+
+# Generate compose files for each test
+while IFS= read -r test_name; do
+    COMPOSE_FILE="${CACHE_DIR}/test-docker-compose/echo-${CACHE_KEY}-${test_name}.yaml"
+    mkdir -p "$(dirname "${COMPOSE_FILE}")"
+
+    # Generate basic compose file (simplified for echo tests)
+    cat > "${COMPOSE_FILE}" << EOF
+version: '3.8'
+services:
+  redis:
+    image: redis:alpine
+    networks:
+      - echo-test
+
+  server:
+    image: \${SERVER_IMAGE}
+    depends_on:
+      - redis
+    environment:
+      - REDIS_ADDR=redis://redis:6379
+    networks:
+      - echo-test
+
+  client:
+    image: \${CLIENT_IMAGE}
+    depends_on:
+      - server
+    environment:
+      - REDIS_ADDR=redis://redis:6379
+    networks:
+      - echo-test
+
+networks:
+  echo-test:
+    driver: bridge
+EOF
+done < <(yq eval '.tests[].name' "${TEST_MATRIX_FILE}")
+
+# =============================================================================
+# STEP 9: RUN TESTS
+# =============================================================================
+
+print_message "Running Echo protocol tests..."
+
+# Create results directory; exported so the worker shells spawned by xargs
+# below can see it
+export RESULTS_DIR="${TEST_ROOT}/results"
+mkdir -p "${RESULTS_DIR}"
+
+# Run one test and capture its structured JSON result. A failing test must not
+# abort the whole run under `set -e`, so its exit status is swallowed here; the
+# captured JSON still records the failure. The function must be defined before
+# `export -f` runs, since bash can only export an already-defined function.
+run_single_test() {
+    local test_name="$1"
+    "${TEST_ROOT}/lib/run-single-test.sh" "${test_name}" > "${RESULTS_DIR}/${test_name}.json" || true
+}
+
+# Run tests in parallel
+export -f run_single_test
+readarray -t test_names < <(yq eval '.tests[].name' "${TEST_MATRIX_FILE}")
+
+if [[ "${WORKERS}" -eq 1 ]] || [[ "${DEBUG}" == "true" ]]; then
+    # Sequential execution for debugging
+    for test_name in "${test_names[@]}"; do
+        run_single_test "${test_name}"
+    done
+else
+    # Parallel execution
+    printf '%s\n' "${test_names[@]}" | xargs -P "${WORKERS}" -I {} bash -c 'run_single_test "$@"' _ {}
+fi
+
+# =============================================================================
+# STEP 10: GENERATE DASHBOARD
+# =============================================================================
+
+print_message "Generating test dashboard..."
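The parallel fan-out above relies on bash's `export -f`: a function exported into the environment is re-imported by each `bash -c` worker that xargs spawns. A minimal standalone illustration (`work` and the test names are made up for the demo):

```shell
#!/usr/bin/env bash
# Minimal illustration of the export -f + xargs -P fan-out pattern.
# The function must be defined before export -f; the spawned bash -c
# workers then inherit it through the environment.
set -euo pipefail

work() {
    # Stand-in for a per-test runner: emit one result line per test name.
    echo "RESULT $1 PASS"
}
export -f work

# Parallel output order is nondeterministic, so sort for a stable view.
printf '%s\n' test-a test-b test-c \
    | xargs -P 2 -I {} bash -c 'work "$@"' _ {} \
    | sort | tee /tmp/fanout-demo.txt
```

The `_` placeholder becomes `$0` inside `bash -c`, so the substituted test name arrives as `$1`; this is the same calling convention the script uses for `run_single_test`.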
+"${TEST_ROOT}/lib/generate-dashboard.sh" "${RESULTS_DIR}"
+
+# =============================================================================
+# STEP 11: SUMMARY
+# =============================================================================
+
+TOTAL_TESTS=$(find "${RESULTS_DIR}" -name "*.json" | wc -l)
+PASSED_TESTS=$(find "${RESULTS_DIR}" -name "*.json" -exec grep -l '"result": "PASS"' {} \; | wc -l)
+FAILED_TESTS=$((TOTAL_TESTS - PASSED_TESTS))
+
+print_message "Echo Protocol Test Results:"
+print_message "  Total:  ${TOTAL_TESTS}"
+print_message "  Passed: ${PASSED_TESTS}"
+print_message "  Failed: ${FAILED_TESTS}"
+
+if [[ "${FAILED_TESTS}" -gt 0 ]]; then
+    print_message "❌ Some tests failed - check ${RESULTS_DIR}/echo-dashboard.html"
+    exit 1
+else
+    print_message "✅ All tests passed!"
+    exit 0
+fi
\ No newline at end of file
diff --git a/echo/test-echo.sh b/echo/test-echo.sh
new file mode 100755
index 000000000..1c372f664
--- /dev/null
+++ b/echo/test-echo.sh
@@ -0,0 +1,39 @@
+#!/bin/bash
+
+set -euo pipefail
+
+echo "Starting Echo Interop Test..."
+
+# Create a network for the containers
+docker network create echo-test-network 2>/dev/null || true
+
+# Start Redis if not already running; match the exact container name rather
+# than grepping the whole `docker ps` table
+if ! docker ps --format '{{.Names}}' | grep -qx redis-echo-test; then
+    echo "Starting Redis..."
+    docker run -d --name redis-echo-test --network echo-test-network -p 6379:6379 redis:alpine
+    sleep 3
+fi
+
+# Start JS echo server
+echo "Starting JS libp2p echo server..."
+docker run -d --name js-echo-server --network echo-test-network \
+    -e REDIS_ADDR=redis://redis-echo-test:6379 \
+    js-libp2p-echo:v1.x
+
+# Remove containers and network even if the client test fails: under set -e,
+# an inline cleanup block would never be reached on failure
+cleanup() {
+    echo "Cleaning up..."
+    docker stop js-echo-server redis-echo-test 2>/dev/null || true
+    docker rm js-echo-server redis-echo-test 2>/dev/null || true
+    docker network rm echo-test-network 2>/dev/null || true
+}
+trap cleanup EXIT
+
+# Wait for server to start and publish its multiaddr
+echo "Waiting for server to start..."
+sleep 5
+
+# Run Python client test
+echo "Running Python libp2p echo client test..."
+docker run --rm --name py-echo-client --network echo-test-network \
+    -e REDIS_ADDR=redis://redis-echo-test:6379 \
+    py-libp2p-echo:v0.x
+
+echo "Echo interop test completed!"
\ No newline at end of file
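The pass criterion throughout these tests is an exact byte-for-byte echo of the payload. The client-side check can be illustrated locally in bash, with a `cat` coprocess standing in for the remote `/echo/1.0.0` handler; this mirrors the Python client's verification only in spirit, since the real check lives inside the py-libp2p image:

```shell
#!/usr/bin/env bash
# Local illustration of the echo check: send a payload, read the reply,
# and require an exact match. A `cat` coprocess stands in for the remote
# /echo/1.0.0 handler.
set -euo pipefail

coproc ECHOER { cat; }
ECHOER_CHILD="${ECHOER_PID}"

check_echo() {
    local payload="$1" reply
    printf '%s\n' "${payload}" >&"${ECHOER[1]}"
    IFS= read -r reply <&"${ECHOER[0]}"
    [ "${reply}" = "${payload}" ]
}

: > /tmp/echo-check-demo.txt
check_echo "Hello, Echo!" && echo "text payload: PASS" >> /tmp/echo-check-demo.txt
check_echo "$(printf 'A%.0s' {1..1024})" && echo "1KB payload: PASS" >> /tmp/echo-check-demo.txt

# Close the coprocess's stdin so cat sees EOF and exits cleanly.
eval "exec ${ECHOER[1]}>&-"
wait "${ECHOER_CHILD}" || true
cat /tmp/echo-check-demo.txt
```

The binary payload case from the README's test list is omitted here because bash strings cannot carry NUL bytes; the real client compares raw byte buffers instead.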