A crypto algorithmic trading platform built on NautilusTrader, with custom Actors for persistence and alerting, research tooling for strategy validation, and Grafana for monitoring.
Modular monolith, event-driven. NautilusTrader (NT) is the core engine, installed as a pip dependency. Everything else (Actors, persistence, alerting, research tooling, API, frontend) is custom code that orchestrates NT.
```
┌──────────────────────────────────────────────────────┐  Phase 3a (current)
│             Jupyter Research Notebooks               │
│    Sweep → Parquet · Compare · Validate · Charts     │
└──────────────────────────┬───────────────────────────┘
                           │ backtesting/engine.py
                           │ data/sweeps/*.parquet
┌──────────────────────────┴───────────────────────────┐
│                 NautilusTrader Engine                │
│                                                      │
│  Strategies · Actors · RiskEngine · ExecutionEngine  │
│  BacktestEngine · TradingNode · MessageBus           │
│  Exchange Adapters (Hyperliquid, Binance, Bybit...)  │
│  ParquetDataCatalog · FillModel · Portfolio          │
└────────┬────────────────────────────────┬────────────┘
         │                                │
┌────────┴─────────┐           ┌──────────┴──────────┐
│   PostgreSQL     │           │        Redis        │
│  + TimescaleDB   │           │        Cache        │
│                  │           │                     │
│ Fills, positions │           │   Live state (NT)   │
│ Account history  │           │                     │
│ Strategy meta    │           │                     │
└────────┬─────────┘           └─────────────────────┘
         │ writes                         │ alerts
┌────────┴─────────┐           ┌──────────┴──────────┐
│ PersistenceActor │           │      AlertActor     │  Phase 2 (complete)
│  (inside node)   │           │    (inside node)    │
└──────────────────┘           └──────────┬──────────┘
                                          │ Telegram
┌──────────────────┐
│     Grafana      │  Phase 2: reads PostgreSQL
│  Balance · PnL   │  Not locked in
│  Fills · Stats   │
└──────────────────┘
┌──────────────────────────────────────────────────────┐  Phase 3b (future)
│  React Frontend ←WS/REST→ FastAPI ←Redis Streams→    │
│          StreamingActor (inside TradingNode)         │
└──────────────────────────────────────────────────────┘
```
- NT as a library, not a fork. We subclass `Strategy`/`Actor`, configure engines, call `node.run()`. NT's repo is never modified.
- Actors are the extension point. `PersistenceActor` and `AlertActor` live inside the TradingNode process, subscribe to NT's MessageBus, and do the work (DB writes, Telegram) via `run_in_executor()`, so I/O runs in a thread pool without blocking the event loop.
- PostgreSQL + TimescaleDB for all persistent data. Prices stored as `NUMERIC`, never floats.
- Parquet for sweep results. Parameter sweep outputs persist to `data/sweeps/` as Parquet files, one per strategy × instrument × interval. No database needed for research data: files on disk, read back with `load_sweeps()`.
- Redis for the real-time layer. NT uses it natively for cache; we add a `StreamingActor` in Phase 3b to bridge trade events to the frontend.
- NT's `ParquetDataCatalog` for feeding historical data to the backtester. Coexists with TimescaleDB (Parquet for NT, TimescaleDB for API queries).
- Grafana for ambient monitoring in Phase 2. Reads PostgreSQL directly. Not locked in: replace it with any tool at any time without touching the persistence layer.
- Research before UI. Phase 3a delivers research tooling and strategy validation. The custom React frontend is Phase 3b, built when multiple validated strategies are running live and Grafana isn't enough.
- Event-driven everywhere. NT's MessageBus is the backbone. Custom Actors bridge events to persistence and the frontend. No polling loops.
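The `run_in_executor()` pattern the Actors rely on can be sketched in plain asyncio. This is a generic illustration, not NT's actual Actor API; `write_fill_blocking` is a hypothetical stand-in for a blocking DB call:

```python
import asyncio
import time

def write_fill_blocking(fill: dict) -> None:
    """Hypothetical blocking DB write (stand-in for an asyncpg/SQLAlchemy call)."""
    time.sleep(0.05)  # simulate I/O latency

async def on_fill(fill: dict) -> None:
    # Offload the blocking write to the default thread pool so the event
    # loop (and the node's message handling) keeps running meanwhile.
    loop = asyncio.get_running_loop()
    await loop.run_in_executor(None, write_fill_blocking, fill)

async def main() -> None:
    # Three fills handled concurrently: total wall time stays near one
    # write's latency, not three, because the writes run in parallel threads.
    await asyncio.gather(*(on_fill({"id": i}) for i in range(3)))

if __name__ == "__main__":
    asyncio.run(main())
```

Calling `time.sleep()` (or any blocking driver) directly inside an async callback would stall every other event handler for the duration of the write.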
```
├── src/
│   ├── strategies/                 # NT Strategy subclasses
│   │   ├── ma_cross.py             # Unified MA crossover (EMA/SMA/HMA/DEMA/AMA/VIDYA)
│   │   ├── bb_meanrev.py
│   │   ├── donchian_breakout.py
│   │   ├── ma_cross_atr.py         # MA crossover + ATR bracket TP/SL (all MA types)
│   │   ├── ma_cross_bracket.py     # MA regime + symmetric ATR bracket exits (all MA types)
│   │   ├── ma_cross_long_only.py   # Long-only MA crossover (all MA types)
│   │   ├── ma_cross_stop_entry.py  # MA regime + breakout entry + trailing stop (all MA types)
│   │   ├── ma_cross_tp.py          # MA crossover + pct take-profit (all MA types)
│   │   ├── ma_cross_trailing_stop.py  # MA crossover + ATR trailing stop (all MA types)
│   │   ├── macd_rsi.py
│   │   └── ...
│   ├── actors/                     # Custom NT Actors
│   │   ├── persistence.py          # PersistenceActor: writes fills/positions to PostgreSQL
│   │   └── alert.py                # AlertActor: Telegram notifications
│   ├── backtesting/                # Backtest orchestration
│   │   └── engine.py               # make_engine, run_single_backtest, run_sweep,
│   │                               #   load_sweeps, run_walk_forward
│   ├── persistence/                # SQLAlchemy Core table definitions (no ORM)
│   │   └── schema.py
│   ├── config/                     # Pydantic Settings
│   │   └── settings.py             # get_settings(): single source of truth
│   ├── api/                        # FastAPI application (Phase 3b)
│   └── core/                       # Type aliases, constants, instruments, pure utils
│       ├── constants.py
│       ├── instruments.py
│       └── utils.py
├── grafana/
│   ├── provisioning/               # Declarative datasource + dashboard config
│   └── dashboards/                 # Dashboard JSON (committed)
├── notebooks/                      # Jupyter research + validation
│   ├── backtest/                   # Per-strategy backtest + sweep notebooks
│   │   ├── ema_cross.ipynb         # EMA crossover backtest + sweep
│   │   ├── sma_cross.ipynb         # SMA crossover
│   │   ├── hma_cross.ipynb         # HMA (Hull) crossover
│   │   ├── dema_cross.ipynb        # DEMA (Double EMA) crossover
│   │   ├── ama_cross.ipynb         # AMA (Kaufman Adaptive) crossover
│   │   ├── vidya_cross.ipynb       # VIDYA crossover
│   │   ├── ema_cross_atr.ipynb     # MA crossover + ATR bracket (all MA types)
│   │   ├── ema_cross_bracket.ipynb # MA regime + ATR bracket (all MA types)
│   │   ├── ema_cross_long_only.ipynb
│   │   ├── ema_cross_stop_entry.ipynb
│   │   ├── ema_cross_tp.ipynb
│   │   ├── ema_cross_trailing_stop.ipynb
│   │   ├── bb_meanrev.ipynb
│   │   ├── macd_rsi.ipynb
│   │   └── donchian_breakout.ipynb
│   ├── verify/                     # Data pipeline + signal verification
│   │   ├── 01_pipeline.ipynb       # Data pipeline verification
│   │   ├── 02_data.ipynb           # Catalog vs exchange spot-checks
│   │   ├── 03_signals.ipynb        # Indicator / signal verification
│   │   └── 04_persistence.ipynb    # DB persistence verification
│   ├── compare_sweeps.ipynb        # Cross-instrument/timeframe comparison
│   ├── validate_strategy.ipynb     # Walk-forward, plateau, bootstrap
│   ├── review_live_run.ipynb       # Post-run analysis of live/paper trades
│   ├── charts.py                   # Plotting helpers (plotly, matplotlib, TVLC reports)
│   └── utils.py                    # Shared notebook helpers (make_instrument_id, save_tearsheet,
│                                   #   save_notebook, save_notebook_html)
├── scripts/
│   ├── _catalog.py                 # Shared utilities for data fetch scripts (crash-safe writes)
│   ├── fetch_hl_candles.py         # Hyperliquid OHLCV data fetcher
│   ├── fetch_binance_candles.py    # Binance OHLCV data fetcher (Futures + Spot via --market)
│   ├── run_sandbox.py              # Paper trading runner (SandboxExecutionClient)
│   └── run_live.py                 # Live trading runner (HyperliquidExecClient)
├── data/
│   ├── catalog/                    # ParquetDataCatalog root (gitignored)
│   └── sweeps/                     # Sweep result Parquet files (gitignored)
├── reports/                        # Generated reports (gitignored)
│   ├── backtest/                   # TradingView Lightweight Charts HTML reports
│   ├── html/                       # Exported notebook HTML snapshots
│   ├── notebooks/                  # Copied notebook snapshots (.ipynb)
│   └── tearsheets/                 # NT tearsheet HTML (saved when SAVE_TEARSHEET=True)
├── tests/
│   ├── unit/
│   │   ├── test_core.py
│   │   ├── test_catalog.py         # Crash-safe write recovery + swap tests
│   │   ├── test_schema.py
│   │   ├── test_settings.py
│   │   └── test_actors.py
│   └── integration/
├── alembic/                        # DB migrations
├── frontend/                       # React application (Phase 3b)
├── pyproject.toml
├── Dockerfile                      # Trader container (run_sandbox.py / run_live.py)
├── docker-entrypoint.sh            # Entrypoint: passthrough for ad-hoc cmds, exec Python as PID 1
├── .dockerignore
├── docker-compose.yml              # PostgreSQL + TimescaleDB + Redis + Grafana + trader
├── .env.example                    # Secrets template (committed)
├── CLAUDE.md
└── README.md
```
Dependencies flow inward: outer layers depend on inner layers, never the reverse:

```
core/                       ← depends on nothing internal
   ↑
strategies/, actors/        ← depend on core/ only
   ↑
backtesting/, persistence/  ← depend on core/ only
   ↑
api/                        ← outermost layer, can import from anything
```

`core/` is kept intentionally tight: NT type aliases, constants, interface protocols (`typing.Protocol`), and pure utility functions. No business logic, no DB code, no API schemas.
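A sketch of the kind of interface protocol `core/` might hold (`FillWriter` and the other names here are hypothetical, not the project's actual code):

```python
from decimal import Decimal
from typing import Protocol

class FillWriter(Protocol):
    """Hypothetical interface: anything that can persist a fill.

    Lives in core/ so outer layers depend on the protocol, not on a
    concrete DB implementation. Satisfied structurally, no inheritance.
    """
    def write_fill(self, order_id: str, price: Decimal, qty: Decimal) -> None: ...

class InMemoryFillWriter:
    """Toy implementation, e.g. for tests; never imports DB code."""
    def __init__(self) -> None:
        self.fills: list[tuple[str, Decimal, Decimal]] = []

    def write_fill(self, order_id: str, price: Decimal, qty: Decimal) -> None:
        self.fills.append((order_id, price, qty))

def persist(writer: FillWriter, order_id: str, price: Decimal, qty: Decimal) -> None:
    # Callers in outer layers type-hint against the core/ protocol only.
    writer.write_fill(order_id, price, qty)
```

A PostgreSQL-backed writer in `persistence/` would satisfy the same protocol, keeping the dependency arrows pointing inward.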
- Python 3.12+ (NT requirement)
- Docker + Docker Compose (PostgreSQL + TimescaleDB, Redis, Grafana)
- A Hyperliquid wallet private key (for live/paper trading)
git clone <repo-url>
cd NTP
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
# Fetch historical data for backtesting (run from project root)
python scripts/fetch_hl_candles.py # Hyperliquid candles
python scripts/fetch_binance_candles.py # Binance Futures candles (may need VPN)
python scripts/fetch_binance_candles.py --market spot # Binance Spot candles
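The tree above describes `scripts/_catalog.py` as providing crash-safe writes for these fetchers. The usual way to get that property is write-to-temp-then-atomic-swap; a generic sketch of the pattern (not the project's actual code):

```python
import os
import tempfile

def atomic_write_bytes(path: str, data: bytes) -> None:
    """Write data to path so readers never observe a partial file.

    The payload goes to a temp file in the same directory (same
    filesystem), is flushed and fsynced, then swapped into place with
    os.replace(), which is atomic on both POSIX and Windows. A crash
    mid-write leaves the old file intact plus a stray .tmp to clean up.
    """
    dirpath = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dirpath, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())
        os.replace(tmp, path)  # atomic swap: old file or new file, never half
    except BaseException:
        os.unlink(tmp)
        raise
```

Interrupting a fetch therefore never corrupts an existing Parquet file in the catalog; at worst the run is simply re-done.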
jupyter notebook notebooks/

cp .env.example .env
# Edit .env → fill in POSTGRES_PASSWORD, TELEGRAM_TOKEN, HL credentials
# Build trader image
docker compose build trader
# Run migrations (first time only)
docker compose run --rm trader alembic upgrade head
# Start everything β infra + trader container
docker compose up -d
# Tail trader logs
docker compose logs -f trader --tail 200
# Monitoring
open http://localhost:3000    # Grafana (admin / your GRAFANA_PASSWORD)

To run the trader natively instead (quick iteration / debugging):
docker compose up -d postgres redis grafana
alembic upgrade head
python scripts/run_sandbox.py

# Already set up from Phase 1. Just open notebooks:
jupyter notebook notebooks/
# Workflow:
# 1. backtest_*.ipynb        → run_sweep()   → data/sweeps/*.parquet
# 2. compare_sweeps.ipynb    → load_sweeps() → side-by-side analysis
# 3. validate_strategy.ipynb → walk-forward + plateau + bootstrap

This is the current focus. The research workflow:
- Write a strategy in `src/strategies/`. Subclass `Strategy`, implement `on_start()` and `on_bar()`.
- Sweep parameters in a `backtest_*.ipynb` notebook using `run_sweep()`. Results auto-save to `data/sweeps/`.
- Compare across instruments and timeframes. Open `compare_sweeps.ipynb`, call `load_sweeps()`. Review side-by-side heatmaps and parameter stability.
- Validate before paper trading. Open `validate_strategy.ipynb`. Run plateau detection (are the best params robust?), walk-forward analysis (do they work out-of-sample?), and bootstrap confidence intervals (is the result statistically reliable?).
- Paper trade validated strategies via the Phase 2 infrastructure.
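Walk-forward analysis splits history into rolling train/test windows: optimize on the train slice, evaluate out-of-sample on the test slice, slide forward, repeat. A minimal sketch of that splitting logic (a hypothetical helper, not the project's `run_walk_forward()`):

```python
from typing import Iterator

def walk_forward_windows(
    n_bars: int, train: int, test: int
) -> Iterator[tuple[range, range]]:
    """Yield successive (train, test) index windows over n_bars of history.

    Each fold optimizes parameters on `train` bars, then scores them
    out-of-sample on the next `test` bars, then slides forward by `test`
    so the test segments tile the history without overlap.
    """
    start = 0
    while start + train + test <= n_bars:
        yield (
            range(start, start + train),
            range(start + train, start + train + test),
        )
        start += test

# Example: 1000 bars, train on 400, test on 100 → six out-of-sample folds
folds = list(walk_forward_windows(1000, 400, 100))
```

Stitching the per-fold test results together gives an equity curve built entirely from out-of-sample decisions, which is what the validation notebook is checking for.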
Requires infrastructure running first (docker compose up -d + migrations).
Docker (recommended for multi-day runs):
docker compose up -d # starts infra + trader container; auto-restarts on crash
docker compose logs -f trader # tail logs
docker compose stop trader    # graceful shutdown (SIGTERM → node.stop() → DB updated)

Native (quick iteration):
docker compose up -d postgres redis grafana
python scripts/run_sandbox.py    # Ctrl+C for graceful shutdown

Uses NT's SandboxExecutionClient against live Hyperliquid market data. Every fill and closed position persists to PostgreSQL via PersistenceActor. Telegram alerts fire on fills and position changes (if TELEGRAM_TOKEN and TELEGRAM_CHAT_ID are set in .env). Monitor at http://localhost:3000.
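The graceful-shutdown chain (SIGTERM → stop the node → final DB writes) can be sketched with stdlib signal handling; `FakeNode` is a stand-in here, not NT's TradingNode, and the real runner's wiring may differ:

```python
import signal

class FakeNode:
    """Stand-in for a TradingNode: records whether stop() ran."""
    def __init__(self) -> None:
        self.stopped = False

    def stop(self) -> None:
        # The real node would cancel tasks and let PersistenceActor
        # flush its final fills/positions to PostgreSQL here.
        self.stopped = True

node = FakeNode()

def handle_sigterm(signum, frame) -> None:
    # `docker compose stop` sends SIGTERM; shut down cleanly instead of
    # dying mid-write. Ctrl+C (SIGINT) follows the same path natively.
    node.stop()

signal.signal(signal.SIGTERM, handle_sigterm)
```

Docker gives a container a grace period after SIGTERM (10 s by default) before SIGKILL, so the final flush must fit in that window.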
# Native: interactive confirmation prompt
# Requires HL_TESTNET=false + HL_PRIVATE_KEY in .env
python scripts/run_live.py
# Docker: set in .env: TRADING_SCRIPT=scripts/run_live.py, HL_TESTNET=false,
# LIVE_CONFIRM=yes, HL_PRIVATE_KEY=<key>
docker compose restart trader

| Phase | Focus | Status |
|---|---|---|
| 1 | Strategy development + backtesting (NT native workflow, Jupyter) | ✅ Complete |
| 2 | TradingNode deployment, PersistenceActor, AlertActor, paper + live trading | ✅ Complete |
| 3a | Research tooling: sweep persistence, cross-sweep comparison, walk-forward validation, bootstrap CI | 🟡 Active |
| 3b | Web layer: FastAPI gateway, React frontend, StreamingActor, Redis Streams | ⬜ Future |
| 4 | ML integration (feature engineering, model training, inference in callbacks) | ⬜ Planned |
| 5 | Experimental (LSTM, LLM sentiment, RL agents) | ⬜ Planned |
- NautilusTrader is pre-v2.0: pin the version, expect API breakage between releases.
- No floats for prices. NT uses 128-bit fixed-point. Maintain this in PostgreSQL (`NUMERIC`), asyncpg inserts (`str(nt_type)`), API responses (string-encoded decimals), and the frontend.
- Actor callbacks must never block: use `self.run_in_executor()` for all I/O. Blocking the event loop stalls the TradingNode.
- TradingNode is not Jupyter-compatible (asyncio event loop conflicts). Run it from scripts, not notebooks.
- The "NT + web dashboard" pattern has no community precedent. When stuck, read NT source code; docs and community posts won't cover integration patterns.
- LGPL-3.0 license: NT can be used as a library without affecting your project's license, but modifications to NT's own source must be shared.
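The no-floats rule in practice: prices travel the whole stack as string-encoded decimals. A plain `decimal` sketch; the `str(...)` round-trip mirrors the asyncpg and API layers described above:

```python
from decimal import Decimal

# A binary float would already have corrupted the price:
assert 0.1 + 0.2 != 0.3          # classic float drift

price = Decimal("30000.01")      # parse from a string, never from a float

# Round-trip the way the value should travel through the stack:
db_value = str(price)            # handed to asyncpg for a NUMERIC column
api_value = db_value             # string-encoded decimal in API responses
assert Decimal(api_value) == price   # exact at every hop, no drift
```

Note that `Decimal(0.1)` (constructed from a float) would bake the drift in; the string constructor is the safe entry point.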