oakwoodgates/NTP

Nautilus Trading Platform

A crypto algorithmic trading platform built on NautilusTrader, with custom Actors for persistence and alerting, research tooling for strategy validation, and Grafana for monitoring.

Architecture

Modular monolith, event-driven. NautilusTrader is the core engine (installed as a pip dependency). Everything else — Actors, persistence, alerting, research tooling, API, frontend — is custom code that orchestrates NT.

┌──────────────────────────────────────────────────────┐  Phase 3a (current)
│              Jupyter Research Notebooks              │
│    Sweep → Parquet · Compare · Validate · Charts     │
└──────────────────────────┬───────────────────────────┘
                           │ backtesting/engine.py
                           │ data/sweeps/*.parquet
┌──────────────────────────┴───────────────────────────┐
│                NautilusTrader Engine                 │
│                                                      │
│  Strategies · Actors · RiskEngine · ExecutionEngine  │
│  BacktestEngine · TradingNode · MessageBus           │
│  Exchange Adapters (Hyperliquid, Binance, Bybit...)  │
│  ParquetDataCatalog · FillModel · Portfolio          │
└───────┬──────────────────────────────────┬───────────┘
        │                                  │
┌───────┴─────────┐              ┌─────────┴───────────┐
│   PostgreSQL    │              │        Redis        │
│  + TimescaleDB  │              │        Cache        │
│                 │              │                     │
│ Fills, positions│              │   Live state (NT)   │
│ Account history │              │                     │
│ Strategy meta   │              │                     │
└───────┬─────────┘              └─────────┬───────────┘
        │ writes                           │ alerts
┌───────┴─────────┐              ┌─────────┴───────────┐
│ PersistenceActor│              │      AlertActor     │  Phase 2 (complete)
│  (inside node)  │              │    (inside node)    │
└─────────────────┘              └─────────┬───────────┘
                                           │ Telegram
┌──────────────────┐
│     Grafana      │  Phase 2 — reads PostgreSQL
│  Balance · PnL   │  Not locked in
│  Fills · Stats   │
└──────────────────┘

┌──────────────────────────────────────────────────────┐  Phase 3b (future)
│  React Frontend ←WS/REST→ FastAPI ←Redis Streams→    │
│  StreamingActor (inside TradingNode)                 │
└──────────────────────────────────────────────────────┘

Key Design Decisions

  • NT as a library, not a fork. We subclass Strategy, Actor, configure engines, call node.run(). NT's repo is never modified.
  • Actors are the extension point. PersistenceActor and AlertActor live inside the TradingNode process, subscribe to NT's MessageBus, and do the work (DB writes, Telegram) via run_in_executor() β€” I/O runs in a thread pool without blocking the event loop.
  • PostgreSQL + TimescaleDB for all persistent data. Prices stored as NUMERIC β€” never floats.
  • Parquet for sweep results. Parameter sweep outputs persist to data/sweeps/ as Parquet files, one per strategy Γ— instrument Γ— interval. No database needed for research data β€” files on disk, read back with load_sweeps().
  • Redis for real-time layer. NT uses it natively for cache; we add a StreamingActor in Phase 3b for bridging trade events to the frontend.
  • NT's ParquetDataCatalog for feeding historical data to the backtester. Coexists with TimescaleDB (Parquet for NT, TimescaleDB for API queries).
  • Grafana for ambient monitoring in Phase 2. Reads PostgreSQL directly. Not locked in β€” replace with any tool at any time without touching the persistence layer.
  • Research before UI. Phase 3a delivers research tooling and strategy validation. The custom React frontend is Phase 3b, built when multiple validated strategies are running live and Grafana isn't enough.
  • Event-driven everywhere. NT's MessageBus is the backbone. Custom Actors bridge events to persistence and the frontend. No polling loops.

Project Structure

├── src/
│   ├── strategies/          # NT Strategy subclasses
│   │   ├── ma_cross.py                # Unified MA crossover (EMA/SMA/HMA/DEMA/AMA/VIDYA)
│   │   ├── bb_meanrev.py
│   │   ├── donchian_breakout.py
│   │   ├── ma_cross_atr.py            # MA crossover + ATR bracket TP/SL (all MA types)
│   │   ├── ma_cross_bracket.py        # MA regime + symmetric ATR bracket exits (all MA types)
│   │   ├── ma_cross_long_only.py      # Long-only MA crossover (all MA types)
│   │   ├── ma_cross_stop_entry.py     # MA regime + breakout entry + trailing stop (all MA types)
│   │   ├── ma_cross_tp.py             # MA crossover + pct take-profit (all MA types)
│   │   ├── ma_cross_trailing_stop.py  # MA crossover + ATR trailing stop (all MA types)
│   │   ├── macd_rsi.py
│   │   └── ...
│   ├── actors/              # Custom NT Actors
│   │   ├── persistence.py   # PersistenceActor — writes fills/positions to PostgreSQL
│   │   └── alert.py         # AlertActor — Telegram notifications
│   ├── backtesting/         # Backtest orchestration
│   │   └── engine.py        # make_engine, run_single_backtest, run_sweep,
│   │                        # load_sweeps, run_walk_forward
│   ├── persistence/         # SQLAlchemy Core table definitions (no ORM)
│   │   └── schema.py
│   ├── config/              # Pydantic Settings
│   │   └── settings.py      # get_settings() — single source of truth
│   ├── api/                 # FastAPI application (Phase 3b)
│   └── core/                # Type aliases, constants, instruments, pure utils
│       ├── constants.py
│       ├── instruments.py
│       └── utils.py
├── grafana/
│   ├── provisioning/        # Declarative datasource + dashboard config
│   └── dashboards/          # Dashboard JSON (committed)
├── notebooks/               # Jupyter research + validation
│   ├── backtest/            # Per-strategy backtest + sweep notebooks
│   │   ├── ema_cross.ipynb          # EMA crossover backtest + sweep
│   │   ├── sma_cross.ipynb          # SMA crossover
│   │   ├── hma_cross.ipynb          # HMA (Hull) crossover
│   │   ├── dema_cross.ipynb         # DEMA (Double EMA) crossover
│   │   ├── ama_cross.ipynb          # AMA (Kaufman Adaptive) crossover
│   │   ├── vidya_cross.ipynb        # VIDYA crossover
│   │   ├── ema_cross_atr.ipynb      # MA crossover + ATR bracket (all MA types)
│   │   ├── ema_cross_bracket.ipynb  # MA regime + ATR bracket (all MA types)
│   │   ├── ema_cross_long_only.ipynb
│   │   ├── ema_cross_stop_entry.ipynb
│   │   ├── ema_cross_tp.ipynb
│   │   ├── ema_cross_trailing_stop.ipynb
│   │   ├── bb_meanrev.ipynb
│   │   ├── macd_rsi.ipynb
│   │   └── donchian_breakout.ipynb
│   ├── verify/              # Data pipeline + signal verification
│   │   ├── 01_pipeline.ipynb        # Data pipeline verification
│   │   ├── 02_data.ipynb            # Catalog vs exchange spot-checks
│   │   ├── 03_signals.ipynb         # Indicator / signal verification
│   │   └── 04_persistence.ipynb     # DB persistence verification
│   ├── compare_sweeps.ipynb         # Cross-instrument/timeframe comparison
│   ├── validate_strategy.ipynb      # Walk-forward, plateau, bootstrap
│   ├── review_live_run.ipynb        # Post-run analysis of live/paper trades
│   ├── charts.py                    # Plotting helpers (plotly, matplotlib, TVLC reports)
│   └── utils.py                     # Shared notebook helpers (make_instrument_id, save_tearsheet,
│                                    #   save_notebook, save_notebook_html)
├── scripts/
│   ├── _catalog.py              # Shared utilities for data fetch scripts (crash-safe writes)
│   ├── fetch_hl_candles.py      # Hyperliquid OHLCV data fetcher
│   ├── fetch_binance_candles.py # Binance OHLCV data fetcher (Futures + Spot via --market)
│   ├── run_sandbox.py           # Paper trading runner (SandboxExecutionClient)
│   └── run_live.py              # Live trading runner (HyperliquidExecClient)
├── data/
│   ├── catalog/             # ParquetDataCatalog root (gitignored)
│   └── sweeps/              # Sweep result Parquet files (gitignored)
├── reports/                 # Generated reports (gitignored)
│   ├── backtest/            # TradingView Lightweight Charts HTML reports
│   ├── html/                # Exported notebook HTML snapshots
│   ├── notebooks/           # Copied notebook snapshots (.ipynb)
│   └── tearsheets/          # NT tearsheet HTML (saved when SAVE_TEARSHEET=True)
├── tests/
│   ├── unit/
│   │   ├── test_core.py
│   │   ├── test_catalog.py  # Crash-safe write recovery + swap tests
│   │   ├── test_schema.py
│   │   ├── test_settings.py
│   │   └── test_actors.py
│   └── integration/
├── alembic/                 # DB migrations
├── frontend/                # React application (Phase 3b)
├── pyproject.toml
├── Dockerfile               # Trader container (run_sandbox.py / run_live.py)
├── docker-entrypoint.sh     # Entrypoint — passthrough for ad-hoc cmds, exec Python as PID 1
├── .dockerignore
├── docker-compose.yml       # PostgreSQL + TimescaleDB + Redis + Grafana + trader
├── .env.example             # Secrets template (committed)
├── CLAUDE.md
└── README.md

Dependency Direction

Dependencies flow inward — outer layers depend on inner layers, never the reverse:

core/                       ← depends on nothing internal
  ↑
strategies/, actors/        ← depend on core/ only
  ↑
backtesting/, persistence/  ← depend on core/ only
  ↑
api/                        ← outermost layer, can import from anything

core/ is kept intentionally tight: NT type aliases, constants, interface protocols (typing.Protocol), and pure utility functions. No business logic, no DB code, no API schemas.
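As an illustration of that boundary, a hypothetical core/ protocol (the FillWriter name and signature are invented here, not taken from the repo) lets outer layers supply their own implementation while core/ stays free of DB code:

```python
from decimal import Decimal
from typing import Protocol

class FillWriter(Protocol):
    """Interface only: core/ defines the shape, an outer layer implements it."""
    def write_fill(self, instrument_id: str, price: Decimal, qty: Decimal) -> None: ...

class InMemoryWriter:
    """Outer-layer implementation; satisfies FillWriter structurally, no inheritance."""
    def __init__(self) -> None:
        self.rows: list[tuple[str, Decimal, Decimal]] = []

    def write_fill(self, instrument_id: str, price: Decimal, qty: Decimal) -> None:
        self.rows.append((instrument_id, price, qty))

def record_fill(writer: FillWriter) -> None:
    # Inner-layer code depends only on the Protocol, never on a concrete writer.
    writer.write_fill("BTCUSDT-PERP", Decimal("42000.5"), Decimal("0.01"))
```

Because typing.Protocol is structural, the implementing module never has to import a base class from core/; matching the method signature is enough, which keeps the dependency arrow pointing inward.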

Prerequisites

  • Python 3.12+ (NT requirement)
  • Docker + Docker Compose (PostgreSQL + TimescaleDB, Redis, Grafana)
  • A Hyperliquid wallet private key (for live/paper trading)

Setup

Phase 1 (complete) — NT native workflow

git clone <repo-url>
cd NTP
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"

# Fetch historical data for backtesting (run from project root)
python scripts/fetch_hl_candles.py               # Hyperliquid candles
python scripts/fetch_binance_candles.py           # Binance Futures candles (may need VPN)
python scripts/fetch_binance_candles.py --market spot  # Binance Spot candles

jupyter notebook notebooks/

Phase 2 (complete) — Paper + live trading

cp .env.example .env
# Edit .env — fill in POSTGRES_PASSWORD, TELEGRAM_TOKEN, HL credentials

# Build trader image
docker compose build trader

# Run migrations (first time only)
docker compose run --rm trader alembic upgrade head

# Start everything — infra + trader container
docker compose up -d

# Tail trader logs
docker compose logs -f trader --tail 200

# Monitoring
open http://localhost:3000   # Grafana (admin / your GRAFANA_PASSWORD)

To run the trader natively instead (quick iteration / debugging):

docker compose up -d postgres redis grafana
alembic upgrade head
python scripts/run_sandbox.py

Phase 3a (current) — Research + validation

# Already set up from Phase 1. Just open notebooks:
jupyter notebook notebooks/

# Workflow:
# 1. notebooks/backtest/*.ipynb → run_sweep() → data/sweeps/*.parquet
# 2. compare_sweeps.ipynb → load_sweeps() → side-by-side analysis
# 3. validate_strategy.ipynb → walk-forward + plateau + bootstrap

Usage

Develop and validate strategies (Phase 3a)

This is the current focus. The research workflow:

  1. Write a strategy in src/strategies/. Subclass Strategy, implement on_start() and on_bar().
  2. Sweep parameters in the strategy's notebooks/backtest/ notebook using run_sweep(). Results auto-save to data/sweeps/.
  3. Compare across instruments and timeframes. Open compare_sweeps.ipynb, call load_sweeps(). Review side-by-side heatmaps and parameter stability.
  4. Validate before paper trading. Open validate_strategy.ipynb. Run plateau detection (are best params robust?), walk-forward analysis (do they work out-of-sample?), and bootstrap confidence intervals (is the result statistically reliable?).
  5. Paper trade validated strategies via Phase 2 infrastructure.
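The sweep step boils down to a Cartesian product over the parameter grid: one backtest per combination, one result row each. A minimal sketch of that shape (run_sweep's real implementation lives in src/backtesting/engine.py; the names and the stand-in metric below are illustrative):

```python
from itertools import product
from typing import Any, Callable

def sweep(param_grid: dict[str, list],
          run_backtest: Callable[[dict], float]) -> list[dict[str, Any]]:
    """Run one backtest per parameter combination; return one row per run."""
    rows = []
    for combo in product(*param_grid.values()):
        params = dict(zip(param_grid.keys(), combo))
        params["sharpe"] = run_backtest(params)  # stand-in for real metrics
        rows.append(params)
    return rows

# 2 fast values x 2 slow values -> 4 backtests
grid = {"fast": [5, 10], "slow": [20, 50]}
rows = sweep(grid, run_backtest=lambda p: p["slow"] / p["fast"])
```

In the real pipeline each `rows`-like result is written to one Parquet file per strategy × instrument × interval under data/sweeps/, then read back with load_sweeps() for comparison.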

Run paper trading (Phase 2)

Requires infrastructure running first (docker compose up -d + migrations).

Docker (recommended for multi-day runs):

docker compose up -d           # starts infra + trader container; auto-restarts on crash
docker compose logs -f trader  # tail logs
docker compose stop trader     # graceful shutdown (SIGTERM → node.stop() → DB updated)
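The graceful-shutdown chain relies on the process receiving SIGTERM and turning it into a clean stop. A minimal sketch of that handler, where Node is a stand-in for the real TradingNode:

```python
import signal

class Node:
    """Stand-in for the TradingNode; stop() is where state gets flushed."""
    def __init__(self) -> None:
        self.running = True

    def stop(self) -> None:
        self.running = False

node = Node()

def handle_sigterm(signum: int, frame) -> None:
    # `docker compose stop` sends SIGTERM to PID 1; because the
    # entrypoint exec's Python, PID 1 is this process and the
    # handler fires instead of the container being killed outright.
    node.stop()

signal.signal(signal.SIGTERM, handle_sigterm)
```

Ctrl+C in the native workflow takes the equivalent path via KeyboardInterrupt.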

Native (quick iteration):

docker compose up -d postgres redis grafana
python scripts/run_sandbox.py  # Ctrl+C for graceful shutdown

Uses NT's SandboxExecutionClient against live Hyperliquid market data. Every fill and closed position persists to PostgreSQL via PersistenceActor. Telegram alerts fire on fills and position changes (if TELEGRAM_TOKEN and TELEGRAM_CHAT_ID are set in .env). Monitor at http://localhost:3000.

Run live trading (Phase 2 — after paper validation)

# Native β€” interactive confirmation prompt
# Requires HL_TESTNET=false + HL_PRIVATE_KEY in .env
python scripts/run_live.py

# Docker β€” set in .env: TRADING_SCRIPT=scripts/run_live.py, HL_TESTNET=false,
#           LIVE_CONFIRM=yes, HL_PRIVATE_KEY=<key>
docker compose restart trader

Development Phases

Phase  Focus                                                                                               Status
1      Strategy development + backtesting (NT native workflow, Jupyter)                                    ✅ Complete
2      TradingNode deployment, PersistenceActor, AlertActor, paper + live trading                          ✅ Complete
3a     Research tooling — sweep persistence, cross-sweep comparison, walk-forward validation, bootstrap CI 🟡 Active
3b     Web layer — FastAPI gateway, React frontend, StreamingActor, Redis Streams                          ⬜ Future
4      ML integration (feature engineering, model training, inference in callbacks)                        ⬜ Planned
5      Experimental (LSTM, LLM sentiment, RL agents)                                                       ⬜ Planned

Key Constraints

  • NautilusTrader is pre-v2.0 β€” pin the version, expect API breakage between releases.
  • No floats for prices β€” NT uses 128-bit fixed-point. Maintain this in PostgreSQL (NUMERIC), asyncpg inserts (str(nt_type)), API responses (string-encoded decimals), and frontend.
  • Actor callbacks must never block β€” use self.run_in_executor() for all I/O. Blocking the event loop stalls the TradingNode.
  • TradingNode is not Jupyter-compatible β€” asyncio event loop conflicts. Run from scripts, not notebooks.
  • The "NT + web dashboard" pattern has no community precedent. When stuck, read NT source code β€” docs and community posts won't cover integration patterns.
  • LGPL-3.0 license β€” NT can be used as a library without affecting your project's license, but modifications to NT's own source must be shared.

