
Add Stratum V2 (SV2) Support to DATUM — request for **Concept ACK** #146

@electricalgrade

Description


Summary:
I’m proposing native Stratum V2 (SV2) support in DATUM: a minimal, well-scoped C library (wire format + helpers) plus a thin server/adapter inside DATUM. I’m developing this work in a separate repo first and will integrate it once it stabilizes. For my testing environment I’m also building an external SV1↔SV2 bridge (Python) so existing SV1 miners can connect without firmware changes.

I’m asking maintainers for a Concept ACK on the overall approach and surface area before I proceed to a PR series.


Motivation

  • SV2 improves security and efficiency (binary protocol; message framing; job separation). The security gain matters less for the existing DATUM gateway, since the pool-to-gateway link is already either a secure connection or local.
  • DATUM benefits from direct SV2 upstream/downstream compatibility while keeping existing Stratum V1 paths.
  • A bridge lets current SV1 miners be used in testing.

Scope (MVP)

Protocol coverage (initial):

  • Common: SetupConnection / SetupConnection.Success (and basic error).

  • Mining (channel-scoped):

    • Open channel: OpenExtendedMiningChannel / OpenExtendedMiningChannel.Success (or Standard as fallback)
    • Job broadcast: SetNewPrevHash, NewExtendedMiningJob
    • Share path: SubmitSharesExtended, SubmitShares.Success/Error
  • Optional (later): Reconnect, ChannelEndpointChanged, SetTarget, heartbeats, JD capabilities.

Out of scope: Template Distribution and Job Declaration subprotocols. These are unnecessary in DATUM.


High-Level Design

1) Small SV2 library (“DATUM SV2 core”)

C, self-contained, no external deps beyond libc. Files:

src/sv2/
  sv2_wire.{h,c}       # frame header, STR0_255/B0_255 helpers, LE/U24, len-prefixed I/O (demo)
  sv2_common.{h,c}     # SetupConnection + common messages (enc/dec + tiny send wrappers)
  sv2_mining.{h,c}     # Mining messages (Open channel, SetNewPrevHash, NewExtendedMiningJob, SubmitShares*)
  sv2_adapter.{h,c}    # Tiny evented server: accept(), parse frames, invoke callbacks, send replies
tests/
  test_sv2_adapter.c   # smoketest for setup/open/submit flow; kqueue/epoll portability

Key choices:

  • Binary framing: u16 ext | u8 msg | u24 len | payload. Outer demo I/O uses u32 length (LE) prefix; DATUM can keep that or replace with its own TCP multiplexer.
  • Endian: SV2 header size and U24 are little-endian; helpers included.
  • Portability: epoll on Linux, kqueue on macOS/BSD (already handled).
  • Shared lib (optional): libsv2wire.{so|dylib} via make ffi for Python bridge.
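To make the framing choice concrete, here is a minimal sketch of header encode/decode for the layout described above (u16 ext, u8 msg, u24 len, all little-endian). The function names are illustrative, not the actual sv2_wire API:

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative SV2 frame header helpers: u16 ext (LE) | u8 msg | u24 len (LE).
 * Names are hypothetical; the real sv2_wire.{h,c} API may differ. */

#define SV2_HDR_LEN 6

static void sv2_hdr_encode(uint8_t out[SV2_HDR_LEN],
                           uint16_t ext, uint8_t msg, uint32_t len)
{
    out[0] = (uint8_t)(ext & 0xff);           /* u16, little-endian */
    out[1] = (uint8_t)(ext >> 8);
    out[2] = msg;                             /* u8 message type */
    out[3] = (uint8_t)(len & 0xff);           /* u24, little-endian */
    out[4] = (uint8_t)((len >> 8) & 0xff);
    out[5] = (uint8_t)((len >> 16) & 0xff);
}

static int sv2_hdr_decode(const uint8_t in[SV2_HDR_LEN],
                          uint16_t *ext, uint8_t *msg, uint32_t *len)
{
    *ext = (uint16_t)(in[0] | (in[1] << 8));
    *msg = in[2];
    *len = (uint32_t)in[3] | ((uint32_t)in[4] << 8) | ((uint32_t)in[5] << 16);
    return 0;
}
```

A round-trip through these two helpers is exactly the kind of unit test proposed in the Testing Plan below.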

2) DATUM integration shim (server/adapter)

A minimal C API for DATUM to use:

// datum_sv2.h (proposed)
typedef struct datum_sv2_server datum_sv2_server_t;

typedef struct {
  // notify when a client fully opens a channel
  void (*on_channel_open)(int fd, uint32_t channel_id, uint16_t extranonce2_size);

  // share submission from SV2 client
  void (*on_submit_ext)(uint32_t channel_id, uint32_t job_id,
                        uint32_t nonce, uint32_t ntime, uint32_t version,
                        const uint8_t *en2, uint16_t en2_len);

  void (*on_disconnect)(int fd);
} datum_sv2_handlers_t;

datum_sv2_server_t* datum_sv2_start(const char *bind, uint16_t port,
                                    const datum_sv2_handlers_t *h);

// broadcast SetNewPrevHash (clean=true)
int datum_sv2_broadcast_prevhash(datum_sv2_server_t* s,
                                 uint32_t job_id,
                                 const uint8_t prevhash[32],
                                 uint32_t ntime);

// broadcast NewExtendedMiningJob (paired with prevhash)
int datum_sv2_broadcast_new_job_ext(datum_sv2_server_t* s,
                                    uint32_t job_id,
                                    uint32_t version,
                                    const uint8_t merkle_root[32],
                                    const uint8_t *coinb1, uint32_t coinb1_len,
                                    const uint8_t *coinb2, uint32_t coinb2_len,
                                    const uint8_t nbits[4],
                                    uint8_t clean_jobs);

void datum_sv2_stop(datum_sv2_server_t* s);

Where it plugs into DATUM:

  • Job broadcast: When DATUM builds a new job (or new prevhash), call the two broadcast helpers with coinb1/coinb2, nbits, version, merkle_root derived from existing template data (these are already constructed for SV1). The adapter sends SV2 SetNewPrevHash then NewExtendedMiningJob.
  • Share path: On on_submit_ext, DATUM reuses the existing SV1 validation pipeline (reconstruct the header; check hash against target; stale/time checks; duplicate detection). If a share solves a block, it is submitted via the current submitblock path (already in place). Shares are accounted under the SV2 channel id.
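As a sketch of the "reconstruct header" step in the share path: given the fields carried by on_submit_ext plus the job's prevhash/merkle/nbits, DATUM would assemble the standard 80-byte block header. build_header_80 and put_le32 are hypothetical helper names, and the sketch assumes prevhash/merkle_root are already in the byte order DATUM uses internally:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical sketch: serialize an 80-byte Bitcoin block header from
 * submit + job fields (all u32 fields little-endian on the wire). */

static void put_le32(uint8_t *p, uint32_t v)
{
    p[0] = (uint8_t)(v & 0xff);         p[1] = (uint8_t)((v >> 8) & 0xff);
    p[2] = (uint8_t)((v >> 16) & 0xff); p[3] = (uint8_t)((v >> 24) & 0xff);
}

static void build_header_80(uint8_t out[80],
                            uint32_t version,
                            const uint8_t prevhash[32],
                            const uint8_t merkle_root[32],
                            uint32_t ntime, uint32_t nbits, uint32_t nonce)
{
    put_le32(out + 0, version);          /* version (possibly rolled) */
    memcpy(out + 4, prevhash, 32);       /* previous block hash */
    memcpy(out + 36, merkle_root, 32);   /* merkle root incl. extranonce2 */
    put_le32(out + 68, ntime);
    put_le32(out + 72, nbits);
    put_le32(out + 76, nonce);
}
```

The resulting buffer is what the existing hash/target and stale checks would consume, unchanged from the SV1 path.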

Config proposal:

# datum.conf (example)
stratum_v2_enable = true
stratum_v2_listen = 0.0.0.0:3334
# Optional minimum diff for SV2 peers (bridges etc.)
stratum_v2_min_diff = 1

Build flag: ENABLE_SV2=1 (default off), adds the new files to build.

3) Optional external SV1↔SV2 Bridge (Python)

This is only for my testing. Files:

sv1_to_sv2_bridge/
  README.md
  main.py              # loads config.yaml, runs SV1 server + connects to SV2 upstream
  config.yaml          # listen host/port, upstream host/port, flags, hashrate, lib path
  jsonrpc.py           # tiny JSON-RPC helpers
  bridge_core.py       # SV2<->SV1 translation utils + NotifyState
  sv1_server.py        # SV1 server (subscribe/authorize/notify/submit)
  sv2_upstream.py      # SV2 client: connect, setup, open channel, recv jobs, submit shares
  sv2ffi.py            # ctypes wrapper for libsv2wire
  tools/sv2_dummy_server.py   # local SV2 dummy for testing framing

  • Accepts SV1 miners on (e.g.) 0.0.0.0:13333

  • Connects upstream to DATUM SV2 on (e.g.) 127.0.0.1:3334

  • Translates:

    • SV2 SetNewPrevHash + NewExtendedMiningJob → SV1 mining.notify
    • SV1 mining.submit → SV2 SubmitSharesExtended

This bridge is not required for native SV2 miners, but eases testing.


Current Status (WIP)

  • SV2 core library (sv2_wire, sv2_common, sv2_mining) with header parsing, enc/dec helpers, constants.
  • Adapter (sv2_adapter): accepts connections, basic event loop, callbacks on channel open & share submit.
  • Makefile: builds demo sv2_server/sv2_client, unit test, and libsv2wire (make ffi).
  • Portability: epoll (Linux) / kqueue (macOS) handled; ssize_t includes fixed.
  • Python bridge: SV1 server tested with cpuminer; dummy SV2 upstream verified; notify/submits flowing end-to-end in test.
  • 🔜 Full SV2 protocol: implement complete SV2 mining protocol support.
  • 🔜 Testing: standalone testing of the full SV2 implementation with multiple miners.
  • 🔜 DATUM wiring: map DATUM’s existing job construction (coinbase/merkle) to NewExtendedMiningJob; wire the share-submit callback to the existing validation + submitblock path.
  • 🔜 DATUM testing: full end-to-end testing of the integrated SV2 path.

(I can link the WIP repo here once you’re open to the direction.)


Testing Plan

Unit / library

  • Encode/Decode round-trips for each SV2 message (happy path + bounds).
  • Fuzzish length checks (reject truncated/oversized frames).
  • Cross-platform event loop smoketests (tests/test_sv2_adapter.c).
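The "reject truncated/oversized frames" check can be sketched as a pure function over a receive buffer. sv2_frame_complete and the 1 MiB cap are illustrative assumptions, not the actual adapter API:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

#define SV2_HDR_LEN 6
#define SV2_MAX_PAYLOAD (1u << 20)  /* arbitrary sanity cap for this sketch */

/* Hypothetical completeness check: given buflen bytes of a stream buffer,
 * decide whether a full frame (6-byte header + u24 payload) is present.
 * Returns 0 (complete, *len = payload length), -1 (need more bytes),
 * or -2 (oversized frame; caller should drop the connection). */
static int sv2_frame_complete(const uint8_t *buf, size_t buflen, uint32_t *len)
{
    if (buflen < SV2_HDR_LEN)
        return -1;
    *len = (uint32_t)buf[3] | ((uint32_t)buf[4] << 8) | ((uint32_t)buf[5] << 16);
    if (*len > SV2_MAX_PAYLOAD)
        return -2;
    if (buflen < SV2_HDR_LEN + (size_t)*len)
        return -1;
    return 0;
}
```

Fuzz-style tests then feed random truncations and inflated length fields and assert the function never reports a complete frame it shouldn't.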

Integration (local)

  • sv2_dummy_server.py ↔ C client (handshake/open/notify/submit).
  • Python bridge: cpuminer → SV1 bridge → dummy SV2 → verify submits and notifies.

Integration (DATUM)

  • DATUM (SV2) ↔ Python bridge ↔ cpuminer (SV1). Confirm:

    • miners receive valid mining.notify
    • submit path → SV2 SubmitSharesExtended → DATUM share pipeline
    • job switch on new block / prevhash updates
  • Eventually, native SV2 firmware (e.g., BraiinsOS+) connecting directly to DATUM (no bridge).

Perf / stability

  • Run bridge and DATUM SV2 under load with configurable clients and paced job updates.
  • Validate no regressions to existing SV1 path.

Milestones & PR Strategy

  1. PR 1: Add src/sv2/* library (no DATUM linkage), ENABLE_SV2 build flag, tests, and ffi target.
  2. PR 2: Add datum_sv2.{h,c} integration shim + config read; launch listener behind stratum_v2_enable.
  3. PR 3: Wire job broadcast (prevhash + new job) from existing template pipeline.
  4. PR 4: Wire share submit callback → current validation + submitblock; metrics & logging.
  5. PR 5: Docs + examples; optional bridge usage guide.

I’ll keep functional changes small and reviewable per PR.


API Details (enc/dec we use)

  • Common

    • SetupConnection / SetupConnection.Success
  • Mining (extended)

    • OpenExtendedMiningChannel / .Success (fields: channel_id, extranonce2_size, target)
    • SetNewPrevHash (fields: job_id, prev_hash, ntime, clean_jobs=true)
    • NewExtendedMiningJob (fields: job_id, version, merkle_root, nbits, coinb1, coinb2, clean_jobs)
    • SubmitSharesExtended (fields: channel_id, job_id, nonce, ntime, version, extranonce2) → .Success/.Error

We’ll keep the binary wire minimal and aligned with the upstream SV2 draft used by open-source projects.
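To illustrate the enc/dec surface, here is a sketch of a SubmitSharesExtended payload encoder following the field list above (u32 fields little-endian, extranonce2 as a length-prefixed blob). This mirrors the document's field list, not the exact upstream SV2 layout, and enc_submit_ext is a hypothetical name:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Illustrative payload encoder for the SubmitSharesExtended field list:
 * channel_id, job_id, nonce, ntime, version (u32 LE each), then a
 * B0_255-style length-prefixed extranonce2. Returns bytes written, or 0
 * if the output buffer is too small. */
static size_t enc_submit_ext(uint8_t *out, size_t cap,
                             uint32_t channel_id, uint32_t job_id,
                             uint32_t nonce, uint32_t ntime, uint32_t version,
                             const uint8_t *en2, uint8_t en2_len)
{
    size_t need = 5 * 4 + 1 + (size_t)en2_len;
    if (cap < need)
        return 0;
    uint32_t v[5] = { channel_id, job_id, nonce, ntime, version };
    size_t off = 0;
    for (int i = 0; i < 5; i++) {
        out[off++] = (uint8_t)(v[i] & 0xff);
        out[off++] = (uint8_t)((v[i] >> 8) & 0xff);
        out[off++] = (uint8_t)((v[i] >> 16) & 0xff);
        out[off++] = (uint8_t)((v[i] >> 24) & 0xff);
    }
    out[off++] = en2_len;               /* length prefix */
    memcpy(out + off, en2, en2_len);
    return off + en2_len;
}
```

The matching decoder plus a round-trip assertion per message is the pattern the unit tests would follow.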


Configuration (examples)

# sv1_to_sv2_bridge/config.yaml
listen: "0.0.0.0:13333"
sv2_upstream: "127.0.0.1:3334"
client_id: "datum-bridge"
sv2_flags: 0
hashrate_ths: 100.0
sv2_lib_path: "./libsv2wire.dylib"   # or .so
# datum.conf
stratum_v2_enable = true
stratum_v2_listen = 0.0.0.0:3334
stratum_v2_min_diff = 1

Open Questions for Maintainers

  • Surface area: Is the proposed datum_sv2_* API minimal/comfortable for DATUM’s architecture?
  • Job data: Are coinb1/coinb2/merkle_root/nbits/version readily available in the current job path (they appear to be for SV1 already)?
  • Channel type: Would you prefer we ship Extended channel only in MVP, or support Standard fallback in v1?
  • Config defaults: Preferred default port & naming for SV2 listener?
  • Submit accounting: Any particular fields you want surfaced for stats/telemetry beyond what SV1 already tracks?

Ask

Please let me know if you Concept ACK this direction. If so, I’ll proceed with PR 1 (library drop + tests + build flag) and keep changes incremental and reviewable. I’ll link my working repo/branch here as soon as you’re okay with the plan.

Thanks!
