feat: session 2 — spawn, timer, monitor, perf optimizations#29
feat: session 2 — spawn, timer, monitor, perf optimizations#29
Conversation
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add build_callback() and spawn_binance_feed() with Arc<AtomicU64> counters for event_count and drop_count. Fix clippy lints in Timestamp::now() (missing panics doc, cast truncation). Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…ssure Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Passive utility struct that tracks event emission counters from feed spawn wrappers and detects staleness. Designed for the engine to call check_all() on each Timer(Periodic) event. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add property-based fuzz tests ensuring BinanceDecoder and PolymarketMarketDecoder never panic on arbitrary byte input. Critical for HFT resilience against malformed market data. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…nused fields - Add sonic-rs as default JSON parser (1.5-2x faster than simd-json by skipping tape intermediate stage). Three-tier fallback: sonic-rs > simd-json > serde_json, controlled by feature flags. - Replace peek_stream() byte scan with combined_stream bool field on BinanceDecoder, set at construction based on mapping count. Eliminates 16% hotspot on single-symbol feeds. - Remove unused T (trade_time) and E (event_time) fields from BinanceBookTicker schema — serde skips unknown fields, saving ~3.6% parse time per message. - Update e2e tests to use combined-stream JSON format for multi-symbol decoders. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Transport Phase C/D completion: - Timestamp::now() (std-gated wall-clock access) - spawn_binance_feed / spawn_polymarket_market_feed with backpressure (FnMut(HotEvent) -> bool, drop_count tracking, FeedSpawnResult) - TimerThread — periodic Timer(Periodic) + Heartbeat events (config validation, Drop impl, interruptible shutdown) - FeedMonitor — stale feed detection via event_count comparison (first-check baseline, recovery, MAX_FEEDS=8) - Book snapshot truncation warning (tracing::warn at 64 levels) Performance optimizations: - sonic-rs as default JSON parser (1.5-2x faster than simd-json, zero internal allocations) - Optimized parse_decimal_bytes: single-pass, i64-only, POW10 lookup (eliminated i128 widening + triple-scan + multiply loop) - Eliminated peek_stream overhead for single-symbol feeds - Removed unused T/E fields from Binance schema Benchmarks (Linux isolated core, criterion): - Binance bookTicker: 272ns/msg (was 307ns, -11%) - Polymarket price_change: 207ns/msg (was 306ns, -32%) Testing: - 18 e2e + stress tests (1M messages, backpressure, multi-symbol) - 7 timer tests, 8 monitor tests - Bolero property tests (arbitrary bytes → never panic) - READMEs for mantis-binance, mantis-polymarket - PROGRESS.md updated with section 1.13 Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
|
Warning Rate limit exceeded
Your organization is not enrolled in usage-based pricing. Contact your admin to enable usage-based pricing to continue reviews beyond the rate limit, or try again in 1 minutes and 15 seconds. ⌛ How to resolve this issue?After the wait time has elapsed, a review can be triggered using the We recommend that you space out your commits to avoid hitting the rate limit. 🚦 How do rate limits work?CodeRabbit enforces hourly rate limits for each developer per organization. Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout. Please see our FAQ for further information. ℹ️ Review info⚙️ Run configurationConfiguration used: defaults Review profile: CHILL Plan: Pro Run ID: 📒 Files selected for processing (5)
📝 WalkthroughWalkthroughAdds workspace Changes
Sequence Diagram(s)sequenceDiagram
participant WS as WebSocket Thread
participant CB as Transport Callback
participant Dec as Decoder
participant Q as SPSC Queue
participant Monitor as FeedMonitor
WS->>CB: read_bytes(&mut buf)
CB->>Dec: decode(&mut buf, timestamp, &mut out)
Dec->>Dec: parse JSON (sonic-rs/simd-json/serde)
Dec->>Dec: symbol/token lookup & convert ticks/lots
Dec-->>CB: return n HotEvent(s)
CB->>CB: event_count += n
loop for each HotEvent
CB->>Q: push(event)
alt push succeeds
Q->>Engine: event available
else push fails
CB->>CB: drop_count += 1
end
end
CB->>Monitor: Monitor observes event_count (periodic)
CB-->>WS: return true
Estimated code review effort🎯 4 (Complex) | ⏱️ ~75 minutes Possibly related PRs
Suggested reviewers
Poem
🚥 Pre-merge checks | ✅ 3✅ Passed checks (3 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings. ✨ Finishing Touches🧪 Generate unit tests (beta)
Comment |
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Benchmark ReportCommit: Sequence Lock (mantis-seqlock)LinuxCPU: seqlock_read_contended
seqlock_read_uncontended
seqlock_write
macOSCPU: seqlock_read_contended
seqlock_read_uncontended
seqlock_write
Fixed-Point Arithmetic (mantis-fixed)LinuxCPU: fixed_checked_add
fixed_checked_div
fixed_checked_mul_trunc
fixed_decimal_parse
fixed_display
fixed_mul_round_vs_trunc
fixed_parse
fixed_rescale
macOSCPU: fixed_checked_add
fixed_checked_div
fixed_checked_mul_trunc
fixed_decimal_parse
fixed_display
fixed_mul_round_vs_trunc
fixed_parse
fixed_rescale
Market-State Engine (mantis-market-state)LinuxCPU: market_state_array_book
market_state_engine
macOSCPU: market_state_array_book
market_state_engine
|
- Fixed parse.rs: backtick doc_markdown, let..else, merged match arms, missing #Errors section on from_str_decimal - Fixed e2e tests: expect_used annotations, unused imports, doc backticks, print_stderr/cast_precision_loss file-level expects Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
There was a problem hiding this comment.
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
crates/transport/README.md (1)
135-140:⚠️ Potential issue | 🟡 MinorRemove the stale transport-level JSON feature entry.
Earlier sections say transport only forwards raw bytes and venue crates own JSON parsing, so the
simd-jsonrow is misleading here. If parsing has moved out of this crate, the feature table should only document transport features.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/transport/README.md` around lines 135 - 140, Remove the stale transport-level JSON feature entry by deleting the `simd-json` row from the Features table in the README and leaving only transport-relevant features (e.g., the `tuning` row); search for and remove any other references to the `simd-json` feature in this crate's README or feature documentation so the docs reflect that JSON parsing is handled by venue crates, keeping references to `tuning` and any real transport features intact.
🧹 Nitpick comments (2)
crates/types/src/timestamp.rs (1)
103-109: Consider renaming test tonow_is_non_decreasing.
SystemTimeis wall-clock time and not guaranteed to be monotonic (NTP adjustments can cause backward jumps). The test assertionb >= ais correct for a quick sanity check, but the namenow_is_monotonicimplies a stronger guarantee thanSystemTimeprovides. Consider renaming tonow_is_non_decreasing_in_quick_successionor similar to avoid confusion.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/types/src/timestamp.rs` around lines 103 - 109, Rename the test function now_is_monotonic to a name that does not imply a monotonic clock (e.g., now_is_non_decreasing or now_is_non_decreasing_in_quick_succession) to reflect that Timestamp::now() uses wall-clock SystemTime; update the test declaration fn now_is_monotonic() -> fn now_is_non_decreasing() (or chosen name) and keep the body using Timestamp::now() and the same assert to preserve behavior.crates/polymarket/src/market/decoder.rs (1)
175-185: Avoid double truncation warnings for a single snapshot.If bids already hit the 64 cap, the asks loop currently emits the same warning again. Consider logging once per message to reduce noise.
Diff suggestion
let mut count: usize = 0; let total_levels = msg.bids.len() + msg.asks.len(); + let mut truncation_warned = false; for (depth_idx, level) in msg.bids.iter().enumerate() { if count >= 64 { - tracing::warn!( - asset_id = msg.asset_id, - total_levels, - emitted = 64, - "book snapshot truncated at 64 events" - ); + if !truncation_warned { + tracing::warn!( + asset_id = msg.asset_id, + total_levels, + emitted = 64, + "book snapshot truncated at 64 events" + ); + truncation_warned = true; + } break; } @@ for (depth_idx, level) in msg.asks.iter().enumerate() { if count >= 64 { - tracing::warn!( - asset_id = msg.asset_id, - total_levels, - emitted = 64, - "book snapshot truncated at 64 events" - ); + if !truncation_warned { + tracing::warn!( + asset_id = msg.asset_id, + total_levels, + emitted = 64, + "book snapshot truncated at 64 events" + ); + truncation_warned = true; + } break; }Also applies to: 210-218
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/polymarket/src/market/decoder.rs` around lines 175 - 185, The snapshot truncation warning is emitted twice when both bids and asks iterate after count hits 64; modify the logic in the loops that iterate msg.bids and msg.asks inside decoder.rs (the code blocks around the bids loop starting with "for (depth_idx, level) in msg.bids.iter().enumerate()" and the corresponding asks loop) to emit the warning only once per message by introducing a per-message boolean flag (e.g., warned_truncated or emitted_truncation_warn) initialized before the loops, check it before calling tracing::warn!, set it to true when you log, and reuse that flag in both loops so the warning is suppressed on the second truncation attempt.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@crates/binance/README.md`:
- Around line 9-19: The markdown fence that begins with the ASCII diagram
starting "COLD SIDE (transport thread) HOT SIDE (SPSC consumer)" is
missing a language tag; update the opening ``` to ```text so the block is
labeled (e.g., change the fence before that diagram to ```text) to satisfy
markdownlint MD040.
- Around line 54-90: Update the README wording to reflect the current
implementation: remove/replace claims that the decoder calls peek_stream() for
combined-stream detection and that simd-json is the default; instead state that
combined-stream detection is performed outside the decoder (so decoders consume
already-unwrapped events) and that the crate now defaults to the non-simd parser
(list the actual default parser), and update the Features and Combined Stream
Detection sections to reference BinanceDecoder<D>, BinanceSymbolMapping<'a, D>,
and DecoderError only for what they currently do (stateful decoding and error
types) and to note that peek_stream() is no longer part of the decoder path and
the simd-json feature is opt-in rather than the default.
In `@crates/binance/src/decoder.rs`:
- Around line 94-105: The code currently silently truncates m.symbol when its
byte length exceeds MAX_SYMBOL_LEN (16), which can cause lookup failures; update
the decoding logic that handles symbol_names/symbol_lens to first check if
m.symbol.as_bytes().len() > MAX_SYMBOL_LEN and return an
Err(DecoderError::SymbolTooLong) instead of truncating, and add a new
DecoderError::SymbolTooLong variant (with the provided doc comment) to the
DecoderError enum so callers can handle oversized symbols explicitly.
In `@crates/binance/tests/e2e_decode.rs`:
- Around line 168-204: The stress tests (e.g., the test function
stress_1m_messages) should not run as part of regular cargo test; either mark
them to be skipped by default or move them to a proper benchmark harness: add a
skip attribute like #[ignore] or gate them behind a feature (e.g.,
#[cfg_attr(not(feature = "perf"), ignore)] or #[cfg(feature = "perf")]) so
normal tests run fast, or relocate the large-loop/perf logic into a
Criterion/benches benchmark crate and keep only small functional assertions in
the unit tests (apply the same change to the other large/stress tests referenced
in the diff).
In `@crates/transport/src/timer.rs`:
- Around line 118-129: Add a new TimerConfigError::HeartbeatTooLarge variant
and, after computing tick_ns and hb_ns and after the divisibility check (where
TimerConfigError::HeartbeatNotDivisible is returned), check that (hb_ns /
tick_ns) <= u64::MAX; if it exceeds u64::MAX return
TimerConfigError::HeartbeatTooLarge instead of casting; only then perform the
safe cast into heartbeat_every (the variable currently set via (hb_ns / tick_ns)
as u64) so you avoid silent truncation and the potential zero divisor panic when
calling methods like tick_counter.is_multiple_of(heartbeat_every).
---
Outside diff comments:
In `@crates/transport/README.md`:
- Around line 135-140: Remove the stale transport-level JSON feature entry by
deleting the `simd-json` row from the Features table in the README and leaving
only transport-relevant features (e.g., the `tuning` row); search for and remove
any other references to the `simd-json` feature in this crate's README or
feature documentation so the docs reflect that JSON parsing is handled by venue
crates, keeping references to `tuning` and any real transport features intact.
---
Nitpick comments:
In `@crates/polymarket/src/market/decoder.rs`:
- Around line 175-185: The snapshot truncation warning is emitted twice when
both bids and asks iterate after count hits 64; modify the logic in the loops
that iterate msg.bids and msg.asks inside decoder.rs (the code blocks around the
bids loop starting with "for (depth_idx, level) in msg.bids.iter().enumerate()"
and the corresponding asks loop) to emit the warning only once per message by
introducing a per-message boolean flag (e.g., warned_truncated or
emitted_truncation_warn) initialized before the loops, check it before calling
tracing::warn!, set it to true when you log, and reuse that flag in both loops
so the warning is suppressed on the second truncation attempt.
In `@crates/types/src/timestamp.rs`:
- Around line 103-109: Rename the test function now_is_monotonic to a name that
does not imply a monotonic clock (e.g., now_is_non_decreasing or
now_is_non_decreasing_in_quick_succession) to reflect that Timestamp::now() uses
wall-clock SystemTime; update the test declaration fn now_is_monotonic() -> fn
now_is_non_decreasing() (or chosen name) and keep the body using
Timestamp::now() and the same assert to preserve behavior.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 36454b01-efd3-45b3-8424-e2b945b7f603
⛔ Files ignored due to path filters (1)
Cargo.lockis excluded by!**/*.lock
📒 Files selected for processing (27)
Cargo.tomlcrates/bench/Cargo.tomlcrates/bench/benches/decode.rscrates/binance/Cargo.tomlcrates/binance/README.mdcrates/binance/src/decoder.rscrates/binance/src/lib.rscrates/binance/src/schema.rscrates/binance/src/spawn.rscrates/binance/tests/e2e_decode.rscrates/fixed/src/parse.rscrates/polymarket/Cargo.tomlcrates/polymarket/README.mdcrates/polymarket/src/market/decoder.rscrates/polymarket/src/market/mod.rscrates/polymarket/src/market/spawn.rscrates/polymarket/tests/e2e_decode.rscrates/transport/README.mdcrates/transport/src/feed.rscrates/transport/src/lib.rscrates/transport/src/monitor.rscrates/transport/src/timer.rscrates/types/src/timestamp.rscrates/verify/Cargo.tomlcrates/verify/src/decoder_props.rscrates/verify/src/lib.rsdocs/PROGRESS.md
- Add SymbolTooLong error to BinanceDecoder, reject oversized symbols - Add HeartbeatTooLarge bounds check to TimerConfig validation - Mark all stress tests #[ignore = "stress test"] in both venue crates - Deduplicate truncation warning in Polymarket book decoder - Rename now_is_monotonic test to now_is_non_decreasing - Fix binance README: text fence, sonic-rs default, combined_stream docs - Remove stale simd-json entry from transport README features table Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
|
@coderabbitai All 5 actionable issues + 2 nitpicks addressed in commit 03e5175:
|
|
Tip For best results, initiate chat on the files or code changes.
Everything looks solid. I'll go ahead and resolve the open review comments. [resolve] (∩`-´)⊃━☆゚.*・。゚ |
Codecov Report❌ Patch coverage is Additional details and impacted files@@ Coverage Diff @@
## main #29 +/- ##
==========================================
- Coverage 95.57% 95.35% -0.22%
==========================================
Files 77 82 +5
Lines 10009 10893 +884
Branches 10009 10893 +884
==========================================
+ Hits 9566 10387 +821
- Misses 393 447 +54
- Partials 50 59 +9
🚀 New features to boost your workflow:
|
There was a problem hiding this comment.
Actionable comments posted: 4
🧹 Nitpick comments (4)
crates/transport/src/timer.rs (1)
299-301: Replace wildcardmatches!patterns with explicit destructuring.At Line 300 and Line 334,
EventBody::Timer(_)/EventBody::Heartbeat(_)violates the project’s Rust pattern-matching rule.Suggested refactor
- let timer_count = collected - .iter() - .filter(|e| matches!(e.body, EventBody::Timer(_))) - .count(); + let timer_count = collected + .iter() + .filter(|e| { + if let EventBody::Timer(timer_payload) = &e.body { + let _explicit_binding = timer_payload; + true + } else { + false + } + }) + .count(); - let hb_count = collected - .iter() - .filter(|e| matches!(e.body, EventBody::Heartbeat(_))) - .count(); + let hb_count = collected + .iter() + .filter(|e| { + if let EventBody::Heartbeat(heartbeat_payload) = &e.body { + let _explicit_binding = heartbeat_payload; + true + } else { + false + } + }) + .count();As per coding guidelines: "No wildcard matches — use explicit destructuring in pattern matching".
Also applies to: 333-335
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/transport/src/timer.rs` around lines 299 - 301, The matches! wildcards should be replaced with explicit destructuring: change uses of matches!(e.body, EventBody::Timer(_)) and matches!(e.body, EventBody::Heartbeat(_)) to pattern-match the inner binding (e.g., match e.body { EventBody::Timer(timer) => true, _ => false } or use if let EventBody::Timer(timer) = &e.body { … }) so the inner value is explicitly named; update both occurrences referencing EventBody::Timer and EventBody::Heartbeat and replace the anonymous `_` with a concrete binding name (e.g., timer, hb) wherever e.body is tested in iterator/filter chains.crates/fixed/src/parse.rs (1)
158-165: Uselet...elsefor the feed fast path.This
matchis only handling the early-return case, solet Ok(next) = acc.feed(bytes[i]) else { return Err(e); };keeps the happy path flatter and matches the repo style.As per coding guidelines "Use
let...elsefor early returns, keep happy path unindented".🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/fixed/src/parse.rs` around lines 158 - 165, The loop uses a match to handle the early-return case when calling acc.feed(bytes[i]); replace it with a let...else to keep the happy path flat: call acc.feed(bytes[i]) with let Ok(next) = acc.feed(bytes[i]) else { return Err(e); }; then set acc = next. Update the block surrounding Accumulator::new, acc.feed, bytes, i, and start accordingly so the early error returns via else and the successful path is unindented.crates/polymarket/tests/e2e_decode.rs (2)
35-78: Returnid_upandid_downfromtest_registry().The later assertions are coupled to
InstrumentId::from_raw(1/2)even though this helper already knows the actual inserted IDs. Returning them would make the tests resilient to registry ID allocation changes, and it would also preventstress_500k_alternating_tokensfrom treating any non-up ID as “down”.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/polymarket/tests/e2e_decode.rs` around lines 35 - 78, The helper test_registry currently returns only a leaked InstrumentRegistry but knows the inserted IDs (id_up, id_down); change test_registry to return the two inserted InstrumentId values (e.g., (id_up, id_down) or a small struct) along with or instead of the leaked registry so callers can use the actual ids; update all call sites (tests like stress_500k_alternating_tokens and any assertions that assume InstrumentId::from_raw(1/2)) to consume the returned ids rather than hard-coding raw ids; keep identifiers id_up, id_down, test_registry, and any bindings (bind_polymarket_current / insert) as reference points when making the change.
171-180: Avoid wildcard fallback arms in these assertions.Bind the unexpected variant instead of using
_so failures report what came back and the match stays aligned with the repo rule.As per coding guidelines "No wildcard matches — use explicit destructuring in pattern matching".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/polymarket/tests/e2e_decode.rs` around lines 171 - 180, The assertions use a wildcard fallback arm which hides the actual unexpected variant; change the match arms on out[0].body and out[2].body to bind the unexpected variant (e.g., name it `other`) instead of `_` and include that bound value in the panic message so failures display what was returned; keep the successful arm as `EventBody::BookDelta(bd)` and continue to assert `bd.side == Side::Bid`.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@crates/binance/README.md`:
- Around line 83-84: Update the DecoderError documentation to include the
missing enum variant SymbolTooLong: find the table row that lists DecoderError
(currently showing TooManySymbols and EmptyMappings) and add SymbolTooLong to
the list of possible errors so the README matches the variants exposed by the
DecoderError enum in decoder.rs; ensure the entry format matches the existing
style and, if there is an explanatory paragraph elsewhere for DecoderError, add
a short sentence describing when SymbolTooLong is returned.
In `@crates/fixed/src/parse.rs`:
- Around line 76-92: The code currently increments self.total_digits for every
byte which rejects strings of many leading zeros; change the logic to count
significant magnitude instead of raw digits by only incrementing the digit count
once the mantissa has become non-zero or the current digit is non-zero (i.e.
start counting after the first non-zero digit or once you've already started
counting significant digits). Update the overflow guard to use that significant
count (or a new significant_digits field) when deciding to return
ParseFixedError::Overflow, and keep the existing checked_mul/checked_sub updates
to self.mantissa (in the block around self.total_digits, self.mantissa, and
ParseFixedError::Overflow) so numeric overflow is still detected from arithmetic
growth rather than raw byte length.
In `@crates/transport/README.md`:
- Around line 116-122: The fenced diagram block currently unlabeled (the block
showing "spawn() -> connect -> subscribe -> read loop --> callback" and related
lines) should have a language tag to satisfy MD040; change the opening fence
from ``` to ```text so the block is labeled (e.g., use ```text before the lines
describing spawn(), reconnect/backoff, and shutdown()) to improve markdown
tooling and linting.
- Around line 86-95: The example passed a closure that returns a bool (|event|
ring.try_push(event).is_ok()) to TimerThread::spawn, but spawn expects an
FnMut(HotEvent) -> () unit callback; change the closure used with
TimerThread::spawn so it discards or handles the Result from ring.try_push and
returns unit (for example by ignoring the boolean with let _ =
ring.try_push(event) or matching the result), keeping the same TimerConfig and
callback argument signature; update the snippet around TimerThread::spawn,
TimerConfig, and the closure to ensure the closure returns () instead of bool.
---
Nitpick comments:
In `@crates/fixed/src/parse.rs`:
- Around line 158-165: The loop uses a match to handle the early-return case
when calling acc.feed(bytes[i]); replace it with a let...else to keep the happy
path flat: call acc.feed(bytes[i]) with let Ok(next) = acc.feed(bytes[i]) else {
return Err(e); }; then set acc = next. Update the block surrounding
Accumulator::new, acc.feed, bytes, i, and start accordingly so the early error
returns via else and the successful path is unindented.
In `@crates/polymarket/tests/e2e_decode.rs`:
- Around line 35-78: The helper test_registry currently returns only a leaked
InstrumentRegistry but knows the inserted IDs (id_up, id_down); change
test_registry to return the two inserted InstrumentId values (e.g., (id_up,
id_down) or a small struct) along with or instead of the leaked registry so
callers can use the actual ids; update all call sites (tests like
stress_500k_alternating_tokens and any assertions that assume
InstrumentId::from_raw(1/2)) to consume the returned ids rather than hard-coding
raw ids; keep identifiers id_up, id_down, test_registry, and any bindings
(bind_polymarket_current / insert) as reference points when making the change.
- Around line 171-180: The assertions use a wildcard fallback arm which hides
the actual unexpected variant; change the match arms on out[0].body and
out[2].body to bind the unexpected variant (e.g., name it `other`) instead of
`_` and include that bound value in the panic message so failures display what
was returned; keep the successful arm as `EventBody::BookDelta(bd)` and continue
to assert `bd.side == Side::Bid`.
In `@crates/transport/src/timer.rs`:
- Around line 299-301: The matches! wildcards should be replaced with explicit
destructuring: change uses of matches!(e.body, EventBody::Timer(_)) and
matches!(e.body, EventBody::Heartbeat(_)) to pattern-match the inner binding
(e.g., match e.body { EventBody::Timer(timer) => true, _ => false } or use if
let EventBody::Timer(timer) = &e.body { … }) so the inner value is explicitly
named; update both occurrences referencing EventBody::Timer and
EventBody::Heartbeat and replace the anonymous `_` with a concrete binding name
(e.g., timer, hb) wherever e.body is tested in iterator/filter chains.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: cd81f372-84c7-4651-92ff-950025faed37
📒 Files selected for processing (9)
crates/binance/README.mdcrates/binance/src/decoder.rscrates/binance/tests/e2e_decode.rscrates/fixed/src/parse.rscrates/polymarket/src/market/decoder.rscrates/polymarket/tests/e2e_decode.rscrates/transport/README.mdcrates/transport/src/timer.rscrates/types/src/timestamp.rs
✅ Files skipped from review due to trivial changes (1)
- crates/binance/tests/e2e_decode.rs
🚧 Files skipped from review as they are similar to previous changes (2)
- crates/types/src/timestamp.rs
- crates/polymarket/src/market/decoder.rs
…g zeros - Increment event_count on every callback invocation (not just successful decodes) so FeedMonitor tracks feed liveness for non-hot message types in both Polymarket and Binance spawn modules. - Sync BinanceDecoder combined_stream mode from config.symbols.len() in spawn_binance_feed to prevent stream format mismatch when mapping count differs from symbol count. - Skip leading zeros in parse_decimal_bytes accumulator so inputs like "00000000000000000001" are not falsely rejected as overflow. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Add SymbolTooLong to DecoderError docs in binance README - Fix timer example callback return type (unit, not bool) - Add text language tag to transport README code fence Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
|
@coderabbitai New comments addressed in commits 8c8c67f and 1366fe3:
|
|
Tip For best results, initiate chat on the files or code changes. 🧩 Analysis chain🏁 Script executed: Length of output: 1469
Everything is in order. I'll go ahead and resolve the remaining open comments. [resolve] (∩`-´)⊃━☆゚.*・。゚ 🧠 Learnings used |
Fixes rustdoc error: public doc for SymbolTooLong links to private item. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Increased intervals (50ms tick, 200ms heartbeat) and sleep time (1500ms) to handle slow CI runners. Reduced assertion to >= 1. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
There was a problem hiding this comment.
Actionable comments posted: 3
🧹 Nitpick comments (1)
crates/fixed/src/parse.rs (1)
200-208: Uselet...elsefor thechecked_negearly return.This branch is a straight early-return case;
let...elsekeeps the happy path flat and matches the style already used forchecked_mulbelow.♻️ Proposed refactor
let signed_mantissa = if negative { // Already negative — keep as-is acc.mantissa } else { // Negate to get positive value - match acc.mantissa.checked_neg() { - Some(v) => v, - None => return Err(ParseFixedError::Overflow), - } + let Some(v) = acc.mantissa.checked_neg() else { + return Err(ParseFixedError::Overflow); + }; + v };As per coding guidelines,
**/*.rs: Uselet...elsefor early returns, keep happy path unindented.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/fixed/src/parse.rs` around lines 200 - 208, Replace the match-based early return in the negative branch that handles acc.mantissa.checked_neg() with a let...else pattern to keep the happy path unindented: inside the negative branch of the signed_mantissa assignment, write let v = acc.mantissa.checked_neg() else { return Err(ParseFixedError::Overflow); } and then use v as the negated value (preserving the current semantics of signed_mantissa and acc.mantissa).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@crates/fixed/src/parse.rs`:
- Around line 77-82: The parser increments the u8 field `self.frac_digits`
unguarded (e.g. inside the branch when `self.saw_dot`) which can overflow for
long fractional sequences before the later `ExcessPrecision` check; change the
increments to a saturating increment (e.g. `self.frac_digits =
self.frac_digits.saturating_add(1)` or `if self.frac_digits != u8::MAX {
self.frac_digits += 1 }`) so the counter never wraps/panics, and apply this same
guarded/saturating increment at the other occurrences noted (the other
`self.saw_dot`/`frac_digits` increments around the lines referenced). Ensure
subsequent precision checks treat the saturated value correctly.
In `@crates/polymarket/src/market/spawn.rs`:
- Around line 18-19: Doc comment for event_count is inaccurate: it says
"successfully decoded events" but the code increments event_count on every
received message/callback. Either update the doc on the struct field event_count
to say it counts received messages/callback invocations (used for FeedMonitor
registration), or if the original intent was to count only successful decodes,
move the increment out of the generic receive/callback path and increment
event_count only after a successful decode/parse result (the decode success path
in the message handling function). Refer to the field event_count
(Arc<AtomicU64>) and the receive/callback handler where the counter is currently
incremented to make the corresponding change.
In `@crates/transport/README.md`:
- Around line 109-111: The README example incorrectly treats
FeedMonitor::check_all() as an iterator; instead update the example to capture
its usize return value (e.g., let stale_count = monitor.check_all();) and use
that count, or if you intended to iterate stale entries, call the correct
iterator method (e.g., FeedMonitor::stale_iter or check_stale() if present) —
replace the for-loop example with a code snippet that either uses the returned
usize from check_all() or the actual method that returns iterable stale entries.
---
Nitpick comments:
In `@crates/fixed/src/parse.rs`:
- Around line 200-208: Replace the match-based early return in the negative
branch that handles acc.mantissa.checked_neg() with a let...else pattern to keep
the happy path unindented: inside the negative branch of the signed_mantissa
assignment, write let v = acc.mantissa.checked_neg() else { return
Err(ParseFixedError::Overflow); } and then use v as the negated value
(preserving the current semantics of signed_mantissa and acc.mantissa).
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 6997a338-44c0-4ff2-95fe-487cdfb61133
📒 Files selected for processing (7)
crates/binance/README.mdcrates/binance/src/decoder.rscrates/binance/src/spawn.rscrates/fixed/src/parse.rscrates/polymarket/src/market/spawn.rscrates/transport/README.mdcrates/transport/src/timer.rs
✅ Files skipped from review due to trivial changes (1)
- crates/binance/README.md
🚧 Files skipped from review as they are similar to previous changes (2)
- crates/binance/src/spawn.rs
- crates/transport/src/timer.rs
All timer tests now use 50ms+ tick intervals and 1000-1500ms sleep to handle slow CI runners (macOS GitHub Actions VMs). Previous 10ms intervals caused flaky failures. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Saturating increment for frac_digits (u8 overflow on 256+ frac zeros) - Fix event_count doc: "callback invocations" not "decoded events" - Fix FeedMonitor README example: check_all() returns usize, not iterator - Fix Duration::from_millis(1000) → from_secs(1) clippy lint Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
|
@coderabbitai Round 3 addressed in 623cf93:
|
|
Tip For best results, initiate chat on the files or code changes. 🧩 Analysis chain🏁 Script executed: Length of output: 2987
All looks solid. Resolving the open comments now. [resolve] (∩`-´)⊃━☆゚.*・。゚ 🧠 Learnings used |
There was a problem hiding this comment.
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
crates/fixed/src/parse.rs (1)
153-156:⚠️ Potential issue | 🔴 CriticalAdd an early guard to enforce
D <= 18inparse_decimal_bytes.The method currently bypasses the
_BOUND_CHECKby not accessingSelf::SCALE. Since it constructs values viaSelf::from_raw(...), a caller can invokeFixedI64::<19>::parse_decimal_bytes(...)and construct an unsupported scale factor. Thepad > 18check at line 193 only prevents out-of-boundsPOW10_I64indexing—it does not validate the type parameterDitself.To fix: either add an explicit early return on
D > 18, or accessSelf::SCALE(e.g., in a const initialization) to trigger the compile-time assertion and make the constraint part of the method's implicit precondition.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/fixed/src/parse.rs` around lines 153 - 156, The parse_decimal_bytes method can be used to construct values with an out-of-range scale because it never triggers the compile-time _BOUND_CHECK; add an early guard to enforce D <= 18 by either returning Err(ParseFixedError::InvalidFormat) when Self::D (or the const generic D) is greater than 18 or by referencing Self::SCALE (e.g., read it into a local const) near the top of parse_decimal_bytes to force the compile-time assertion; ensure the check happens before calling Self::from_raw(...) so callers like FixedI64::<19>::parse_decimal_bytes(...) cannot bypass the constraint and keep the existing pad/POW10_I64 bounds checks intact.
🧹 Nitpick comments (1)
crates/polymarket/src/market/spawn.rs (1)
129-160: Add one callback test for the zero-decode admin path.This file covers hot-message delivery and push rejection, but not the intentional
decode() == 0case. Atick_size_changecallback test would lock in thatevent_countstill advances while noHotEventis emitted.Based on learnings: In
crates/polymarket/src/market/decoder.rs, thetick_size_changePolymarket WebSocket message type is intentionally NOT handled byPolymarketMarketDecoder.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@crates/polymarket/src/market/spawn.rs` around lines 129 - 160, Add a new unit test that exercises the "zero-decode admin path" by using PolymarketMarketDecoder and build_callback but passing a JSON payload with "type":"tick_size_change" (the message type intentionally not handled by PolymarketMarketDecoder) and a consumer closure that returns true/does nothing; verify that the event counter (event_count from build_callback) increments while no HotEvent is emitted: assert push_count remains 0, event_count increments to 1, and drop_count stays 0. Reference PolymarketMarketDecoder, build_callback, and the existing tests callback_decodes_and_pushes / callback_counts_drops to mirror setup and assertions.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Outside diff comments:
In `@crates/fixed/src/parse.rs`:
- Around line 153-156: The parse_decimal_bytes method can be used to construct
values with an out-of-range scale because it never triggers the compile-time
_BOUND_CHECK; add an early guard to enforce D <= 18 by either returning
Err(ParseFixedError::InvalidFormat) when Self::D (or the const generic D) is
greater than 18 or by referencing Self::SCALE (e.g., read it into a local const)
near the top of parse_decimal_bytes to force the compile-time assertion; ensure
the check happens before calling Self::from_raw(...) so callers like
FixedI64::<19>::parse_decimal_bytes(...) cannot bypass the constraint and keep
the existing pad/POW10_I64 bounds checks intact.
---
Nitpick comments:
In `@crates/polymarket/src/market/spawn.rs`:
- Around line 129-160: Add a new unit test that exercises the "zero-decode admin
path" by using PolymarketMarketDecoder and build_callback but passing a JSON
payload with "type":"tick_size_change" (the message type intentionally not
handled by PolymarketMarketDecoder) and a consumer closure that returns
true/does nothing; verify that the event counter (event_count from
build_callback) increments while no HotEvent is emitted: assert push_count
remains 0, event_count increments to 1, and drop_count stays 0. Reference
PolymarketMarketDecoder, build_callback, and the existing tests
callback_decodes_and_pushes / callback_counts_drops to mirror setup and
assertions.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: a051c395-e60b-4254-8978-2a0fa2787ae0
📒 Files selected for processing (5)
crates/binance/src/spawn.rscrates/fixed/src/parse.rscrates/polymarket/src/market/spawn.rscrates/transport/README.mdcrates/transport/src/timer.rs
🚧 Files skipped from review as they are similar to previous changes (1)
- crates/transport/src/timer.rs
| while i < bytes.len() { | ||
| acc = match acc.feed(bytes[i]) { | ||
| Ok(a) => a, | ||
| Err(e) => return Err(e), | ||
| }; | ||
| i += 1; |
There was a problem hiding this comment.
Return ExcessPrecision before the 20-digit overflow path.
FixedI64::<18>::parse_decimal_bytes(b"1.0000000000000000000") currently fails with Overflow, because the extra fractional digit is fed into total_digits before Line 186 runs. That input is out of scale, not out of range, so callers should still get ExcessPrecision.
🛠️ Suggested fix
let mut acc = Accumulator::new();
let mut i = start;
while i < bytes.len() {
- acc = match acc.feed(bytes[i]) {
+ let byte = bytes[i];
+ if acc.saw_dot && acc.frac_digits >= D && byte >= b'0' && byte <= b'9' {
+ return Err(ParseFixedError::ExcessPrecision);
+ }
+ acc = match acc.feed(byte) {
Ok(a) => a,
Err(e) => return Err(e),
};
i += 1;
}Also applies to: 185-187
Cover Display/Error impls for TimerConfigError, TimerSpawnError, and MonitorFullError. Test TimerThread Drop path, heartbeat-too-large validation, make_out() buffer size, and book snapshot truncation at 64 levels in the Polymarket decoder. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Summary
spawn_binance_feed/spawn_polymarket_market_feedwithFnMut(HotEvent) -> boolbackpressure andFeedSpawnResult(event_count + drop_count)peek_streamTimer(Periodic)+Heartbeatevents with config validation andDropimplparse_decimal_bytes: single-pass, i64-only, POW10 lookup (eliminated i128 widening + triple-scan)Timestamp::now(): std-gated wall-clock accessBenchmarks (Linux isolated core, criterion, target-cpu=native)
Perf profile (after optimization)
parse_decimal_bytesreduced from 20% → 10.5% of total decode timepeek_streameliminated for single-symbol feeds (was 16%)Test plan
cargo test --features alloc,stdgreenReview
Spec and plan reviewed by GPT-5.4 (gpt-5.4 via Codex CLI). All high/medium issues addressed.
🤖 Generated with Claude Code
Summary by CodeRabbit
New Features
Improvements
Tests & Quality