feat(sources/ljm): Create Data Source for LabJack LJM #448
Conversation
Walkthrough
Adds a new LabJack LJM DAQ source: workspace registration and project metadata, a calibration subsystem (linear and thermistor), an example config and a .gitignore update, and a streaming main loop that reads from a T7, applies calibrations, and logs and sends messages via Omnibus, plus tests and mocks.
Sequence Diagram(s)
sequenceDiagram
participant Main as Main Process (ljm/main.py)
participant Device as LabJack T7 (ljm)
participant Sensor as Sensor Manager (calibration.Sensor)
participant Calibration as Calibration Engine (calibration.*)
participant Omnibus as Omnibus Sender
participant Log as Log File
Main->>Device: openS() → handle
Main->>Sensor: setup(handle)
Sensor->>Device: set RANGE / NEGATIVE_CH per channel
Device-->>Sensor: ack
Sensor-->>Main: a_scan_list, num_addresses
Main->>Device: eStreamStart(scan_rate, a_scan_list)
Device-->>Main: streaming started
loop streaming read
Main->>Device: eStreamRead(num_addresses, scans_per_read)
Device-->>Main: aData (interleaved), backlog
Main->>Sensor: parse(interleaved samples)
Sensor->>Calibration: calibrate per sensor
Calibration-->>Sensor: calibrated values
Sensor-->>Main: data_parsed (dict with units)
Main->>Log: write(msgpack.packb(payload))
Log-->>Main: flushed
Main->>Omnibus: send(data_parsed)
Omnibus-->>Main: sent
end
Main->>Device: eStreamStop()
Main->>Device: close()
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Actionable comments posted: 9
🤖 Fix all issues with AI agents
In `@src/sources/ljm/calibration_test.py`:
- Around line 42-49: The test TestSensorParser.test_parsing instantiates a
Sensor which appends to the global Sensor.sensors list and never removes it,
leaking state; update the test to restore/clear Sensor.sensors after use (or
switch to a pytest fixture/teardown) so other tests aren't polluted — e.g.,
record the original Sensor.sensors, run the test creating
Sensor("Test"...)/Sensor.parse(...), then restore Sensor.sensors to its original
value (or call Sensor.sensors.clear()) in a finally/teardown block; reference
Sensor, Sensor.sensors and TestSensorParser.test_parsing when applying the
change.
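A minimal sketch of that teardown with unittest (the Sensor constructor arguments and the import path are assumptions, not taken from the PR):

```python
import unittest

from calibration import Sensor  # import path assumed from the PR layout


class TestSensorParser(unittest.TestCase):
    def setUp(self):
        # Snapshot the class-level registry so each test starts clean.
        self._original_sensors = list(Sensor.sensors)

    def tearDown(self):
        # Restore the registry even if the test body raised.
        Sensor.sensors[:] = self._original_sensors

    def test_parsing(self):
        ...  # instantiate Sensor("Test", ...) and call Sensor.parse(...) here
```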
In `@src/sources/ljm/calibration.py`:
- Around line 96-101: The calibrate method can divide by zero when value == 0;
add an early guard in calibrate to handle zero (or falsy) sensor readings before
computing R_therm: if value is 0 (or nearly 0) return the same safe fallback as
when R_therm <= 0 (e.g. 0) to avoid the ZeroDivisionError; then compute R_therm
= (self.voltage - value) / (value / self.resistance) and continue as before
using self.voltage, self.resistance, self.B, self.r_inf and the existing R_therm
<= 0 check.
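A sketch of that guard; the surrounding class and the final beta-model temperature formula are assumptions inferred from the attribute names, not the PR's actual code:

```python
import math


class Thermistor:
    def __init__(self, voltage: float, resistance: float, B: float, r_inf: float):
        self.voltage = voltage        # divider supply voltage
        self.resistance = resistance  # fixed divider resistor, in ohms
        self.B = B                    # thermistor beta constant
        self.r_inf = r_inf            # R0 * exp(-B / T0)

    def calibrate(self, value: float) -> float:
        # A (near-)zero reading would divide by zero below; use the same
        # fallback as the existing R_therm <= 0 branch.
        if abs(value) < 1e-9:
            return 0.0
        r_therm = (self.voltage - value) / (value / self.resistance)
        if r_therm <= 0:
            return 0.0
        # Beta-model inversion (assumed form): T = B / ln(R_therm / r_inf)
        return self.B / math.log(r_therm / self.r_inf)
```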
- Around line 109-110: Sensor.sensors is a mutable class-level list that
produces shared state across tests; change it to avoid accumulation by either
making it an instance attribute or adding a class-level reset/clear API: add a
classmethod like reset() or clear_sensors() on class Sensor that sets
Sensor.sensors = [] and call it from tests/teardown, or refactor Sensor to
initialize self.sensors in __init__ and remove the ClassVar declaration; update
any code referencing Sensor.sensors to use the chosen API
(Sensor.reset()/Sensor.clear_sensors() or instance attribute) so tests no longer
share state.
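A sketch of the reset-API option (the __init__ shown is simplified to just the registration step):

```python
from typing import ClassVar, Self


class Sensor:
    sensors: ClassVar[list[Self]] = []

    def __init__(self, name: str):
        self.name = name
        Sensor.sensors.append(self)

    @classmethod
    def reset(cls) -> None:
        """Clear the shared registry; call this from test teardown."""
        cls.sensors.clear()
```

Tests would then call Sensor.reset() in tearDown (or a pytest fixture) instead of mutating the list directly.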
In `@src/sources/ljm/main.py`:
- Around line 282-286: The cleanup code after starting the LJM stream can raise
NameError if ljm.openS() fails because handle won't be defined; modify the
function to ensure cleanup runs safely by moving the stream stop/close into a
finally block or guarding it with a check for a defined/valid handle and
stream_info (reference ljm.openS, handle, stream_info.done, ljm.eStreamStop,
ljm.close); ensure you set stream_info.done = True and only call
ljm.eStreamStop(handle) and ljm.close(handle) when handle exists and is valid to
avoid unhandled NameError during exception paths.
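One way to structure that, sketched under the assumption that setup and the read loop live in a single function; the openS() arguments are illustrative:

```python
from labjack import ljm


def run_stream(stream_info) -> None:
    handle = None
    try:
        handle = ljm.openS("T7", "ANY", "ANY")  # arguments assumed
        # ... sensor setup, ljm.eStreamStart(...), read loop ...
    finally:
        stream_info.done = True
        if handle is not None:
            # openS() succeeded, so there is a real handle to clean up.
            try:
                ljm.eStreamStop(handle)  # may still raise if the stream never started
            finally:
                ljm.close(handle)
```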
- Around line 196-199: The print call computes Rate as config.SCANS_PER_READ *
len(rates) / (time.time() - rates[0]) which can divide by zero if time.time() ==
rates[0]; fix by computing a safe elapsed = time.time() - rates[0], check if
elapsed <= 0 (or below a tiny epsilon) and choose a fallback displayed_rate
(e.g., 0 or float('inf')) when elapsed is not positive, otherwise compute
displayed_rate = config.SCANS_PER_READ * len(rates) / elapsed, and then use
displayed_rate in the existing print call that references rates and
config.SCANS_PER_READ.
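A hedged sketch of the safe computation, factored into a helper (names are illustrative; the caller is assumed to have appended to rates before printing):

```python
import time


def displayed_rate(rates: list[float], scans_per_read: int) -> float:
    """Scans/second over the sliding window, safe against zero elapsed time."""
    elapsed = time.time() - rates[0]
    if elapsed <= 1e-9:
        return 0.0  # fallback; float("inf") is another defensible choice
    return scans_per_read * len(rates) / elapsed
```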
In `@src/sources/ljm/test.py`:
- Around line 9-11: The test's setUp creates self.mock_ljm but never injects it
into the module under test; replace direct MagicMock creation with a
unittest.mock.patch on main.ljm so calls in main.py use the mock (e.g., create
and start a patcher in setUp and assign self.mock_ljm to the started patch, and
stop the patcher in tearDown). Update setUp/tearDown to use a patcher attribute
(e.g., self.ljm_patcher) that patches 'main.ljm', start it to set self.mock_ljm,
and stop it in tearDown to restore the real dependency.
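A sketch of that setUp/tearDown pair (the TestCase name is illustrative; 'main.ljm' assumes the tests run from the package directory):

```python
import unittest
from unittest import mock

import main  # module under test


class TestLjmStream(unittest.TestCase):
    def setUp(self):
        # Patch ljm as seen by main.py, so main's calls hit the mock.
        self.ljm_patcher = mock.patch("main.ljm")
        self.mock_ljm = self.ljm_patcher.start()

    def tearDown(self):
        # Restore the real dependency so other tests are unaffected.
        self.ljm_patcher.stop()
```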
- Around line 27-41: The test currently never calls the code under test; replace
the hardcoded result assignment in test_callback_reads_data with an invocation
of the actual callback so eStreamRead is exercised: call
ljm_stream_read_callback (from main.py) inside test_callback_reads_data while
the mock object self.mock_ljm is patched/installed so that
self.mock_ljm.eStreamRead is invoked, then assert the returned/side-effect
values and keep the existing
self.mock_ljm.eStreamRead.assert_called_once_with(1) check; reference the test
method test_callback_reads_data and the function ljm_stream_read_callback and
ensure the mock object name self.mock_ljm is used in the call context.
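Building on the patcher above, a sketch of the exercised test; the callback signature and the eStreamRead return shape are assumptions:

```python
def test_callback_reads_data(self):
    # Fake one read: (samples, device backlog, LJM backlog) - shape assumed.
    self.mock_ljm.eStreamRead.return_value = ([0.5, 0.6], 0, 0)
    main.ljm_stream_read_callback(1)  # invoke the real callback with handle 1
    self.mock_ljm.eStreamRead.assert_called_once_with(1)
```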
- Around line 1-4: The test file currently imports the main module but never
uses it; either remove the unused import or exercise the module under test by
calling its entry point or specific functions (e.g., call main.main() or import
and invoke target functions from main) and add assertions—update
src/sources/ljm/test.py to either delete the unused "import main" or replace it
with explicit calls to main.main() (or targeted functions within
main) and corresponding assertions so the tests actually exercise code paths.
- Around line 13-25: The test test_basic_stream_start currently assigns handle =
1 instead of invoking the code that should call ljm APIs; replace that with
patching main.ljm to use self.mock_ljm, call the actual entry under test (e.g.,
main.main() or the specific helper that starts the stream), capture/return the
handle if needed, then assert interactions against self.mock_ljm.openS,
self.mock_ljm.setStreamCallback, and self.mock_ljm.eStreamStart and the returned
handle; ensure you reference and patch main.ljm before calling main.main() (or
the stream-start helper) so the mock methods are invoked during the Act phase.
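A sketch of the Act/Assert phases, assuming main exposes a stream-start helper (or a main() that returns rather than looping forever):

```python
def test_basic_stream_start(self):
    self.mock_ljm.openS.return_value = 1  # pretend the device opened as handle 1
    main.main()  # or the narrower stream-start helper, once factored out
    self.mock_ljm.openS.assert_called_once()
    self.mock_ljm.setStreamCallback.assert_called_once()
    self.mock_ljm.eStreamStart.assert_called_once()
```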
🧹 Nitpick comments (6)
src/sources/ljm/pyproject.toml (1)
5-5: Python version constraint is inconsistent with root project.
The root pyproject.toml specifies requires-python = "~=3.11.0" (Python 3.11.x only), but this package uses >=3.11 (Python 3.11+). This inconsistency could cause dependency resolution issues. Consider aligning with the root constraint:
Proposed fix
```diff
-requires-python = ">=3.11"
+requires-python = "~=3.11.0"
```

src/sources/ljm/calibration.py (2)
7-7: Unused import.
PackException is imported but never used in this module. Remove it to avoid confusion.
Proposed fix

```diff
 from labjack import ljm
-from msgpack.exceptions import PackException
```
118-121: Docstring placement is unconventional.
The class docstring is placed after the class attributes. By convention, docstrings should immediately follow the class declaration.
Proposed fix

```diff
 class Sensor:
+    """
+    Represents a sensor plugged into the LabJack box.
+    Instantiating members of this class sets up the sensors used with the static methods.
+    """
+
     sensors: ClassVar[list[Self]] = []
     name: str
     channel: str
     input_range: float | int
     connection: Connection
     calibration: Calibration
-
-    """
-    Represents a sensor plugged into the LabJack box.
-    Instantiating members of this class sets up the sensors used with the static methods.
-    """
```

src/sources/ljm/main.py (3)
37-41: Module-level side effects hinder testability.
Sensor.print() and Sender() execute at import time, causing side effects (console output, network socket creation) whenever this module is imported. This makes the module difficult to test in isolation. Consider deferring these to main() or guarding them with if __name__ == "__main__":.
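The guarded structure would look roughly like this (Sensor and Sender names taken from the comment above; bodies elided):

```python
def main() -> None:
    # Side effects now happen only when explicitly invoked.
    Sensor.print()
    sender = Sender()
    ...


if __name__ == "__main__":
    main()
```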
131-133: Use collections.deque instead of a list for the sliding window.
rates.pop(0) is O(n) for lists. For a fixed-size sliding window, use deque(maxlen=50), which provides O(1) append and automatic eviction.
Proposed fix

```diff
+from collections import deque
+
 # At module level or in StreamInfo:
-rates = []
+rates: deque[float] = deque(maxlen=50)

 # In the callback:
 rates.append(time.time())
-if len(rates) > 50:
-    rates.pop(0)
+# No need to pop - deque handles it automatically
```
190-191: File I/O in callback may block streaming.
Opening and writing to a file inside the stream callback can introduce latency and potentially cause buffer overruns if the filesystem is slow. Consider buffering writes or using a separate thread for logging.
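One possible shape for the threaded approach, sketched with a sentinel-terminated queue (the log path is illustrative):

```python
import queue
import threading

log_queue: queue.Queue[bytes | None] = queue.Queue()


def log_writer(path: str) -> None:
    # Runs in a background thread so filesystem latency never stalls the callback.
    with open(path, "ab") as log_file:
        while True:
            payload = log_queue.get()
            if payload is None:  # sentinel: shut the writer down
                break
            log_file.write(payload)


threading.Thread(target=log_writer, args=("daq.log",), daemon=True).start()

# Inside the stream callback, the blocking write becomes:
# log_queue.put(msgpack.packb(payload))
```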
ChrisYx511
left a comment
i'm lowkey not convinced that we need to do an async implementation here. What was wrong with just doing while True: ljm.eStreamRead? Also I'm not sure that the eStreamRead goes in the callback when doing async. My suggestion would be to just keep it synchronous unless you have a good justification to make it async. I can foresee message-ordering and various debugging difficulties with an async implementation.
It'd be great if you elaborated!
Yeah I would also switch back to synchronous (am working on it) as problems associated with async appeared. The async approach was chosen at the beginning because I believe it separates the setup vs. reading/processing parts of the codebase well, and from LJM's sample it does the job well enough, and I thought I had more time to deal with async issues. The code is correct based on LJM samples (on whether eStreamRead should be inside the callback), but yeah, it's going back to a synchronous while True: ljm.eStreamRead loop.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@src/sources/ljm/calibration.py`:
- Around line 155-165: The loop over Sensor.sensors directly indexes
CHANNEL_TO_NEGATIVE_CHANNEL which can raise a KeyError for typos or unsupported
channels; update the DIFFERENTIAL branch in the calibration loop to check
membership (e.g., "if sensor.channel not in CHANNEL_TO_NEGATIVE_CHANNEL") or use
CHANNEL_TO_NEGATIVE_CHANNEL.get(sensor.channel) and if missing raise a clear
ValueError mentioning the sensor.channel, then proceed to set negChannelValue
and call ljm.eWriteName as before (references: Sensor.sensors,
Connection.DIFFERENTIAL, CHANNEL_TO_NEGATIVE_CHANNEL, ljm.eWriteName,
negChannelValue).
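A sketch of the guarded branch; the AINx_NEGATIVE_CH register naming is an assumption based on T7 conventions, and Sensor, Connection, and CHANNEL_TO_NEGATIVE_CHANNEL come from the module under review:

```python
from labjack import ljm


def setup_differential_channels(handle: int) -> None:
    for sensor in Sensor.sensors:
        if sensor.connection != Connection.DIFFERENTIAL:
            continue
        neg_channel = CHANNEL_TO_NEGATIVE_CHANNEL.get(sensor.channel)
        if neg_channel is None:
            raise ValueError(
                f"No negative channel mapped for {sensor.channel!r}; "
                "add it to CHANNEL_TO_NEGATIVE_CHANNEL or fix the config"
            )
        ljm.eWriteName(handle, f"{sensor.channel}_NEGATIVE_CH", neg_channel)
```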
- Around line 181-189: The parse function assumes sensor_values aligns with
Sensor.sensors; add an explicit guard at the start of parse to detect length
mismatches and fail fast: if len(sensor_values) != len(Sensor.sensors) raise a
ValueError with a clear message including both lengths, or (alternatively)
iterate only up to min_len = min(len(sensor_values), len(Sensor.sensors)) and
document that behavior; update the loop that uses Sensor.sensors and
sensor_values[i] (and calls sensor.calibration.calibrate) to use the chosen safe
approach so no IndexError can occur.
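A fail-fast sketch of that guard; the accumulation shown is simplified (the real parse also handles interleaved multi-scan data and units):

```python
@staticmethod
def parse(sensor_values: list[float]) -> dict[str, list[float | int]]:
    # Fail fast instead of letting a mismatch surface later as an IndexError.
    if len(sensor_values) != len(Sensor.sensors):
        raise ValueError(
            f"Got {len(sensor_values)} values for {len(Sensor.sensors)} sensors"
        )
    data: dict[str, list[float | int]] = {}
    for sensor, value in zip(Sensor.sensors, sensor_values):
        data.setdefault(sensor.name, []).append(sensor.calibration.calibrate(value))
    return data
```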
🧹 Nitpick comments (2)
src/sources/ljm/pyproject.toml (1)
6-6: Pin labjack-ljm to a tested version to avoid API drift.
The package is currently unbounded and may break at runtime when the upstream package is upgraded. The latest stable version on PyPI is 1.23.0 (released December 2023). For reproducible builds, pin to a specific tested version using either labjack-ljm==1.23.0 for maximum stability, or labjack-ljm>=1.23.0,<2 to allow future compatible updates.
✅ Suggested update

```diff
-dependencies = ["omnibus", "labjack-ljm"]
+dependencies = ["omnibus", "labjack-ljm>=1.23.0,<2"]
```

src/sources/ljm/main.py (1)
45-48: Align the data type with Sensor.parse output.
Sensor.parse returns dict[str, list[float | int]], but the TypedDict currently declares list[float], which can trip type-checkers.
♻️ Suggested type adjustment

```diff
 class DAQ_SEND_MESSAGE_TYPE(TypedDict):
     timestamp: float
-    data: dict[str, list[float]]
+    data: dict[str, list[float | int]]
```
ChrisYx511
left a comment
generally looks good, can merge if you have confidently tested for now!
Testing with the LJM hardware was done today. It works as expected. Will merge this. Unit and integration tests, along with a better config template, will be added in a later PR.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@src/sources/ljm/main.py`:
- Line 87: The variable relative_last_read_time is incorrectly annotated as
float while time.time_ns() returns an int and it's later used in range(), so
change its type annotation from float to int (i.e., update the declaration of
relative_last_read_time to be of type int) and ensure any downstream uses
expecting an int (such as the range(...) call) remain compatible; do not cast
where unnecessary—just make the annotation match the actual int value returned
by time.time_ns().
In `@src/sources/ljm/pyproject.toml`:
- Line 5: Update the Python version constraint in the
src/sources/ljm/pyproject.toml by changing the requires-python value to match
the root project's constraint ("~=3.11.0") so both pyproject files use the exact
same supported Python range; modify the requires-python entry in this file (the
requires-python key) to the root project's "~=3.11.0" string to ensure
consistent dependency resolution.
🧹 Nitpick comments (1)
src/sources/ljm/calibration.py (1)
118-121: Class docstring is misplaced.
The docstring is positioned after the class attribute declarations rather than immediately after class Sensor:. Python will not recognize this as the class's __doc__ attribute, so Sensor.__doc__ will be None.
Suggested fix: Move docstring to correct position

```diff
 class Sensor:
+    """
+    Represents a sensor plugged into the LabJack box.
+    Instantiating members of this class sets up the sensors used with the static methods.
+    """
+
     sensors: ClassVar[list[Self]] = []
     name: str
     channel: str
     input_range: float | int
     connection: Connection
     calibration: Calibration

-    """
-    Represents a sensor plugged into the LabJack box.
-    Instantiating members of this class sets up the sensors used with the static methods.
-    """
-
     def __init__(
```
Signed-off-by: Qian Qian "Qubik" <qubik65536@qubik65536.top>
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@src/sources/ljm/main.py`:
- Around line 45-74: The DAQ_SEND_MESSAGE_TYPE TypedDict narrows data values to
list[float] while calibration.Sensor.parse returns list[float | int]; update the
TypedDict entry DAQ_SEND_MESSAGE_TYPE.data to accept list[float | int] (or a
numeric union type) so the declared message shape matches
calibration.Sensor.parse output and removes type-checker friction when using
parsed calibration values.
…nflict. Signed-off-by: Qian Qian "Qubik" <qubik65536@qubik65536.top>
Description
A new Omnibus source that reads from the new LabJack T7 DAQ box is added. The new source is designed to be a drop-in replacement for the NI source. The message format for Omnibus and the logging behavior are kept as-is, and minimal changes were made to the sensor configuration to adapt to the new LJM system.
This issue is related to:
Developer Testing
Here's what I did to test my changes:
Reviewer Testing
Here's what you should do to quickly validate my changes:
This change is
Summary by CodeRabbit
New Features
Tests
Chores