Device is an `OAK camera <https://docs.luxonis.com/projects/hardware/en/latest/>`__ or a RAE robot. On all of our devices there's a powerful Robotics Vision Core
(`RVC <https://docs.luxonis.com/projects/hardware/en/latest/pages/rvc/rvc2.html#rvc2>`__). The RVC is optimized for performing AI inference, CV operations, and
for processing sensory inputs (eg. stereo depth, video encoders, etc.).

Device API
##########

If the device is on a different subnet, you can specify the device (either with MxID, IP, or USB port name) you want to connect to:

.. code-block:: python

    import depthai

    pipeline = depthai.Pipeline()  # Pipeline definition (nodes, links) omitted here

    # Example value; an MxID or USB port name can be passed instead of an IP address
    device_info = depthai.DeviceInfo("169.254.1.222")

    with depthai.Device(pipeline, device_info) as device:
        # ...
        pass

Host clock syncing
==================

When the depthai library connects to a device, it automatically syncs the device's timestamp to the host's timestamp. Timestamp syncing runs continuously, at roughly 5 second intervals, and can be configured via the API (example script below).

To measure the accuracy of the device clock with respect to the host clock, we connected 3 devices (OAK PoE cameras), all hardware-synchronized using an `FSYNC Y-adapter <https://docs.luxonis.com/projects/hardware/en/latest/pages/FSYNC_Yadapter/>`__. A Raspberry Pi (the host) had an interrupt pin connected to the FSYNC line, so at the start of each frame the interrupt fired and the host clock was recorded. We then compared the frame (synced) timestamps with the host timestamps and computed the standard deviation; the resulting histogram was collected over approximately 3 hours.

.. code-block:: python

    # Configure host clock syncing example
    import depthai as dai
    from datetime import timedelta

    pipeline = dai.Pipeline()  # Configure the pipeline (nodes, links) here

    with dai.Device(pipeline) as device:
        # 1st value: Interval between timesync runs
        # 2nd value: Number of timesync samples per run which are used to compute a better value
        # 3rd value: If true, partial timesync requests will be performed at random intervals, otherwise at fixed intervals
        device.setTimesync(timedelta(seconds=5), 10, True)  # (These are default values)

Multiple devices
################

If you want to use multiple devices on a host, check the :ref:`Multiple DepthAI per host` documentation.

Device queues
#############
After initializing the device, you can create input/output queues that match :ref:`XLinkIn`/:ref:`XLinkOut` nodes in the pipeline. These queues will be located on the host computer (in RAM).

When obtaining an output queue (example code below), the :code:`maxSize` and :code:`blocking` arguments should be set depending on how the messages are intended to be used, where :code:`name` is the name of the output stream.

Output queue - `maxSize` and `blocking`
#######################################

When the host is reading from the queue very fast (eg. inside a `while True` loop), the queue, regardless of its size, will stay empty most of the time. But as we add work on the host side (additional processing, analysis, etc.), the device may start pushing messages to the queue faster than the host can read them. Messages will then start to accumulate in the queue, and both the `maxSize` and `blocking` flags determine how the queue behaves in this case. Two common configurations are:

.. code-block:: python

    # If you only want the latest message and don't mind dropping some (eg. frames for visualization),
    # use a small non-blocking queue - when it fills up, the oldest messages get dropped
    q1 = device.getOutputQueue(name="name1", maxSize=4, blocking=False)

    # If you care about every single message (eg. H264/H265 encoded video; if you miss a frame, you will get artifacts);
    # if the queue is full, the device will wait until the host reads a message from the queue
    q2 = device.getOutputQueue(name="name2", maxSize=30, blocking=True)  # Also default values (maxSize=30/blocking=True)

We used `maxSize=30` just as an example; it can be any `int16` number. Since device queues are on the host computer, memory (RAM) usually isn't that scarce, so `maxSize` doesn't matter that much. But if you are using a small SBC like the RPi Zero (512 MB of RAM) and are streaming large frames (eg. 4K unencoded), you could quickly run out of memory if you set `maxSize` to a high value (and don't read from the queue fast enough).

Some additional information
---------------------------
- Queues are thread-safe - they can be accessed from any thread (see the sketch after this list).
- Queues are created such that each queue is its own thread which takes care of receiving, serializing/deserializing, and sending the messages forward (same for input/output queues).
- The :code:`Device` object isn't fully thread-safe. Some RPC calls (eg. :code:`getLogLevel`, :code:`setLogLevel`, :code:`getDdrMemoryUsage`) will become thread-safe once a mutex is put in place (right now there could be races).
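
Since queues are thread-safe, a worker thread can, for example, consume messages while the main thread stays free for other work. The snippet below is only an illustrative sketch (not taken from the docs above); it assumes a pipeline that contains an :ref:`XLinkOut` with the stream name `"name1"`:

.. code-block:: python

    import threading
    import time
    import depthai as dai

    pipeline = dai.Pipeline()
    # ... define nodes here, including an XLinkOut with stream name "name1" ...

    def consume(queue: dai.DataOutputQueue):
        # Runs in a worker thread; queue access is thread-safe
        while True:
            msg = queue.get()  # Blocking read of the next message
            # ... process msg ...

    with dai.Device(pipeline) as device:
        q = device.getOutputQueue(name="name1", maxSize=4, blocking=False)
        threading.Thread(target=consume, args=(q,), daemon=True).start()

        while True:
            # Main thread stays free, eg. for RPC calls like device.getDdrMemoryUsage()
            time.sleep(1)
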
Watchdog
########
The watchdog is a crucial component in the operation of POE (Power over Ethernet) devices with DepthAI. When DepthAI disconnects from a POE device, the watchdog mechanism is the first to respond,
initiating a reset of the camera. This reset is followed by a complete system reboot, which includes the loading of the DepthAI bootloader and the initialization of the entire networking stack.

The watchdog process is necessary to make the camera available for reconnection and **typically takes about 10 seconds**, which means the fastest possible reconnection time is 10 seconds.
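
As an illustration (a sketch, not an official snippet), a host application for a PoE camera can simply catch the connection error and retry, since the device typically becomes available again roughly 10 seconds after the watchdog kicks in. The sketch assumes a pipeline with an :ref:`XLinkOut` stream named `"out"`:

.. code-block:: python

    import time
    import depthai as dai

    pipeline = dai.Pipeline()
    # ... define nodes here, including an XLinkOut with stream name "out" ...

    while True:
        try:
            with dai.Device(pipeline) as device:
                q = device.getOutputQueue(name="out")
                while True:
                    msg = q.get()  # Raises RuntimeError if the connection to the device is lost
                    # ... process msg ...
        except RuntimeError as err:
            # Device not found / connection lost, eg. while a PoE camera is still
            # rebooting after a watchdog reset
            print(f"Device unavailable ({err}), retrying in 10 seconds...")
            time.sleep(10)
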
Customizing the Watchdog Timeout
--------------------------------
.. tabs::

    .. tab:: **Linux/MacOS**

        Set the environment variables `DEPTHAI_WATCHDOG_INITIAL_DELAY` and `DEPTHAI_BOOTUP_TIMEOUT` to your desired timeout values (in milliseconds) as follows:
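
        A rough sketch (the values below are arbitrary, and depthai is assumed to read these variables when the device object is created; exporting them in the shell before launching the script works as well):

        .. code-block:: python

            import os

            # Arbitrary example values, in milliseconds - adjust to your needs.
            # Set them before the device is created (exporting them in the shell
            # before running the script works as well).
            os.environ["DEPTHAI_WATCHDOG_INITIAL_DELAY"] = "60000"
            os.environ["DEPTHAI_BOOTUP_TIMEOUT"] = "30000"

            import depthai as dai

            pipeline = dai.Pipeline()
            # ... define the pipeline nodes here ...

            with dai.Device(pipeline) as device:
                pass  # ...
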
``docs/source/components/nodes/script.rst``


GPIO 40 drives the FSYNC signal for both 4-lane cameras, and we have used the code below:

.. code-block:: python

    # Fragment of the FSYNC-toggling script; toggleVal, GPIO and MX_PIN are defined earlier in that script
    toggleVal = not toggleVal
    ret = GPIO.write(MX_PIN, toggleVal)  # Toggle the GPIO

Time synchronization
####################

The Script node has access to both the device (internal) clock and the synchronized host clock. The host clock is synchronized with the device clock to below 2.5 ms precision at 1σ, :ref:`more information here <Host clock syncing>`.

.. code-block:: python

    import time

    interval = 60
    ctrl = CameraControl()
    ctrl.setCaptureStill(True)
    previous = 0
    while True:
        time.sleep(0.001)

        tnow_full = Clock.nowHost()  # Synced clock with host
        # Clock.now() -> internal/device clock
        # Clock.offsetToHost() -> Offset between internal/device clock and host clock

        now = tnow_full.seconds
        if now % interval == 0 and now != previous:
            previous = now
            node.warn(f'{tnow_full}')
            node.io['out'].send(ctrl)

The DepthAI UVC (`USB Video Class <https://en.wikipedia.org/wiki/USB_video_device_class>`__) node allows OAK devices to function as standard webcams. This feature is particularly useful for integrating OAK devices into applications that require video input, such as video conferencing tools or custom video processing applications.

What is UVC?
############
UVC refers to the USB Video Class standard, which is a USB device class that describes devices capable of streaming video. This standard allows video devices to interface with computers and other devices without needing specific drivers, making them immediately compatible with a wide range of systems and software.

How Does the UVC Node Work?
###########################
The UVC node in DepthAI leverages this standard to stream video from OAK devices. When the UVC node is enabled, the OAK device is recognized as a standard webcam by the host system. This allows the device to be used in any application that supports webcam input, such as Zoom, Skype, or custom video processing software.

The UVC node streams video data over a USB connection. It is important to use a USB3 cable for this purpose, as USB2 may not provide the necessary bandwidth for stable video streaming.

.. note::
    The UVC node can currently handle NV12 video streams from OAK devices. For streams in other formats, conversion to NV12 is necessary, which can be achieved using the :ref:`ImageManip` node. It's important to note that streams incompatible with NV12 conversion, like depth streams, are not supported by the UVC node.
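
Below is a minimal sketch of what a UVC pipeline could look like (assuming a recent depthai version where `dai.node.UVC` is available); the official, pre-written examples referenced below show the full configuration:

.. code-block:: python

    import time
    import depthai as dai

    pipeline = dai.Pipeline()

    # The color camera's 'video' output is NV12, which the UVC node accepts directly
    cam_rgb = pipeline.create(dai.node.ColorCamera)
    cam_rgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)

    uvc = pipeline.create(dai.node.UVC)
    cam_rgb.video.link(uvc.input)

    with dai.Device(pipeline) as device:
        print("UVC stream running - the OAK device should now show up as a webcam")
        while True:
            time.sleep(1)  # Keep the pipeline alive
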
Examples of UVC Node Usage
##########################
1. **DepthAI Demo Script**: The DepthAI demo script includes a UVC application that can be run to enable the UVC node on an OAK device.

   .. code-block:: bash

      python3 depthai_demo.py --app uvc

2. **Custom Python Script**: A custom Python script can be written to enable the UVC node and configure the video stream parameters. Here are some pre-written examples:

   - :ref:`UVC & Color Camera`
   - :ref:`UVC & Mono Camera`
   - :ref:`UVC & Disparity`

3. **OBS Forwarding**: For applications where direct UVC node usage is not possible, OBS Studio can be used to forward the UVC stream.