Simple example of using a MIPI-CSI(2) Camera (like the Raspberry Pi Version 2 camera) with NVIDIA Jetson Developer Kits that have CSI camera ports, including the Jetson Nano and Jetson Xavier NX. This is support code for the article on JetsonHacks: https://wp.me/p7ZgI9-19v
For the Jetson Nano and Xavier NX, the camera should be installed in the MIPI-CSI Camera Connector on the carrier board. The pins on the camera ribbon should face the Jetson module; the tape stripe faces outward.
Some Jetson developer kits have two CSI camera slots. You can use the sensor_id attribute with the GStreamer nvarguscamerasrc element to specify which camera. Valid values are 0 or 1 (the default is 0 if not specified).
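For example, a minimal preview pipeline selecting the first camera might look like this (a sketch; the display sink available can vary slightly across JetPack releases). When it runs, nvarguscamerasrc also prints the sensor modes it detects (the GST_ARGUS lines shown below) to the terminal:

```
$ gst-launch-1.0 nvarguscamerasrc sensor_id=0 ! \
    'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! \
    nvvidconv ! nvegltransform ! nveglglessink -e
```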
Note: The cameras may report differently than shown below. You can use the simple gst-launch example above to determine the camera modes that are reported by the sensor you are using.
```
GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;
```
Also, the display transform may be sensitive to width and height (in the above example, width=960, height=540). If you experience issues, check that your display width and height are the same ratio as the camera frame size selected (in the above example, 960x540 is 1/4 the size of 1920x1080).
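For reference, the Python samples build their GStreamer pipeline from these capture and display sizes. A sketch of that kind of helper is shown below; the parameter names are illustrative and not necessarily the exact code in the scripts:

```
def gstreamer_pipeline(
    sensor_id=0,
    capture_width=1920,
    capture_height=1080,
    display_width=960,
    display_height=540,
    framerate=30,
    flip_method=0,
):
    # Capture from the CSI camera into NVMM memory, scale/flip with nvvidconv,
    # then hand BGR frames to OpenCV through an appsink.
    return (
        f"nvarguscamerasrc sensor_id={sensor_id} ! "
        f"video/x-raw(memory:NVMM), width=(int){capture_width}, height=(int){capture_height}, "
        f"framerate=(fraction){framerate}/1 ! "
        f"nvvidconv flip-method={flip_method} ! "
        f"video/x-raw, width=(int){display_width}, height=(int){display_height}, format=(string)BGRx ! "
        f"videoconvert ! video/x-raw, format=(string)BGR ! appsink"
    )
```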
Note: You may need to install numpy for the Python examples to work, e.g. `$ pip3 install numpy`
## Samples
### simple_camera.py
simple_camera.py is a Python script which reads from the camera and displays the frame to a window on the screen using OpenCV:
```
$ python simple_camera.py
```
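The core of the script is roughly the following (a minimal sketch, not the full program, reusing the gstreamer_pipeline() helper sketched above):

```
import cv2

def show_camera():
    # Open the CSI camera through the GStreamer pipeline and display frames
    # in an OpenCV window until ESC is pressed.
    video_capture = cv2.VideoCapture(gstreamer_pipeline(flip_method=0), cv2.CAP_GSTREAMER)
    if not video_capture.isOpened():
        raise RuntimeError("Unable to open CSI camera")
    try:
        while True:
            ret, frame = video_capture.read()
            if not ret:
                break
            cv2.imshow("CSI Camera", frame)
            if cv2.waitKey(10) & 0xFF == 27:  # ESC quits
                break
    finally:
        video_capture.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    show_camera()
```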
### face_detect.py
face_detect.py is a Python script which reads from the camera and uses Haar Cascades to detect faces and eyes:
```
$ python face_detect.py
```
Haar Cascades are a machine learning based approach in which a cascade function is trained from a large number of positive and negative images. The trained function is then used to detect objects in other images.
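In OpenCV this boils down to loading the pre-trained cascade XML files and calling detectMultiScale() on each grayscale frame. A sketch follows; the cascade directory is an assumption for a typical OpenCV 4 install and may differ on your system:

```
import cv2

# Assumed location of the pre-trained cascades on a typical OpenCV 4 install.
CASCADE_DIR = "/usr/share/opencv4/haarcascades/"
face_cascade = cv2.CascadeClassifier(CASCADE_DIR + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(CASCADE_DIR + "haarcascade_eye.xml")

def detect_faces_and_eyes(frame):
    # Cascades work on grayscale images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
        # Look for eyes only inside each detected face region.
        roi_gray = gray[y:y + h, x:x + w]
        roi_color = frame[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi_gray):
            cv2.rectangle(roi_color, (ex, ey), (ex + ew, ey + eh), (0, 255, 0), 2)
    return frame
```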
### dual_camera.py
dual_camera.py is for the newer Jetson boards with two CSI-MIPI camera ports (such as the Jetson Nano B01 and the Jetson Xavier NX). This is a simple Python program which reads both CSI cameras and displays them in one window. The window is 1920x540. For performance, the script uses a separate thread for reading each camera image stream. To run the script:
```
$ python3 dual_camera.py
```
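The per-camera reader can be summarized like this (a sketch of the threading idea; the class and attribute names are illustrative, not necessarily those used in the script):

```
import threading
import cv2

class CSICamera:
    # Read one CSI camera on its own thread, always keeping the latest frame
    # so the display loop never blocks on a slow read.
    def __init__(self, pipeline_string):
        self.video_capture = cv2.VideoCapture(pipeline_string, cv2.CAP_GSTREAMER)
        self.frame = None
        self.running = False
        self.thread = None
        self.lock = threading.Lock()

    def start(self):
        self.running = True
        self.thread = threading.Thread(target=self._update, daemon=True)
        self.thread.start()
        return self

    def _update(self):
        while self.running:
            ret, frame = self.video_capture.read()
            if ret:
                with self.lock:
                    self.frame = frame

    def read(self):
        with self.lock:
            return None if self.frame is None else self.frame.copy()

    def stop(self):
        self.running = False
        if self.thread is not None:
            self.thread.join()
        self.video_capture.release()
```

The display loop then reads the latest frame from each camera and stacks the two 960x540 images side by side (for example with numpy.hstack) to fill the single 1920x540 window.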
The directory 'instrumented' contains instrumented code which can help adjust performance and frame rates.
### simple_camera.cpp
simple_camera.cpp is a simple C++ program which reads from the camera and displays the frames in a window on the screen using OpenCV.
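To build and run it, something like the following should work (a sketch; the exact flags depend on how OpenCV is installed, and the pkg-config package may be named opencv rather than opencv4 on your system):

```
$ g++ -std=c++11 simple_camera.cpp $(pkg-config --cflags --libs opencv4) -o simple_camera
$ ./simple_camera
```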