**docs/algorithms/demonstrations.md** (4 additions, 2 deletions)

## Collecting Human Demonstrations

We provide teleoperation utilities that allow users to control the robots with input devices, such as the keyboard, [SpaceMouse](https://www.3dconnexion.com/spacemouse_compact/en/), [DualSense](https://www.playstation.com/en-us/accessories/dualsense-wireless-controller/), and mujoco-gui. Such functionality allows us to collect a dataset of human demonstrations for learning. We provide an example script to illustrate how to collect demonstrations. Our [collect_human_demonstrations](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/scripts/collect_human_demonstrations.py) script takes the following arguments:

- `directory:` path to a folder where the pickle file of collected demonstrations is stored
- `environment:` name of the environment you would like to collect the demonstrations for
- `device:` either "keyboard", "spacemouse", "dualsense", or "mjgui"
- `renderer:` MuJoCo's built-in interactive viewer (mjviewer) or the OpenCV viewer (mujoco)
- `camera:` pass multiple camera names to enable multiple views. Note that the "mujoco" renderer must be used for multiple views; "mjviewer" does not support them.

See the [devices page](https://robosuite.ai/docs/modules/devices.html) for details on how to use the devices.
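
Putting the arguments together, a typical invocation might look like the following. This is a sketch: the exact flag spellings, the `Lift` environment, the output path, and the camera names are assumptions based on the argument list above, not taken from the script itself.

```shell
# Hypothetical example: collect demonstrations in the Lift environment
# using a DualSense controller and the OpenCV ("mujoco") renderer with two views.
python robosuite/scripts/collect_human_demonstrations.py \
    --directory /tmp/robosuite_demos \
    --environment Lift \
    --device dualsense \
    --renderer mujoco \
    --camera agentview frontview
```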

**docs/demos.md** (7 additions, 1 deletion)

The `demo_device_control.py` script shows how to teleoperate a robot with [contro…

This current implementation only supports macOS (Linux support can be added).
Download and install the [driver](https://www.3dconnexion.com/service/drivers.html) before running the script.

* **DualSense**

We use the [DualSense](https://www.playstation.com/en-us/accessories/dualsense-wireless-controller/) wireless controller to control the end-effector of the robot. The joystick provides 6-DoF control commands.

**Note:**
Make sure `hidapi` can detect your DualSense on your computer. On Linux, you may need to add udev rules under `/etc/udev/rules.d` to access the device without root privileges; for the rule contents you can refer to [game-device-udev](https://codeberg.org/fabiscafe/game-devices-udev).

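A quick way to verify that `hidapi` sees the controller is to enumerate HID devices and look for the DualSense IDs. This is a sketch, not part of robosuite: the vendor/product IDs (`0x054C`/`0x0CE6`) and the use of the `hid` Python package (`pip install hidapi`) are assumptions you should verify against your own device.

```python
# Sketch: check whether hidapi can enumerate a DualSense controller.
# SONY_VENDOR_ID / DUALSENSE_PRODUCT_ID are assumed values; verify for your device.
SONY_VENDOR_ID = 0x054C
DUALSENSE_PRODUCT_ID = 0x0CE6

def find_dualsense(devices):
    """Return the first device-info dict matching the DualSense IDs, or None.

    `devices` is a list of dicts as returned by hid.enumerate().
    """
    for info in devices:
        if (info.get("vendor_id") == SONY_VENDOR_ID
                and info.get("product_id") == DUALSENSE_PRODUCT_ID):
            return info
    return None

if __name__ == "__main__":
    try:
        import hid  # pip install hidapi
        match = find_dualsense(hid.enumerate())
        print("DualSense detected" if match else "No DualSense found; check udev rules")
    except ImportError:
        print("hidapi is not installed")
```

If no device shows up on Linux, the udev rules mentioned above are the usual fix.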
* **Mujoco GUI**
The Mujoco GUI provides a graphical user interface for viewing and interacting with a mujoco simulation. We use the GUI and a mouse to drag and drop mocap bodies, whose poses are tracked by a controller. More specifically, once the mujoco GUI is loaded from running `python demo_device_control.py`, you first need to hit the <Tab> key to reach the interactive mujoco viewer state. Then, you should double click on …
The `demo_renderers.py` script shows how to use different renderers with the sim…

```sh
$ python demo_renderers.py --renderer default
```

The `--renderer` flag can be set to `mujoco` or `default`.

### Exporting to USD
Exporting to USD allows users to render **robosuite** trajectories in external renderers such as NVIDIA Omniverse and Blender. In order to export to USD you must install the required dependencies for the exporter.

**docs/installation.md** (1 addition, 1 deletion)

The base installation requires the MuJoCo physics engine (with [mujoco](https://…

This will also install our library as an editable package, such that local changes will be reflected elsewhere without having to reinstall the package.

3. (Optional) We also provide add-on functionalities, such as [OpenAI Gym](https://github.com/openai/gym) [interfaces](source/robosuite.wrappers), [inverse kinematics controllers](source/robosuite.controllers) powered by [PyBullet](http://bulletphysics.org), and [teleoperation](source/robosuite.devices) with [SpaceMouse](https://www.3dconnexion.com/products/spacemouse.html) and [DualSense](https://www.playstation.com/en-us/accessories/dualsense-wireless-controller/) devices. To enable these additional features, please install the extra dependencies by running …

**docs/modules/devices.md** (26 additions, 5 deletions)

# I/O Devices

Devices are used to read user input and teleoperate simulated robots in real time. This can be done with a keyboard, a [SpaceMouse](https://www.3dconnexion.com/spacemouse_compact/en/), or a [DualSense](https://www.playstation.com/en-us/accessories/dualsense-wireless-controller/) joystick; these teleoperation capabilities are demonstrated in the [demo_device_control.py](../demos.html#teleoperation) script. More generally, we support any interface that implements the [Device](../simulation/device) abstract base class. To support your own custom device, simply subclass this base class and implement the required methods.
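
The subclassing pattern can be sketched as follows. This is a self-contained illustration, not robosuite's actual `Device` class: the method names `start_control` and `get_controller_state` and the shape of the returned command dict are assumptions for illustration only.

```python
from abc import ABC, abstractmethod

class Device(ABC):
    """Stand-in for the Device abstract base class (method names are assumed)."""

    @abstractmethod
    def start_control(self):
        """Prepare the device to stream commands (e.g. reset internal state)."""

    @abstractmethod
    def get_controller_state(self):
        """Return the latest teleoperation command as a dict."""

class NullDevice(Device):
    """A hypothetical custom device that always commands zero motion."""

    def start_control(self):
        self._active = True

    def get_controller_state(self):
        # 3-DoF translation, 3-DoF rotation, and a binary grasp signal.
        return {"dpos": [0.0, 0.0, 0.0], "rotation": [0.0, 0.0, 0.0], "grasp": 0}
```

A real device would poll its hardware inside `get_controller_state` instead of returning constants.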
## Keyboard
Note that the rendering window must be active for these commands to work.

| o-p | rotate (yaw) |
| y-h | rotate (pitch) |
| e-r | rotate (roll) |
| b | toggle arm/base mode (if applicable) |
| s | switch active arm (if multi-armed robot) |
| = | switch active robot (if multi-robot env) |
| Ctrl+C | quit |

## 3Dconnexion SpaceMouse
We support the use of a [SpaceMouse](https://www.3dconnexion.com/spacemouse_compact/en/) as well.

| Move mouse laterally | move arm horizontally in x-y plane |
| Move mouse vertically | move arm vertically |
| Twist mouse about an axis | rotate arm about a corresponding axis |
| b | toggle arm/base mode (if applicable) |
| s | switch active arm (if multi-armed robot) |
| = | switch active robot (if multi-robot environment) |
| Ctrl+C (keyboard) | quit |

## Sony DualSense

We support the use of a [Sony DualSense](https://www.playstation.com/en-us/accessories/dualsense-wireless-controller/) as well.

**docs/overview.md** (1 addition, 1 deletion)

This framework was originally developed in late 2017 by researchers in [Stanford…

* **standardized tasks**: a set of standardized manipulation tasks of large diversity and varying complexity and RL benchmarking results for reproducible research;
* **procedural generation**: modular APIs for programmatically creating new environments and new tasks as combinations of robot models, arenas, and parameterized 3D objects. Check out our repo [robosuite_models](https://github.com/ARISE-Initiative/robosuite_models) for extra robot models tailored to robosuite.
* **robot controllers**: a selection of controller types to command the robots, such as joint-space velocity control, inverse kinematics control, operational space control, and whole body control;
* **teleoperation devices**: a selection of teleoperation devices including keyboard, spacemouse, dualsense, and MuJoCo viewer drag-drop;
* **multi-modal sensors**: heterogeneous types of sensory signals, including low-level physical states, RGB cameras, depth maps, and proprioception;
* **human demonstrations**: utilities for collecting human demonstrations, replaying demonstration datasets, and leveraging demonstration data for learning. Check out our sister project [robomimic](https://arise-initiative.github.io/robomimic-web/);
* **photorealistic rendering**: integration with advanced graphics tools that provide real-time photorealistic renderings of simulated scenes, including support for NVIDIA Isaac Sim rendering.

**docs/simulation/device.rst** (13 additions, 1 deletion)

@@ -1,7 +1,7 @@
1
1
Device
2
2
======
3
3
4
-
Devices allow for direct real-time interfacing with the MuJoCo simulation. The currently supported devices are ``Keyboard``. ``SpaceMouse`` and ``MjGUI``.
4
+
Devices allow for direct real-time interfacing with the MuJoCo simulation. The currently supported devices are ``Keyboard``. ``SpaceMouse``. ``DualSense`` and ``MjGUI``.