
Commit 972bfde

Rover f25 quests (#38)
* Move s25 rover to archive
* Upload F24 tasks for rover
* Update point hierarchy
1 parent 87bc8b9 commit 972bfde

2 files changed: +104 −0 lines changed

pages/quest_books/s25_rover_quests.mdx renamed to pages/quest_books/archive/s25_rover_quests.mdx

Lines changed: 19 additions & 0 deletions
@@ -21,6 +21,9 @@ The objectives for Spring 2025 focus on building upon [Winter 2025](https://wiki
6. **Write three blogs on simulating the rover**
   - We are a relatively new team and hope to improve our outreach and provide documentation for future members by writing blog posts throughout the term showcasing our progress.

+
+
+
### Term Objectives and Scoring

1. **Simulate depth camera data**
@@ -35,6 +38,9 @@ The objectives for Spring 2025 focus on building upon [Winter 2025](https://wiki

**Minimum Requirements:** One or two depth cameras are generating but not publishing data to Nav2 for a score of 3/10.

+Score: 10/10
+Reflection: We are building our costmap without Nav2 now. The cameras publish point clouds in the form of [PointCloud2](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html) messages.
+
2. **Generate costmap from depth camera data**

| Score | Criteria |
@@ -47,6 +53,8 @@ The objectives for Spring 2025 focus on building upon [Winter 2025](https://wiki

**Minimum Requirements:** Depth cameras are able to simulate an inaccurate costmap for a score of 5/10.

+Score: 5/10
+
3. **Rover localization**

| Score | Criteria |
@@ -58,6 +66,7 @@ The objectives for Spring 2025 focus on building upon [Winter 2025](https://wiki

**Minimum Requirements:** Accurate localization is achieved and tested using simulated data for a score of 7/10.

+Score: 0/10. Priorities shifted to the Rover quests, and unfortunately we did not get to this.

4. **Compute and publish goal pose of detected object**

@@ -69,6 +78,7 @@ The objectives for Spring 2025 focus on building upon [Winter 2025](https://wiki

**Minimum Requirements:** Rover is able to compute and publish an approximate pose for a score above 7/10.

+Score: 7/10. Object detection successfully integrated, but there are some latency and detection hiccups.

5. **Autonomously navigate to chosen point**

@@ -81,6 +91,8 @@ The objectives for Spring 2025 focus on building upon [Winter 2025](https://wiki

**Minimum Requirements:** Rover is able to inefficiently navigate to a point for a score above 7/10.

+Score: 0/10. Dependent on the planner module, which was not completed.
+
6. **Blogs**

| Score | Criteria |
@@ -93,6 +105,13 @@ The objectives for Spring 2025 focus on building upon [Winter 2025](https://wiki

**Minimum Requirements:** Three blogs have been written on simulating software for the rover for a score above 15/20.

+Score: 0/10. We focused too heavily on development and did not make time to blog about our progress.
+
+## Other accomplishments
+- Terrain simulation (boulders) in Gazebo
+- USGS lidar data/elevation map
+- Object detection
+
## Scoring Template

| Quest Name | Description | Due Date | Score |
Lines changed: 85 additions & 0 deletions
@@ -0,0 +1,85 @@
# WATO Rover F24 Roadmap

## Costmap Completion

- **BIG TASK:** Utilize the depth cameras' point cloud data to populate a costmap.
- **Note:** To make things simple, let us first accomplish the task of populating a "local" costmap centered on the rover's body. This would be an NxN occupancy grid, similar to the ASD onboarding assignment. Eventually, we will use these local costmaps to populate a global costmap of the entire field.
- **Note:** We utilize two Intel RealSense D435 depth cameras for obstacle avoidance.
- **Note:** The ROS 2 topics that our depth cameras publish point cloud messages to are:
  - `/sim/realsense1/depth/points`
  - `/sim/realsense2/depth/points`
- **Note:** The message type of the point cloud messages we receive on the camera topics is [PointCloud2 ROS 2 messages](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html).
  - It can be hard to extract the position data from these messages, which is why we convert each PointCloud2 message to a `pcl::PointCloud<pcl::PointXYZRGB>` (see the sketch below).

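A minimal sketch of that conversion, assuming the standard `pcl_conversions` helper and a subscription on one of the camera topics above; the node name and the voxel leaf size are illustrative placeholders, not from the repo:

```cpp
#include <pcl/filters/voxel_grid.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/point_cloud2.hpp>

// Illustrative node: subscribe to one camera topic, convert each
// PointCloud2 message into a PCL cloud, and (optionally) downsample it.
class CloudListener : public rclcpp::Node
{
public:
  CloudListener() : Node("cloud_listener")
  {
    sub_ = create_subscription<sensor_msgs::msg::PointCloud2>(
      "/sim/realsense1/depth/points", rclcpp::SensorDataQoS(),
      [this](const sensor_msgs::msg::PointCloud2::SharedPtr msg) {
        // PointCloud2 -> pcl::PointCloud, so p.x / p.y / p.z are
        // directly accessible per point.
        pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud(
          new pcl::PointCloud<pcl::PointXYZRGB>);
        pcl::fromROSMsg(*msg, *cloud);

        // Optional voxel-grid downsampling; the 5 cm leaf size is a
        // placeholder value to tune.
        pcl::VoxelGrid<pcl::PointXYZRGB> voxel;
        voxel.setInputCloud(cloud);
        voxel.setLeafSize(0.05f, 0.05f, 0.05f);
        pcl::PointCloud<pcl::PointXYZRGB> downsampled;
        voxel.filter(downsampled);

        RCLCPP_INFO(get_logger(), "kept %zu of %zu points",
                    downsampled.points.size(), cloud->points.size());
      });
  }

private:
  rclcpp::Subscription<sensor_msgs::msg::PointCloud2>::SharedPtr sub_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<CloudListener>());
  rclcpp::shutdown();
  return 0;
}
```
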
- **TASK:** Add a `config/params.yaml` file for costmap and occupancy grid dimension parameters (and any other parameters), similar to the ASD assignment (see the example after this list).
- **TASK:** Determine if downsampling of the point cloud is necessary. If so, look into and use functions from the PCL library for downsampling (the voxel-grid call in the sketch above is one option).
- **TASK:** Use the **Bresenham line algorithm** to populate the occupancy grid. The idea is to work with the "imaginary" ray cast from the camera origin to the point in 3D space and determine which cells of the occupancy grid this ray passes through. This enables us to determine which cells are free and which are occupied (see the sketch after this list).
- **TASK:** Apply an inflation radius/layer on the occupancy grid.
- **TASK:** Create a "map memory" node that updates a global costmap (though not frequently) using the local costmap data.
- **BIG TASK:** Incorporate the elevation data of the University Rover Competition site into the costmap.
- **Note:** The URC takes place at the [Mars Desert Research Station in Utah](https://www.google.com/maps/dir//38.406422,-110.791921).
- **Note:** The "elevation data" we obtained for the site is in the form of a `.las` file (a standard format for point cloud data) obtained from the [USGS Lidar Explorer](https://apps.nationalmap.gov/lidar-explorer/#/process).
- **TASK:** Download the `.las` file from our repo: [fargate_utah_mdrs.las](https://github.com/WATonomous/wato_rover/blob/master/assets/fargate_utah_mdrs.las).
  - To visualize the point cloud in your browser, go to [cloud.usbim.com](https://cloud.usbim.com/home/workspaces), create an account, and upload the `.las` file.

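A hypothetical `config/params.yaml` for the costmap node; every name and value is an illustrative placeholder, not something that exists in the repo yet:

```yaml
# Hypothetical costmap parameters (placeholders to adapt).
costmap_node:
  ros__parameters:
    grid_size: 200          # N for the NxN occupancy grid, in cells
    resolution: 0.05        # metres per cell
    inflation_radius: 0.3   # metres
    update_rate_hz: 10.0
```

And a minimal sketch of the Bresenham ray-trace, assuming a row-major NxN `std::vector<int8_t>` grid using the `nav_msgs/OccupancyGrid` convention of 0 = free and 100 = occupied; the function and its signature are ours, not from the repo:

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Trace the "imaginary" ray from the camera cell (x0, y0) to the hit
// cell (x1, y1): every cell the ray passes through is free, and the
// endpoint cell holds the obstacle.
void traceRay(std::vector<int8_t> & grid, int n, int x0, int y0, int x1, int y1)
{
  int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
  int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
  int err = dx + dy;

  while (!(x0 == x1 && y0 == y1)) {
    if (x0 >= 0 && x0 < n && y0 >= 0 && y0 < n) {
      grid[y0 * n + x0] = 0;  // free cell along the ray
    }
    int e2 = 2 * err;
    if (e2 >= dy) { err += dy; x0 += sx; }
    if (e2 <= dx) { err += dx; y0 += sy; }
  }
  if (x1 >= 0 && x1 < n && y1 >= 0 && y1 < n) {
    grid[y1 * n + x1] = 100;  // the point itself is an obstacle
  }
}
```

The inflation TASK can then be a second pass over the finished grid that raises the cost of free cells lying within the inflation radius of any occupied cell.
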
- **TASK:** Divide the competition site into regions (e.g. grid squares?) and assign a cost to each region based on the elevation indicated by the point cloud `.las` file (a sketch of this binning follows this list).

- How to process `.las` files?
  - For C++, there is the [libLAS library](https://liblas.org/start.html#overview).
  - For Python, there is [laspy](https://laspy.readthedocs.io/en/latest/).

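For the C++ route, a minimal elevation-binning sketch using libLAS's stream-reader interface (the pattern from the libLAS tutorial); the cell size, grid dimension, and max-elevation statistic are placeholder choices:

```cpp
#include <algorithm>
#include <fstream>
#include <liblas/liblas.hpp>
#include <vector>

int main()
{
  // Open the .las file in binary mode, as the libLAS docs show.
  std::ifstream ifs("fargate_utah_mdrs.las", std::ios::in | std::ios::binary);
  liblas::ReaderFactory factory;
  liblas::Reader reader = factory.CreateWithStream(ifs);

  const liblas::Header & header = reader.GetHeader();
  const double min_x = header.GetMinX(), min_y = header.GetMinY();
  const double cell = 10.0;  // placeholder region size, metres
  const int n = 256;         // placeholder grid dimension

  // Track the highest elevation seen in each grid square.
  std::vector<double> max_z(n * n, header.GetMinZ());
  while (reader.ReadNextPoint()) {
    const liblas::Point & p = reader.GetPoint();
    const int gx = static_cast<int>((p.GetX() - min_x) / cell);
    const int gy = static_cast<int>((p.GetY() - min_y) / cell);
    if (gx >= 0 && gx < n && gy >= 0 && gy < n) {
      max_z[gy * n + gx] = std::max(max_z[gy * n + gx], p.GetZ());
    }
  }
  // A per-region cost could then be derived from max_z, or from the
  // slope between neighbouring regions.
  return 0;
}
```
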
---

## Object Detection Completion

- **Note:** The rover needs to identify mission-critical objects (rubber mallets and water bottles) and navigate toward them. See the autonomous navigation mission from last year's competition guidelines: [URC Requirements](https://urc.marssociety.org/home/requirements-guidelines).
- **BIG TASK:** Fine-tune the YOLOv8 model for better performance on mallets and water bottles (required objects for the competition).
  - Dataset previously used: [Roboflow Mallet Dataset](https://universe.roboflow.com/sc-robotics/mallet-0ga9i/dataset/8).
  - Test the algorithm on videos (rosbags, ROS datasets): [ROS 2 Dataset](https://robotisim.com/ros2-dataset/).

---

## Aside: Rosbags

- What are rosbags? If you've done the ASD onboarding assignment, you know what ROS 2 topics/messages are.
  - A **rosbag** is a file that stores a log of timestamped ROS messages published to a topic. It's like a "recording" of the messages you've published to a topic.
- You can "record" a rosbag using:

```bash
ros2 bag record <topic1> <topic2> ... <topicN>
```

[rosbag2 GitHub](https://github.com/ros2/rosbag2?tab=readme-ov-file)

- The recording can be stopped by manually terminating it (`Ctrl+C`), or by adding options to the record command that limit the duration of the recording or the number of messages. Check the `rosbag2` documentation for more information.
- Why is this "recording" useful? You can "play back" a rosbag later, using:

```bash
ros2 bag play my_bag
```

  This will republish the messages to the same topics, in the same order (because the messages in the rosbag were associated with timestamps).
- This way, we can create datasets of video, point cloud scans, IMU data, etc., and test our algorithms on these prepared datasets.
  - This is much easier than, say, having to test our obstacle avoidance on the actual rover hardware every time. More frictionless testing leads to more iteration and improvement.

---

## Localization

Complete localization using GPS/IMU and odometry, and test it on simulated data (rosbags) and on the rover.

- **TASK:** Get the VectorNav GPS/IMU hardware to publish messages to a ROS 2 topic.
- **TASK:** Create a rosbag of example GPS/IMU data published by the VectorNav.
- **BIG TASK:** Implement rover localization using the GPS data (a coordinate-conversion sketch follows).

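The roadmap doesn't prescribe a localization method, so purely as an illustration: a flat-earth (equirectangular) approximation that turns GPS fixes into local metre offsets around a fixed datum, which is a common first step over a competition-sized site. All names here are hypothetical; the real stack might instead use a geodesy library or the `robot_localization` package.

```cpp
#include <cmath>
#include <cstdio>

struct LocalXY { double x; double y; };  // metres east / north of the datum

// Equirectangular approximation: a degree of longitude shrinks by
// cos(latitude). Adequate over a few kilometres around the datum.
LocalXY gpsToLocal(double lat_deg, double lon_deg,
                   double datum_lat_deg, double datum_lon_deg)
{
  constexpr double kEarthRadius = 6378137.0;  // WGS-84 equatorial, metres
  constexpr double kDegToRad = 3.14159265358979323846 / 180.0;
  const double dlat = (lat_deg - datum_lat_deg) * kDegToRad;
  const double dlon = (lon_deg - datum_lon_deg) * kDegToRad;
  return {kEarthRadius * dlon * std::cos(datum_lat_deg * kDegToRad),
          kEarthRadius * dlat};
}

int main()
{
  // Datum taken from the MDRS coordinates in the roadmap's map link.
  const LocalXY p = gpsToLocal(38.4070, -110.7915, 38.406422, -110.791921);
  std::printf("x = %.2f m east, y = %.2f m north\n", p.x, p.y);
  return 0;
}
```
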
---

## Waypoint Navigation

- Need to navigate autonomously to an arbitrary GPS waypoint.
- Complete the planning and control modules so the rover can autonomously navigate to any waypoint. This is required for navigating to the position of a detected object, as well as navigating to a provided GPS waypoint (another required task for the competition).
- **BIG TASK:** Design the planner module for navigation (an **A\*** planner; see the sketch below).

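A minimal grid A* sketch over the same occupancy-grid representation as the costmap section; the 4-connected moves, unit step cost, and Euclidean heuristic are illustrative choices rather than a settled design:

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdint>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

struct Cell { int x, y; };

// Returns the cell path from start to goal on a row-major n x n grid
// (100 = blocked), or an empty path if the goal is unreachable.
std::vector<Cell> aStar(const std::vector<int8_t> & grid, int n,
                        Cell start, Cell goal)
{
  auto idx = [n](Cell c) { return c.y * n + c.x; };
  auto h = [&](Cell c) {  // Euclidean distance heuristic
    return std::hypot(double(c.x - goal.x), double(c.y - goal.y));
  };

  std::vector<double> g(n * n, 1e18);    // best known cost-to-come
  std::vector<int> parent(n * n, -1);    // backpointers for the path
  using QItem = std::pair<double, int>;  // (f = g + h, cell index)
  std::priority_queue<QItem, std::vector<QItem>, std::greater<QItem>> open;

  g[idx(start)] = 0.0;
  open.push({h(start), idx(start)});

  const std::array<Cell, 4> moves{{{1, 0}, {-1, 0}, {0, 1}, {0, -1}}};
  while (!open.empty()) {
    const auto [f, i] = open.top();
    open.pop();
    const Cell c{i % n, i / n};
    if (c.x == goal.x && c.y == goal.y) { break; }
    if (f > g[i] + h(c) + 1e-9) { continue; }  // stale queue entry
    for (const Cell & m : moves) {
      const Cell nb{c.x + m.x, c.y + m.y};
      if (nb.x < 0 || nb.x >= n || nb.y < 0 || nb.y >= n) { continue; }
      if (grid[idx(nb)] == 100) { continue; }  // blocked cell
      const double ng = g[i] + 1.0;            // unit cost per step
      if (ng < g[idx(nb)]) {
        g[idx(nb)] = ng;
        parent[idx(nb)] = i;
        open.push({ng + h(nb), idx(nb)});
      }
    }
  }

  std::vector<Cell> path;
  if (parent[idx(goal)] < 0 && idx(goal) != idx(start)) {
    return path;  // goal never reached
  }
  for (int i = idx(goal); i != -1; i = parent[i]) {
    path.push_back({i % n, i / n});
  }
  std::reverse(path.begin(), path.end());
  return path;
}
```

The resulting cell path would then be smoothed and handed to the control module to track.
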
---

## Blogging and Outreach

- **TASK:** Write technical blogs explaining how we accomplished our terrain simulation, motor control, object detection, etc. last term.
- Get people excited about robotics programming!
