Commit 5e256eb

Final Repository Cleanup & Tutorials
Final Repository Cleanup
2 parents ec79bcb + 03ca596 commit 5e256eb

14 files changed, +314 −994 lines

.gitmodules

-3
@@ -2,9 +2,6 @@
 [submodule "urdf_files/A1/unitree_ros"]
 	path = urdf_files/A1/unitree_ros
 	url = https://github.com/unitreerobotics/unitree_ros.git
-[submodule "urdf_files/HyQ/hyq-description"]
-	path = urdf_files/HyQ/hyq-description
-	url = https://github.com/iit-DLSLab/hyq-description.git
 [submodule "urdf_files/Go1/unitree_ros"]
 	path = urdf_files/Go1/unitree_ros
 	url = https://github.com/unitreerobotics/unitree_ros.git

README.md

+44 −19

@@ -1,14 +1,14 @@
-# MI-HGNN for contact estimation/classification on various robots
-This repository implements a Morphology-Informed Heterogeneous Graph Neural Network (MI-HGNN) for estimating contact information on the feet of a quadruped robot.
+# MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network
+This repository implements a Morphology-Informed Heterogeneous Graph Neural Network (MI-HGNN) for estimating contact information on the feet of a quadruped robot. For more details, see our publication "[MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception](https://arxiv.org/abs/2409.11146)" and our [project page](https://lunarlab-gatech.github.io/Morphology-Informed-HGNN/).
 
-Additionally, by providing a compatible URDF file, this software can convert a variety of robot structures to graph format for learning with the MI-HGNN. See [#Applying-MI-HGNN-to-your-own-robot](#applying-mi-hgnn-to-your-own-robot) for more information.
+Additionally, it can be applied to a variety of robot structures and datasets, as our software can convert compatible robot URDF files to graph format and provides a template for implementing custom datasets. See [#Applying-MI-HGNN-to-your-own-robot](#applying-mi-hgnn-to-your-own-robot) for more information.
 
 ![Figure 2](paper/website_images/banner_image.png)
 
-For information on our method, see our [project page](https://lunarlab-gatech.github.io/Morphology-Informed-HGNN/) and [paper](https://arxiv.org/abs/2409.11146).
+## Setup
+---
 
-## Installation
+### Installation
 To get started, setup a Conda Python environment with Python=3.11:
 ```
 conda create -n mi-hgnn python=3.11

@@ -22,37 +22,62 @@ pip install .
 
 Note, if you have any issues with setup, refer to `environment_files/README.md` so you can install the exact libraries we used.
 
-## URDF Download
+### URDF Download
 The necessary URDF files are part of git submodules in this repository, so run the following commands to download them:
 ```
 git submodule init
 git submodule update
 ```
 
-## Replicating Paper Experiments
+## Usage
+---
 
-To replicate the experiments referenced in our paper or access our trained model weights, see `paper/README.md`.
+### Replicating Paper Experiments
 
-## Applying MI-HGNN to your own robot
+We provide code for replicating the exact experiments in our paper, along with full model weights for every model referenced in it. See `paper/README.md` for more information.
 
-Although in our paper, we only applied the MI-HGNN on quadruped robots for contact perception, it can also be applied to other multi-body dynamical systems. New URDF files can be added by following the instructions in `urdf_files/README.md`, and our software will automatically convert the URDF into a graph compatible for learning with the MI-HGNN.
+<img src="paper/website_images/figure5.png" alt="Parameter sizes and Ablation study" width="600">
 
-## Editing and Contributing
+### Applying to your Robot/Dataset
 
-Datasets can be found in the `src/mi_hgnn/datasets_py` directory, and model definitions and training code can be found in the `src/mi_hgnn/lightning_py` directory. We encourage you to extend the library for your own applications. Please reference [#Replicating-Paper-Experiments](#replicating-paper-experiments) for examples on how to train and evaluate models with our repository.
+Although our paper's scope was limited to applying the MI-HGNN to quadruped robots for contact perception, it can easily be applied to other multi-body dynamical systems and other tasks/datasets by following the steps below:
 
-After making changes, rebuild the library following the instructions in [#Installation](#installation). To make sure that your changes haven't broken critical functionality, run the test cases found in the `tests` directory.
+<img src="paper/website_images/MI-HGNN Potential Applications.png" alt="MI-HGNN Potential Applications" width="800">
 
-If you'd like to contribute to the repository, write sufficient and necessary test cases for your additions in the `tests` directory, and then open a pull request.
+1. Add new URDF files for your robots by following the instructions in `urdf_files/README.md`. Our software will automatically convert each URDF into a graph compatible with the MI-HGNN.
+2. Incorporate your custom dataset using our `FlexibleDataset` class and the starter `CustomDatasetTemplate.py` file by following the instructions in `src/mi_hgnn/datasets_py/README.md`.
+3. After making your changes, rebuild the library following the instructions in [#Installation](#installation). To make sure that your changes haven't broken critical functionality, run the test cases with the command `python -m unittest discover tests/ -v`.
+4. Using the files in the `research` directory as an example, call our `train_model` and `evaluate_model` functions provided in `src/mi_hgnn/lightning_py/gnnLightning.py` with defined train, validation, and test sequences.
 
-## Citation
+We've designed the library to be easily applicable to a variety of datasets and robots, and have provided a variety of customization options for training, dataset creation, and logging. We're excited to see everything you can do with the MI-HGNN!
+
+### Simulated A1 Dataset
+
+To evaluate the performance of our model on GRF estimation, we generated our own simulated GRF dataset, which we now contribute to the community as well. We recorded proprioceptive sensor data and the corresponding ground-truth GRFs by operating an A1 robot in the [Quad-SDK](https://github.com/lunarlab-gatech/quad_sdk_fork) simulator. In total, our dataset comprises 530,779 synchronized data samples with a variety of frictions, terrains, and speeds. All of the different sequences are outlined in the table below:
+
+<img src="paper/grf_dataset_sequences.png" alt="GRF Dataset Sequences" width="700">
+
+A visualization of the various data collection environments can be seen below.
+
+![Figure 4](paper/website_images/figure4.png)
+
+If you'd like to use this dataset, the recorded sequences can be found on [Dropbox](https://www.dropbox.com/scl/fo/4iz1oobx71qoceu2jenie/AJPggD4yIAFXf5508wBz-hY?rlkey=4miys9ap0iaozgdelntms8lxb&st=0oz7kgyq&dl=0). See `paper/README.md` and Section V-B of our publication for specific details on this dataset and how to use it.
 
+## Other Info
+---
+### Contributing
+
+We encourage you to extend the library for your own applications. If you'd like to contribute to the repository, write sufficient and necessary test cases for your additions in the `tests` directory, and then open a pull request. Reach out to us if you have any questions.
+
+### Citation
 
 If you find our repository or our work useful, please cite the relevant publication:
 
 ```
 @article{butterfield2024mi,
-title={MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception},
+title={{MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception}},
 author={Butterfield, Daniel and Garimella, Sandilya Sai and Cheng, Nai-Jen and Gan, Lu},
 journal={arXiv preprint arXiv:2409.11146},
 year={2024},

@@ -61,6 +86,6 @@ If you find our repository or our work useful, please cite the relevant publicat
 }
 ```
 
-## Contact / Issues
+### Contact / Issues
 
 For any issues with this repository, feel free to open an issue on GitHub. For other inquiries, please contact Daniel Butterfield ([email protected]) or the Lunar Lab (https://sites.gatech.edu/lunarlab/).
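
For step 4 in the updated README above, a minimal sketch of a training/evaluation script is shown below. The `train_model` and `evaluate_model` functions do come from `src/mi_hgnn/lightning_py/gnnLightning.py` as the README states, but every argument shown here is an assumption for illustration only; mirror the scripts in the `research` directory for the actual calls.

```
# Sketch only: train_model/evaluate_model exist in gnnLightning.py per the README,
# but the argument lists below are assumptions, not the real signatures.
from mi_hgnn.lightning_py.gnnLightning import train_model, evaluate_model

# Hypothetical sequence classes built from CustomDatasetTemplate.py; their
# constructor arguments depend on FlexibleDataset and are elided here.
from my_project.customDataset import CustomDataset_sequence1, CustomDataset_sequence2

train_dataset = CustomDataset_sequence1(...)  # placeholder arguments
val_dataset = CustomDataset_sequence2(...)    # placeholder arguments
test_dataset = CustomDataset_sequence2(...)   # placeholder arguments

# Assumed call pattern; follow the research/ scripts for the exact interface.
model_checkpoint = train_model(train_dataset, val_dataset, test_dataset)
evaluate_model(model_checkpoint, test_dataset)
```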

paper/README.md

+11 −1

@@ -99,11 +99,21 @@ Finally, Figure 5 is generated by running the `create_regression_plots.py` file,
 
 ### GRF Quad-SDK Dataset
 
-For this experiment, we used a dataset that we generated ourselves using Quad-SDK and Gazebo. Our modified fork that we used can be found here: [quad-sdk-fork](https://github.com/lunarlab-gatech/quad_sdk_fork). We generated a total of 21 sequences. The following table relates the dataset sequence name (in code) to the corresponding parameters used for that sequence:
+For this experiment, we generated a dataset using Quad-SDK and Gazebo. The dataset consists of synchronized proprioceptive sensor measurements for a simulated A1 robot at a maximum of 500 Hz, including joint angle, joint angular velocity, and joint torque from the 12 joint encoders, base linear acceleration and base angular velocity from the IMU, and the Z-direction GRF for each leg. It also includes the ground-truth robot pose, represented as a translation and a quaternion.
+
+We generated a total of 21 sequences. The following table relates the dataset sequence name (in code) to the corresponding parameters used for that sequence:
 
 ![GRF Dataset Sequences](grf_dataset_sequences.png)
 
 In each sequence, the operator loosely followed the high-level control instructions and timings seen below:
 
 <img src="grf_dataset_planned_control.png" alt="GRF Dataset Planned Control" width="150">
 
+The dataset files can be found on [Dropbox](https://www.dropbox.com/scl/fo/4iz1oobx71qoceu2jenie/AJPggD4yIAFXf5508wBz-hY?rlkey=4miys9ap0iaozgdelntms8lxb&st=0oz7kgyq&dl=0), and the code that we used to generate this dataset can be found here: [quad-sdk-fork](https://github.com/lunarlab-gatech/quad_sdk_fork). To find the file corresponding to a sequence in the table above, note that the files are named with the following convention:
+
+```
+robot_1_a1_<speed>_<terrain>_<friction>_trial1_<date>.bag
+```
+
+The file at `src/mi_hgnn/datasets_py/quadSDKDataset.py` also contains sequence-name-to-file mappings.
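
As an illustration of the naming convention above, a small helper (not part of the repository) that pulls the sequence parameters out of a bag file name; it assumes the speed, terrain, and friction fields contain no underscores, and the example file name in the final comment is made up.

```
import re
from pathlib import Path

# Illustrative only: parse robot_1_a1_<speed>_<terrain>_<friction>_trial1_<date>.bag,
# assuming the speed, terrain, and friction fields contain no underscores.
_BAG_PATTERN = re.compile(
    r"robot_1_a1_(?P<speed>[^_]+)_(?P<terrain>[^_]+)_(?P<friction>[^_]+)_trial1_(?P<date>.+)\.bag"
)

def parse_bag_name(path: str) -> dict:
    """Return the sequence parameters encoded in a Quad-SDK bag file name."""
    match = _BAG_PATTERN.fullmatch(Path(path).name)
    if match is None:
        raise ValueError(f"File name does not follow the expected convention: {path}")
    return match.groupdict()

# Hypothetical example:
# parse_bag_name("robot_1_a1_0p5_flat_0p7_trial1_2024-06-01.bag")
# -> {"speed": "0p5", "terrain": "flat", "friction": "0p7", "date": "2024-06-01"}
```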

paper/website_images/figure4.png

1.07 MB (binary image added)

src/mi_hgnn/datasets_py/CustomDatasetTemplate.py

+118

@@ -0,0 +1,118 @@
from .flexibleDataset import FlexibleDataset
import scipy.io as sio
from pathlib import Path
import numpy as np

class CustomDataset(FlexibleDataset):

    # ========================= DOWNLOADING ==========================
    def get_downloaded_dataset_file_name(self):
        """
        Type the name of the file extension of your dataset sequence files here!
        """
        return "data.<YOUR_EXTENSION_HERE>"

    # ========================= PROCESSING ===========================
    def process(self):
        # Load the path to the downloaded file
        path_to_file = Path(self.root, 'raw', 'data.<YOUR_EXTENSION_HERE>')

        # TODO: Convert into a MATLAB data dictionary format here!
        mat_data = None

        # Make sure to save it at this location
        sio.savemat(Path(self.root, 'processed', 'data.mat'), mat_data)

        # TODO: Get the number of dataset entries in the file
        dataset_entries = None

        # Write a txt file to save the dataset length and first sequence index
        with open(str(Path(self.processed_dir, "info.txt")), "w") as f:
            file_id, loc = self.get_file_id_and_loc()
            f.write(str(dataset_entries) + " " + file_id)

    # ============= DATA SORTING ORDER AND MAPPINGS ==================
    def get_urdf_name_to_dataset_array_index(self) -> dict:
        """
        Implement this function to tell `FlexibleDataset` how
        the data returned by `load_data_at_dataset_seq()` corresponds
        to the joints in the robot URDF file.

        Traditionally a robot only has one base node, so it should get a value
        of 0. Next, type the name of each leg joint in the URDF file, and add
        the index of its value in the corresponding joint arrays returned by
        load_data_at_dataset_seq(). Do the same for the joints in the URDF
        representing a fixed foot, with the indices of their values in the foot
        position and foot velocity arrays.
        """

        return {
            '<URDF_BASE_NODE>': 0,

            '<URDF_JOINT_NODE>': 2,
            '<URDF_JOINT_NODE2>': 0,
            '<URDF_JOINT_NODE3>': 1,

            '<URDF_FOOT_NODE>': 1,
            '<URDF_FOOT_NODE2>': 0,
        }

    # ===================== DATASET PROPERTIES =======================
    def get_expected_urdf_name(self):
        return "<EXPECTED_URDF_NAME_HERE>"

    # ======================== DATA LOADING ==========================
    def load_data_at_dataset_seq(self, seq_num: int):
        """
        When this function is called, the .mat file data saved in process()
        is available at self.mat_data.

        For information on the expected format of these variables, see the
        load_data_at_dataset_seq() function definition in flexibleDataset.py.
        """

        # TODO: Load the data as numpy arrays, and don't forget to incorporate
        # self.history_length to load a history of measurements.
        lin_acc = None
        ang_vel = None
        j_p = None
        j_v = None
        j_T = None
        f_p = None
        f_v = None
        contact_labels = None
        r_p = None
        r_o = None
        timestamps = None
        # Note, if you don't have data for a specific return value, just return None,
        # and `FlexibleDataset` will know not to use it if it is not required.

        return lin_acc, ang_vel, j_p, j_v, j_T, f_p, f_v, contact_labels, r_p, r_o, timestamps

# ================================================================
# ===================== DATASET SEQUENCES ========================
# ================================================================

class CustomDataset_sequence1(CustomDataset):
    """
    To load a dataset sequence from Google, first upload the corresponding file on Google Drive, set "General Access"
    to "Anyone with the link", and then copy the link. Paste the link, and then extract the string between the text of
    '/file/d/' and '/view?usp=sharing'. Take this string, and paste it as the first return argument below.
    """
    def get_file_id_and_loc(self):
        return "<Your_String_Here>", "Google"

class CustomDataset_sequence2(CustomDataset):
    """
    To load a dataset sequence from Dropbox, first you'll need to upload the corresponding file on Dropbox and
    generate a link for viewing. Make sure that access is given to anyone with the link, and that this permission won't
    expire, doesn't require a password, and allows for downloading. Finally, copy and paste the link as the first return
    argument below, but change the last number from 0 to 1 (this tells Dropbox to send the raw file, instead of a webpage).
    """
    def get_file_id_and_loc(self):
        return "<Your_Link_Here>", "Dropbox"

"""
Create classes for each of your sequences...
"""

src/mi_hgnn/datasets_py/README.md

+63
@@ -0,0 +1,63 @@
# Implementing Custom Datasets

We hope that many people use our MI-HGNN on a variety of datasets. We provide the `FlexibleDataset` class, which offers many convenient features and can be inherited for use with custom datasets. Below is a short summary of its features:
- Automatic download of relevant datasets from the Internet (from Google Drive or Dropbox).
- Data sorting to match the order of joint, foot, and base nodes in the robot graph.
- Wrapper for the `RobotGraph` class that generates the graph from the robot URDF file.
- Easy customization with custom history lengths and a normalization parameter.
- Custom get() function returns for training both an MLP and the MI-HGNN.
- Option for easy evaluation on a floating-base dynamics model, though our current implementation is specific to the simulated A1 robot in our paper, meaning changes will be necessary for proper results on your robot.

However, `FlexibleDataset` currently only supports the following input data:
- lin_acc (np.array) - IMU linear acceleration
- ang_vel (np.array) - IMU angular velocity
- j_p (np.array) - Joint positions
- j_v (np.array) - Joint velocities
- j_T (np.array) - Joint torques
- f_p (np.array) - Foot position
- f_v (np.array) - Foot velocity
- labels (np.array) - The dataset labels (either Z-direction GRFs or contact states)
- r_p (np.array) - Robot position (ground truth)
- r_o (np.array) - Robot orientation (ground truth) as a quaternion, in the order (x, y, z, w)
- timestamps (np.array) - Array containing the timestamps of the data

Also note that not all of these are used; which ones are needed depends on the applied model (MLP vs. MI-HGNN vs. floating-base dynamics).

If `FlexibleDataset` supports your input data, you can easily use it by writing a simple dataset class that inherits from `FlexibleDataset`, similar to `LinTzuYaunDataset` or `QuadSDKDataset`. We've provided a template for you in the `CustomDatasetTemplate.py` file, which you can use to start.
## Using the Custom Dataset Template

This section explains how to edit the `CustomDatasetTemplate.py` file for use with your own dataset so you can take advantage of the features of the `FlexibleDataset` class.

First, open the file and rename the class to your liking.

### Adding Dataset Sequences
Next, scroll down to the bottom of the file where it says `DATASET SEQUENCES`. Add every sequence of your dataset as its own class, which will require you to upload the data either to Dropbox or Google Drive. See `CustomDatasetTemplate.py` for details.

This is a clean way to load data, as it allows the user to later combine different sequences as they'd like with the `torch.utils.data.ConcatDataset` class, as sketched below (see `research/train_classification_sample_eff.py` for a full example). Defining these classes also means that training an MI-HGNN model on a different computer doesn't require the user to manually download any datasets, as `FlexibleDataset` will do it for you.

Also, when the files are downloaded, they will be renamed to the value provided by `get_downloaded_dataset_file_name()`. Override this function so that the file extension is correct (`.mat` for a MATLAB file, `.bag` for a ROS bag file, etc.).
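
For illustration, a minimal sketch of combining two sequence classes with `ConcatDataset`; the sequence class names and constructor arguments are placeholders, since the real arguments come from `FlexibleDataset`.

```
from torch.utils.data import ConcatDataset

# Hypothetical sequence classes defined at the bottom of your dataset file.
# Constructor arguments are elided because they depend on FlexibleDataset;
# each sequence downloads and processes itself on first construction.
from my_project.customDataset import CustomDataset_sequence1, CustomDataset_sequence2

seq1 = CustomDataset_sequence1(...)  # placeholder arguments
seq2 = CustomDataset_sequence2(...)  # placeholder arguments

train_dataset = ConcatDataset([seq1, seq2])  # combine sequences for training
```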
### Implementing Data Processing
Now that you can load your dataset files, you need to implement processing. This step should be implemented in `process()`, and should convert the file from whatever format it is currently in into a `.mat` file for fast training speeds. You'll also need to provide code for extracting the number of dataset entries in this sequence, which will be saved into a .txt file for future use.

Implement this function; you can see `quadSDKDataset.py` for an example of converting a ROS bag file into a .mat file.
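
As a rough example, a `process()` implementation might look like the sketch below, assuming the raw file is a CSV with one row per time step. The column layout and key names are made up for illustration; `quadSDKDataset.py` shows a real ROS bag version.

```
import numpy as np
import scipy.io as sio
from pathlib import Path

# Shown standalone here; in practice this is a method of your CustomDataset subclass.
def process(self):
    # Hypothetical example: the raw sequence file is a CSV with one row per time step.
    raw = np.loadtxt(Path(self.root, 'raw', 'data.csv'), delimiter=',')

    # Convert to a MATLAB-style dictionary; these key names and column ranges
    # are assumptions for illustration only.
    mat_data = {
        'j_p': raw[:, 0:12],     # joint positions
        'j_v': raw[:, 12:24],    # joint velocities
        'grf_z': raw[:, 24:28],  # Z-direction GRF labels
    }
    sio.savemat(Path(self.root, 'processed', 'data.mat'), mat_data)

    # Save the dataset length and file id, mirroring CustomDatasetTemplate.py.
    dataset_entries = raw.shape[0]
    with open(str(Path(self.processed_dir, "info.txt")), "w") as f:
        file_id, loc = self.get_file_id_and_loc()
        f.write(str(dataset_entries) + " " + file_id)
```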
### Implementing Data Loading
Now that data is downloaded and processed, you can implement the function for opening the .mat file and extracting the relevant dataset sequence.
This should be done in `load_data_at_dataset_seq()`. The .mat file you saved in the last step will be available at `self.mat_data` for easy access.
Note that this function will also need to use the `self.history_length` parameter to support training with a history of measurements. See `CustomDatasetTemplate.py` for details, and see `LinTzuYaunDataset.py` for a proper implementation of this function.
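
A sketch of the history handling, assuming the arrays saved in `process()` have shape `(num_timesteps, feature_dim)` and that `seq_num` marks the start of the history window; this indexing convention is an assumption, so follow `LinTzuYaunDataset.py` for the real one.

```
import numpy as np

# Shown standalone here; in practice this is a method of your CustomDataset subclass.
def load_data_at_dataset_seq(self, seq_num: int):
    # Assumed layout: each array in self.mat_data is (num_timesteps, feature_dim).
    start = seq_num
    end = seq_num + self.history_length  # window of self.history_length measurements

    j_p = np.array(self.mat_data['j_p'][start:end, :])     # joint positions
    j_v = np.array(self.mat_data['j_v'][start:end, :])     # joint velocities
    labels = np.array(self.mat_data['grf_z'][end - 1, :])  # label at the latest step

    # Values we don't have are returned as None; FlexibleDataset ignores them
    # when the chosen model doesn't require them.
    return None, None, j_p, j_v, None, None, None, labels, None, None, None
```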
### Setting the proper URDF file
Since it's easy for the user to provide the wrong URDF file for a dataset sequence, `FlexibleDataset` checks that the URDF file provided by the user matches what the dataset expects. You can tell `FlexibleDataset` which URDF file should be used with this dataset by going to the URDF file and copying the name found at the top of the file, like pictured below:

```
<robot name="miniCheetah">
```

This name should be pasted into `get_expected_urdf_name()`.
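
For example, if your dataset expects the URDF shown above, the override is simply the following sketch:

```
# Shown standalone here; in practice this is a method of your CustomDataset subclass.
def get_expected_urdf_name(self):
    # Must match the <robot name="..."> attribute at the top of the URDF file.
    return "miniCheetah"
```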
### Facilitating Data Sorting
Finally, the last step is to tell `FlexibleDataset` what order your dataset's data is in. For example, which index in the joint position array corresponds to a specific joint in the URDF file? To do this, implement `get_urdf_name_to_dataset_array_index()`. See `CustomDatasetTemplate.py` for more details.

After doing this, your dataset will work with our current codebase for training MLP and MI-HGNN models! You can now instantiate your dataset and use it like in the examples in the `research` directory. Happy training!
