This is the code repository for the paper Learning Human-Robot Proxemics Models from Experimental Data. This README explains how to learn, infer, and evaluate the Human-Robot Proxemic Models, as well as how to integrate them into robot navigation.
- `mvem`
- Python 3.10.18
- Preprocessing the CongreG8 dataset using our scripts, or preprocessing your custom data.
```shell
python data_utils/data_processing_script.py
```
This script uses two helper scripts: `new_extract_chest_data.py` and `automate_normalized_chest_data.py`.
`new_extract_chest_data.py` extracts chest data (position and rotation) from the human dataset and saves it to the folder `all_chest_data` (~140 MB). `automate_normalized_chest_data.py` calculates the relative position and orientation data and saves it to the folder `all_rel_chest_data` (~755 MB).
Finally, the folder structure should look like this:
```
data_utils/
├── all_chest_data/
└── all_rel_chest_data/
```
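As a rough sketch of what the relative-pose normalization computes, consider expressing one person's chest position in another person's body frame. The function name and frame convention here are ours for illustration, not the repository's:

```python
import numpy as np

def relative_pose(pos_a, yaw_a, pos_b):
    """Express B's position in A's body frame (x forward).
    Hypothetical helper illustrating the kind of normalization
    automate_normalized_chest_data.py performs; names and frame
    convention are ours, not the repository's."""
    dx, dy = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
    c, s = np.cos(-yaw_a), np.sin(-yaw_a)   # rotate the world offset by -yaw_a
    return np.array([c * dx - s * dy, s * dx + c * dy])

# A at the origin facing +y; B one metre directly ahead of A:
print(relative_pose((0.0, 0.0), np.pi / 2, (0.0, 1.0)))  # ≈ [1. 0.]
```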
- Learning the parameters of the bivariate skew normal distribution using the `mvem` library (expectation maximization): `learn_proxemic_model.ipynb`
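The density being fitted is the Azzalini-type skew normal, f(x) = 2 φ(x; μ, Σ) Φ(αᵀ(x − μ)). A hedged sketch of evaluating it with SciPy once the EM fit has produced parameters (the `mvem` fitting call itself is not shown; this uses the scale-absorbed form of the shape parameter for illustration):

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

def skewnorm2_pdf(x, mu, Sigma, alpha):
    """Azzalini-type bivariate skew-normal density:
    f(x) = 2 * N(x; mu, Sigma) * Phi(alpha . (x - mu)).
    Scale-absorbed shape parameter, for illustration only."""
    x = np.atleast_2d(x)
    base = multivariate_normal(mean=mu, cov=Sigma).pdf(x)
    skew = norm.cdf((x - np.asarray(mu)) @ np.asarray(alpha))
    return 2.0 * base * skew

# Sanity check: the density should integrate to ~1 over a wide grid.
g = np.linspace(-6, 6, 241)
X, Y = np.meshgrid(g, g)
pts = np.column_stack([X.ravel(), Y.ravel()])
p = skewnorm2_pdf(pts, mu=[0.0, 0.0], Sigma=np.eye(2), alpha=[2.0, 0.0])
total = p.sum() * (g[1] - g[0]) ** 2
print(total)  # ≈ 1.0
```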
- Preprocessing the CongreG8 dataset using our scripts, or preprocessing your custom data (the same as Step 1 of the Proxemic Model).
- Learning a KDE model for interaction positions and saving the model: `learn_interaction_points.ipynb`
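A minimal sketch of fitting and saving such a KDE with `scipy.stats.gaussian_kde` (the notebook's actual bandwidth and persistence choices may differ; the sample data here is synthetic):

```python
import pickle
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic stand-in for interaction positions extracted from the dataset (N x 2):
points = rng.normal(loc=[1.0, 0.0], scale=0.3, size=(200, 2))

kde = gaussian_kde(points.T)            # gaussian_kde expects shape (d, N)
density = kde(np.array([[1.0, 3.0],    # evaluate at (1, 0) and (3, 0)
                        [0.0, 0.0]]))

# Persist the fitted model, e.g. for later inference scripts:
with open("interaction_kde.pkl", "wb") as f:
    pickle.dump(kde, f)
```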
- `proxemic_infer_vis.ipynb`: applies the learned proxemic model to a test group and visualizes the result.
- `interaction_infer_vis.py`: applies the learned interaction-position model to test data and plots the result.
- `evaluation_proxemic.py`: quantitative evaluation of the proxemic model.
- `evaluation_interaction_kde.py`: quantitative evaluation of the interaction model.
- `evaluation_asym_gaus.py`: quantitative evaluation of a baseline model (asymmetric Gaussian).
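One common quantitative measure for density models of this kind is the mean negative log-likelihood on held-out data; a sketch on synthetic points (the repository's evaluation scripts may use different metrics):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
train = rng.normal(size=(300, 2))   # synthetic stand-in for training positions
test = rng.normal(size=(100, 2))    # synthetic held-out positions

kde = gaussian_kde(train.T)
# Mean negative log-likelihood; small epsilon guards against log(0).
nll = -np.mean(np.log(kde(test.T) + 1e-12))
print(f"mean NLL on held-out points: {nll:.3f}")
```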
- Launch your robot.
- Launch the robot's navigation package. Add a social layer plugin to the costmap configuration, e.g. in `global_costmap.yaml` or `local_costmap.yaml`. For example:
```yaml
plugins:
  - name: social_layer
    type: 'social_layer/SocialCostmapLayer'

social_layer:
  enabled: true
  max_time_passed: 10
  gaussian_renorming: 150
  cutoff: 0.0
  amplitude: 77.0
  covariance: 0.25
  factor: 5.0
  keep_time: 0.75
  topic: /social_costmap
```
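To illustrate how the `amplitude` and `covariance` parameters shape the cost around a detected person, here is a sketch of an isotropic Gaussian cost using the example values above (the plugin's exact formula, renorming, and cutoff behavior may differ):

```python
import numpy as np

def social_cost(px, py, person_x, person_y, amplitude=77.0, covariance=0.25):
    """Isotropic Gaussian cost around a detected person, using the
    amplitude/covariance values from the example config above.
    A sketch only -- the plugin's exact formula may differ."""
    d2 = (px - person_x) ** 2 + (py - person_y) ** 2
    return amplitude * np.exp(-d2 / (2.0 * covariance))

print(social_cost(0.0, 0.0, 0.0, 0.0))  # peak cost at the person: 77.0
print(social_cost(1.0, 0.0, 0.0, 0.0))  # decays with distance
```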
- Run the social cost node. It subscribes to a `People` topic and generates a `social_costmap` accordingly.
```shell
rosrun social_costmap_ros social_costmap_node.py
```
Note that the `social_costmap_layer` package adds the social costmap to the final costmap used for robot navigation.
- Tip: the position and yaw of the detected person are required. These can also be represented using the ROS `PoseStamped` message type, which includes the 3D position and the orientation as a quaternion.
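For example, the yaw can be recovered from the quaternion in a `PoseStamped` message with the standard conversion (plain-Python sketch; in ROS 1 `tf.transformations.euler_from_quaternion` provides the same result):

```python
import math

def yaw_from_quaternion(x, y, z, w):
    """Yaw (rotation about z) from a ROS-style quaternion, e.g. the
    orientation field of a geometry_msgs/PoseStamped message."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# A 90-degree rotation about z: q = (0, 0, sin(pi/4), cos(pi/4))
print(yaw_from_quaternion(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4)))  # ≈ 1.5708
```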
- ROS Noetic
If you use the code or the learned models from this repository, please cite:

```bibtex
@Article{human-robotproxemics,
  AUTHOR = {Yang, Qiaoyue and Kachel, Lukas and Jung, Magnus and Al-Hamadi, Ayoub and Wachsmuth, Sven},
  TITLE = {Learning Human–Robot Proxemics Models from Experimental Data},
  JOURNAL = {Electronics},
  VOLUME = {14},
  YEAR = {2025},
  NUMBER = {18},
  ARTICLE-NUMBER = {3704},
  URL = {https://www.mdpi.com/2079-9292/14/18/3704},
  ISSN = {2079-9292},
  DOI = {10.3390/electronics14183704}
}
```