Learnable Human-Robot Proxemics Models

This is the code repository for the paper Learning Human-Robot Proxemics Models from Experimental Data. This README explains how to learn the human-robot proxemics models, run inference with them, evaluate them, and integrate them into robot navigation.

Main Dependencies

  • mvem
  • Python 3.10.18
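
mvem is available from PyPI and can typically be installed into the same Python environment with pip:

pip install mvem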

Learning the Models

Proxemic Model

  1. Preprocess the CongreG8 dataset using our scripts, or preprocess your custom data:
python data_utils/data_processing_script.py

This script uses two helper scripts: new_extract_chest_data.py and automate_normalized_chest_data.py.

  • new_extract_chest_data.py extracts chest data (position and rotation) from the human dataset and saves it to the folder all_chest_data (~140 MB).
  • automate_normalized_chest_data.py calculates the relative position and orientation data and saves it to the folder all_rel_chest_data (~755 MB).

Finally, the folder structure should look like this:

data_utils/
├── all_chest_data/
└── all_rel_chest_data/
  2. Learn the parameters of the bivariate skew-normal distribution using the mvem library (Expectation-Maximization); a minimal sketch follows the notebook reference below:
learn_proxemic_model.ipynb
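
The following is a minimal sketch of Step 2, assuming the relative chest data is stored as CSV files with rel_x and rel_y columns (hypothetical names) and that mvem exposes a multivariate skew-normal module with a fit() routine; see learn_proxemic_model.ipynb for the actual data loading and API.

# Sketch: fit a bivariate skew-normal to relative (x, y) chest positions via EM.
# The file layout and column names are assumptions; adapt them to the output of
# automate_normalized_chest_data.py.
import glob
import pandas as pd
from mvem.stats import multivariate_skewnorm

files = glob.glob("data_utils/all_rel_chest_data/*.csv")
samples = pd.concat(pd.read_csv(f) for f in files)[["rel_x", "rel_y"]].to_numpy()

# EM-based maximum likelihood estimate of the location, scale matrix and shape vector
params = multivariate_skewnorm.fit(samples)
print(params)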

Interaction-position Model

  1. Preprocess the CongreG8 dataset using our scripts, or preprocess your custom data (the same as Step 1 of the Proxemic Model).

  2. Learn a KDE model for interaction positions and save the model (a minimal sketch follows the notebook reference below):

learn_interaction_points.ipynb
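
A minimal sketch of Step 2, using scipy's gaussian_kde as a stand-in for the KDE (the notebook may use a different implementation); the input and output file names are hypothetical:

# Sketch: fit a 2D KDE over interaction positions and save it for later inference.
import pickle
import numpy as np
from scipy.stats import gaussian_kde

positions = np.load("interaction_positions.npy")   # hypothetical file, shape (N, 2)
kde = gaussian_kde(positions.T)                    # gaussian_kde expects shape (d, N)

with open("interaction_kde.pkl", "wb") as f:       # hypothetical output name
    pickle.dump(kde, f)

# Evaluate the density at two candidate positions, passed as a (2, M) array
density = kde(np.array([[0.5, 0.0], [1.0, 0.8]]))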

Applying the Models

  • proxemic_infer_vis.ipynb: Applies the learned proxemic model to a test group and visualizes the result.
  • interaction_infer_vis.py: Applies the learned interaction-position model to test data and plots the result.

Evaluation

  • evaluation_proxemic.py: Quantitative evaluation of the proxemic model.
  • evaluation_interaction_kde.py: Quantitative evaluation of the interaction model.
  • evaluation_asym_gaus.py: Quantitative evaluation of a baseline model (asymmetric Gaussian).

Deployment

  1. Launch your robot.
  2. Launch the robot's navigation package. Add a social layer plugin to the costmap configuration (e.g., global_costmap.yaml or local_costmap.yaml). For example:
plugins:
    - name: social_layer
      type: 'social_layer/SocialCostmapLayer'

social_layer:
    enabled            : true
    max_time_passed    : 10
    gaussian_renorming : 150
    cutoff: 0.0
    amplitude: 77.0
    covariance: 0.25
    factor: 5.0
    keep_time: 0.75
    topic: /social_costmap

  3. Run the social cost node. It subscribes to a People topic and generates a social_costmap accordingly.
rosrun social_costmap_ros social_costmap_node.py

Note that the social_costmap_layer package adds the social costmap to the final costmap used for robot navigation.

  • Tip: The node needs the position and yaw of the detected person. These can also be represented with the ROS PoseStamped message type, which includes the 3D position and the orientation as a quaternion.
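
A minimal rospy sketch of publishing such a pose; the /detected_person topic name and all numeric values are placeholders, and the topic must match whatever your person detector and the social cost node are configured to use:

# Sketch: publish a detected person's position and yaw as geometry_msgs/PoseStamped,
# with the yaw converted to a quaternion.
import rospy
from geometry_msgs.msg import PoseStamped
from tf.transformations import quaternion_from_euler

rospy.init_node("person_pose_publisher")
pub = rospy.Publisher("/detected_person", PoseStamped, queue_size=1)
rospy.sleep(1.0)  # give subscribers time to connect

msg = PoseStamped()
msg.header.frame_id = "map"
msg.header.stamp = rospy.Time.now()
msg.pose.position.x = 1.2   # example position in the map frame [m]
msg.pose.position.y = 0.5
qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, 1.57)  # roll, pitch, yaw [rad]
msg.pose.orientation.x = qx
msg.pose.orientation.y = qy
msg.pose.orientation.z = qz
msg.pose.orientation.w = qw
pub.publish(msg)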

Deployment Dependencies

  • ROS Noetic

Citation

If you use the code or the learned models from this repository, please cite

@Article{human-robotproxemics,
AUTHOR = {Yang, Qiaoyue and Kachel, Lukas and Jung, Magnus and Al-Hamadi, Ayoub and Wachsmuth, Sven},
TITLE = {Learning Human–Robot Proxemics Models from Experimental Data},
JOURNAL = {Electronics},
VOLUME = {14},
YEAR = {2025},
NUMBER = {18},
ARTICLE-NUMBER = {3704},
URL = {https://www.mdpi.com/2079-9292/14/18/3704},
ISSN = {2079-9292},
DOI = {10.3390/electronics14183704}
}
