A state estimation package for quadruped robots that fuses proprioceptive and exteroceptive data

MUSE: A Real-Time Multi-Sensor State Estimator for Quadruped Robots

Ylenia Nisticò, João Carlos Virgolino Soares, Lorenzo Amatucci, Geoff Fink and Claudio Semini

The paper has been accepted to IEEE Robotics and Automation Letters and is available at https://arxiv.org/abs/2503.12101


💻 Code

The muse package provides a ROS node and utilities for estimating the state of a quadruped robot using sensor data. It includes algorithms for state estimation, sensor fusion, and filtering.

This first version of the code provides a proprioceptive state estimator for quadruped robots. The necessary inputs are:

  • IMU measurements
  • joint states
  • forces exerted on the feet

Additional code to fuse exteroceptive measurements will be available soon; see the TODO list at the end of the page.
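As a rough illustration of how these proprioceptive inputs can be combined (a simplified 1-D sketch, not MUSE's actual estimator — see the paper for the real algorithm), the snippet below fuses an IMU-propagated velocity with a leg-odometry velocity measurement using a complementary filter:

```python
# Illustrative 1-D sketch (NOT the MUSE implementation): complementary-filter
# fusion of IMU acceleration with a leg-odometry velocity measurement.

def fuse_velocity(v_prev, imu_accel, leg_odom_vel, dt, alpha=0.98):
    """Blend an IMU-integrated velocity prediction with a leg-odometry
    velocity measurement; alpha weights the IMU prediction."""
    v_pred = v_prev + imu_accel * dt                      # propagate with IMU
    return alpha * v_pred + (1.0 - alpha) * leg_odom_vel  # correct with legs

# Toy example: the robot walks at a constant 0.5 m/s; the IMU reports zero
# acceleration and leg odometry reports the true velocity.
v = 0.0
for _ in range(200):
    v = fuse_velocity(v, imu_accel=0.0, leg_odom_vel=0.5, dt=0.01)
```

The estimate converges toward the leg-odometry velocity while the IMU term keeps it smooth between contacts; real estimators replace this fixed blend with a filter whose gains reflect the sensor noise.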

🦖 Prerequisites

⚠️ Don't worry! This repo provides a Docker setup, so you don't have to deal with the dependencies yourself.

🛠️ Building and Running

To install the muse package, follow these steps:

  1. Clone this repository and build the Docker image:

    git clone [email protected]:iit-DLSLab/muse.git
    cd muse
    docker build -t muse-docker .
  2. Enter the Docker container and build using catkin_make:

    cd muse_ws
    xhost +local:docker
    docker run -it --rm --name muse -v "$(pwd)":/root/muse_ws -w /root/muse_ws muse-docker
    catkin_make -j$(nproc) install
    source devel/setup.bash
  3. To launch the state estimator node:

    roslaunch state_estimator state_estimator.launch

If you need to play data from a rosbag, mount the folder where you store your rosbags so it is visible inside the container, then attach to the running container from another terminal. For example:

# terminal 1
docker run -it --rm --name muse -v /your_path_to_rosbags:/root/rosbags -v "$(pwd)":/root/muse_ws -w /root/muse_ws muse-docker

# terminal 2
docker exec -it muse bash
source devel/setup.bash
cd ~/rosbags
rosbag play your_rosbag.bag

To change the names of the topics, check the config folder.
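For orientation, a topic-name config might look like the fragment below. The keys and file names here are hypothetical, made up for illustration — treat the actual files in the config folder as the ground truth:

```yaml
# Hypothetical example only — see the real files in the config folder
# for the key names and layout MUSE actually uses.
imu_topic: /sensors/imu
joint_states_topic: /joint_states
contact_forces_topic: /state_estimator/contact_forces
```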

To visualize your data, you can use PlotJuggler, which is already installed in the Docker image:

rosrun plotjuggler plotjuggler

⚠️ This repo provides an example with the ANYmal B300 robot. To test MUSE with another robot, you only need to add its URDF to this folder and change the names of the legs in the leg odometry plugin, line 249:

std::vector<std::string> feet_frame_names = {"LF_FOOT", "RF_FOOT", "LH_FOOT", "RH_FOOT"};   // Update with your actual link names

📜 TODO list

  • Extend the code to include exteroception
  • Dockerization
  • Support for ROS2

🤗 Contributing

Contributions to this repository are welcome.

Citing the paper

If you like this work and would like to cite it (thanks):

@ARTICLE{10933515,
  author={Nisticò, Ylenia and Soares, João Carlos Virgolino and Amatucci, Lorenzo and Fink, Geoff and Semini, Claudio},
  journal={IEEE Robotics and Automation Letters}, 
  title={MUSE: A Real-Time Multi-Sensor State Estimator for Quadruped Robots}, 
  year={2025},
  volume={},
  number={},
  pages={1-8},
  keywords={Robots;Sensors;Robot sensing systems;Legged locomotion;Odometry;Cameras;Laser radar;Robot vision systems;Robot kinematics;Quadrupedal robots;state estimation;localization;sensor fusion;quadruped robots},
  doi={10.1109/LRA.2025.3553047}}

This repo is maintained by Ylenia Nisticò
