MINS is a consistent and robust tightly-coupled Multi-sensor-aided Inertial Navigation System that fuses all five sensing modalities (IMU, wheel encoder, camera, GNSS, and LiDAR) in a filtering framework, overcoming the hurdles of computational complexity, sensor asynchronicity, and intra-sensor calibration.
- IMU-based multi-sensor fusion with wheel odometry and arbitrary numbers of cameras, LiDARs, and GNSS receivers (plus VICON or loop closure) for localization.
- Online calibration of all onboard sensors (see the example results).
- Consistent high-order on-manifold state interpolation, improved from our prior work (MIMC-VINS), and a dynamic cloning strategy for lightweight estimation.
- Multi-sensor simulation toolbox for IMU, camera, LiDAR, GNSS, and wheel measurements, extended from our prior work (OpenVINS).
- Evaluation toolbox for consistency, accuracy, and timing analysis.
- Highly configurable options for each sensor, enabling general multi-sensor applications.
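To give a feel for the on-manifold interpolation feature listed above, here is a minimal sketch of the degree-1 (geodesic) case for orientations: interpolating between two cloned rotations via the SO(3) exponential and logarithm maps. This is illustrative pseudocode in NumPy, not the MINS API; the function names (`so3_exp`, `so3_log`, `interp_so3`) are assumptions for this example, and MINS itself uses higher-order polynomials on the full pose.

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(w):
    """SO(3) exponential map (Rodrigues formula)."""
    th = np.linalg.norm(w)
    if th < 1e-10:
        return np.eye(3) + skew(w)
    K = skew(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

def so3_log(R):
    """SO(3) logarithm map, returning the rotation vector."""
    th = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if th < 1e-10:
        return np.zeros(3)
    return th / (2.0 * np.sin(th)) * np.array(
        [R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])

def interp_so3(R0, R1, t0, t1, t):
    """Geodesic interpolation R(t) = R0 * Exp(s * Log(R0^T R1)), s in [0,1]."""
    s = (t - t0) / (t1 - t0)
    return R0 @ so3_exp(s * so3_log(R0.T @ R1))
```

For example, interpolating halfway between the identity and a 90-degree yaw rotation yields a 45-degree yaw rotation, which is why asynchronous measurements between two clone times can still be processed consistently.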
MINS is tested on Ubuntu 18.04 and 20.04 and only requires the corresponding ROS distribution (Melodic or Noetic).
- The default Eigen version is 3.3.7 (Noetic) or lower; with a newer Eigen, compilation can fail in the third-party LiDAR library (libpointmatcher).
```shell
mkdir -p $MINS_WORKSPACE/catkin_ws/src/ && cd $MINS_WORKSPACE/catkin_ws/src/
git clone https://github.com/rpng/MINS
cd .. && catkin build
source devel/setup.bash
```
```shell
# Simulation
roslaunch mins simulation.launch cam_enabled:=true lidar_enabled:=true
# Real-world dataset, serial processing (KAIST urban30)
roslaunch mins serial.launch config:=kaist/kaist_LC path_gt:=urban30.txt path_bag:=urban30.bag
# Real-world dataset, subscribing to topics (EuRoC MAV V1_03)
roslaunch mins subscribe.launch config:=euroc_mav path_bag:=V1_03_difficult.bag bag_start_time:=0
# Visualization
rviz -d mins/launch/display.rviz
```
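The evaluation toolbox listed among the features reports consistency, which is commonly measured with the Normalized Estimation Error Squared (NEES): for a consistent estimator, NEES averages to the state dimension. Below is a minimal illustrative sketch of the metric itself, not the MINS evaluation code; the function name `nees` is an assumption for this example.

```python
import numpy as np

def nees(x_est, x_gt, P):
    """NEES = e^T P^{-1} e for estimation error e = x_est - x_gt
    with estimator-reported covariance P. For a consistent estimator,
    the average NEES over many runs equals the state dimension."""
    e = np.asarray(x_est, dtype=float) - np.asarray(x_gt, dtype=float)
    # Solve P y = e instead of explicitly inverting P (better conditioned).
    return float(e @ np.linalg.solve(P, e))
```

For instance, a 3-dimensional error of one sigma in each axis with unit covariance gives a NEES of exactly 3, matching the state dimension.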
This project is built on top of the following libraries, which are included in the thirdparty folder.
- OpenVINS: Open-source filter-based visual-inertial estimator.
- ikd-tree: Incremental k-d tree.
- libpointmatcher: Modular Iterative Closest Point (ICP) library based on libnabo.
This code was written by the Robot Perception and Navigation Group (RPNG) at the University of Delaware. If you have any issues with the code, please open an issue on our GitHub page with relevant implementation details and references. For researchers who have leveraged or compared to this work, please cite the following:
The publication reference will be updated soon.
@inproceedings{Lee2021ICRA,
title = {Efficient Multi-sensor Aided Inertial Navigation with Online Calibration},
author = {Woosik Lee and Yulin Yang and Guoquan Huang},
year = 2021,
pages = {5706--5712},
booktitle = {Proc. of the IEEE International Conference on Robotics and Automation},
address = {Xi'an, China},
}
The codebase and documentation are licensed under the GNU General Public License v3 (GPL-3). You must preserve the copyright and license notices in your derivative work and make the complete source code, with modifications, available under the same license (see this; this is not legal advice).