A Lightweight and Robust Point-Line Monocular Visual Inertial Wheel Odometry
Authors: Zhixin Zhang, Wenzhi Bai, Liang Zhao and Pawel Ladosz
The code is being prepared and will be released soon. 👀
The paper is under review.
We use the KAIST Complex Urban Dataset to test our algorithm.
Examples on KAIST Urban27:

Dependencies (an example install sketch follows the list):
- OpenCV 4.2
- Eigen 3
- ROS noetic
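A minimal install sketch for these dependencies on Ubuntu 20.04, assuming the ROS Noetic apt repository has already been added per the official ROS installation guide (the package names below are standard Ubuntu/ROS packages, not something shipped by this repository):

sudo apt update
sudo apt install ros-noetic-desktop-full   # ROS Noetic (includes rviz)
sudo apt install libeigen3-dev             # Eigen 3
sudo apt install libopencv-dev             # OpenCV (4.2 on Ubuntu 20.04)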
Note❗: This system is based on a monocular setup; please set the number of cameras to 1 in the config file.
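For reference, a hypothetical illustration of this setting, assuming an OpenVINS-style YAML config (the actual key name and file in this project's config may differ):

max_cameras: 1   # illustrative only: key name assumed from OpenVINS-style configs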
roslaunch viwo rosbag.launch config:=kaist/kaist_LC path_gt:=urban26.txt path_bag:=urban26.bag
rviz -d mins/launch/display.rviz
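To run another KAIST sequence, change the ground-truth and bag arguments accordingly. For example (assuming the urban27 files follow the same naming convention as urban26):

roslaunch viwo rosbag.launch config:=kaist/kaist_LC path_gt:=urban27.txt path_bag:=urban27.bag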
For the rosbag files and ground truths used for testing, please refer to MINS.
For the benchmark used in our paper, we also open-source the modified version for the convenience of the community. The differences are that we added a KAIST dataset config file and disabled some functions, such as loop closure and re-localization. This code will also be released soon.
This project was built on top of the following works:
- OpenVINS: Open-source filter-based visual-inertial estimator.
- MINS: An efficient, robust, and tightly-coupled Multisensor-aided Inertial Navigation System (MINS)
Thanks to the Robot Perception & Navigation Group (RPNG) for their wonderful work and open-source contributions.