Merge branch 'master' of github.com:HKUST-Aerial-Robotics/VINS-Mono
shaozu committed Mar 7, 2019
2 parents 5334365 + 9e657be commit cd8a2bd
Showing 6 changed files with 61 additions and 19 deletions.
18 changes: 5 additions & 13 deletions README.md
@@ -1,7 +1,7 @@
# VINS-Mono
## A Robust and Versatile Monocular Visual-Inertial State Estimator

**11 Jan 2019**: An extension of **VINS**, which supports stereo cameras, is published at [VINS-Fusion](https://github.com/HKUST-Aerial-Robotics/VINS-Fusion)
**11 Jan 2019**: An extension of **VINS**, which supports stereo cameras / stereo cameras + IMU / mono camera + IMU, is published at [VINS-Fusion](https://github.com/HKUST-Aerial-Robotics/VINS-Fusion)

**29 Dec 2017**: New features: Add map merge, pose graph reuse, online temporal calibration function, and support rolling shutter camera. Map reuse videos:

@@ -33,20 +33,12 @@ alt="Mobile platform" width="240" height="180" border="10" /></a>
MAV application; Mobile implementation (Video link for mainland China friends: [Video1](http://www.bilibili.com/video/av10813254/) [Video2](http://www.bilibili.com/video/av10813205/) [Video3](http://www.bilibili.com/video/av10813089/) [Video4](http://www.bilibili.com/video/av10813325/) [Video5](http://www.bilibili.com/video/av10813030/))

**Related Papers**
* **VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator**, Tong Qin, Peiliang Li, Zhenfei Yang, Shaojie Shen, IEEE Transactions on Robotics[pdf](https://ieeexplore.ieee.org/document/8421746/?arnumber=8421746&source=authoralert)

```
@article{qin2017vins,
title={VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator},
author={Qin, Tong and Li, Peiliang and Shen, Shaojie},
journal={IEEE Transactions on Robotics},
year={2018},
volume={34},
number={4},
pages={1004-1020}}
```
* **Online Temporal Calibration for Monocular Visual-Inertial Systems**, Tong Qin, Shaojie Shen, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS, 2018), **best student paper award** [pdf](https://ieeexplore.ieee.org/abstract/document/8593603)

* **VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator**, Tong Qin, Peiliang Li, Zhenfei Yang, Shaojie Shen, IEEE Transactions on Robotics[pdf](https://ieeexplore.ieee.org/document/8421746/?arnumber=8421746&source=authoralert)

*If you use VINS-Mono for your academic research, please cite at least one of our related papers.*
*If you use VINS-Mono for your academic research, please cite at least one of our related papers.*[bib](https://github.com/HKUST-Aerial-Robotics/VINS-Mono/blob/master/support_files/paper_bib.txt)

## 1. Prerequisites
1.1 **Ubuntu** and **ROS**
11 changes: 11 additions & 0 deletions ar_demo/launch/realsense_ar.launch
@@ -0,0 +1,11 @@
<launch>
<include file="$(find vins_estimator)/launch/realsense_fisheye.launch"/>

<node pkg="ar_demo" type="ar_demo_node" name="ar_demo_node" output="screen">
<remap from="~image_raw" to="/camera/fisheye/image_raw"/>
<remap from="~camera_pose" to="/vins_estimator/camera_pose"/>
<remap from="~pointcloud" to="/vins_estimator/point_cloud"/>
<param name="calib_file" type="string" value="$(find feature_tracker)/../config/realsense/realsense_fisheye_config.yaml"/>
<param name="use_undistored_img" type="bool" value="false"/>
</node>
</launch>
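
This launch file remaps ar_demo_node's private topics to the RealSense fisheye image and the VINS estimator outputs. For orientation, the node's callback (see the next diff) receives an image and a camera-pose odometry message together, which suggests a time-synchronized subscription. A minimal sketch of that pattern, assuming ROS1 message_filters with an ApproximateTime policy; the callback name and queue sizes here are illustrative, not taken from the actual source:

```cpp
// Sketch (assumption): synchronizing the remapped image and camera-pose topics
// the way ar_demo_node's callback signature suggests. demo_callback is a
// placeholder name.
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <nav_msgs/Odometry.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <boost/bind.hpp>

void demo_callback(const sensor_msgs::ImageConstPtr& img_msg,
                   const nav_msgs::Odometry::ConstPtr& pose_msg)
{
    // draw the virtual objects into the image using the camera pose (omitted)
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "ar_demo_sketch");
    ros::NodeHandle nh("~");  // "~image_raw" etc. resolve through the remaps above

    message_filters::Subscriber<sensor_msgs::Image> sub_img(nh, "image_raw", 100);
    message_filters::Subscriber<nav_msgs::Odometry> sub_pose(nh, "camera_pose", 100);

    typedef message_filters::sync_policies::ApproximateTime<
        sensor_msgs::Image, nav_msgs::Odometry> SyncPolicy;
    message_filters::Synchronizer<SyncPolicy> sync(SyncPolicy(100), sub_img, sub_pose);
    sync.registerCallback(boost::bind(&demo_callback, _1, _2));

    ros::spin();
    return 0;
}
```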
26 changes: 22 additions & 4 deletions ar_demo/src/ar_demo_node.cpp
@@ -25,6 +25,8 @@
#include <queue>
#include <cmath>
#include <algorithm>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/image_encodings.h>

using namespace std;
using namespace Eigen;
@@ -380,9 +382,25 @@ void callback(const ImageConstPtr& img_msg, const nav_msgs::Odometry::ConstPtr p

project_object(camera_p, camera_q);

cv_bridge::CvImagePtr bridge_ptr = cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8);
cv_bridge::CvImageConstPtr ptr;
if (img_msg->encoding == "8UC1")
{
sensor_msgs::Image img;
img.header = img_msg->header;
img.height = img_msg->height;
img.width = img_msg->width;
img.is_bigendian = img_msg->is_bigendian;
img.step = img_msg->step;
img.data = img_msg->data;
img.encoding = "mono8";
ptr = cv_bridge::toCvCopy(img, sensor_msgs::image_encodings::MONO8);
}
else
ptr = cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8);

//cv_bridge::CvImagePtr bridge_ptr = cv_bridge::toCvCopy(ptr, sensor_msgs::image_encodings::MONO8);
cv::Mat AR_image;
AR_image = bridge_ptr->image.clone();
AR_image = ptr->image.clone();
cv::cvtColor(AR_image, AR_image, cv::COLOR_GRAY2RGB);
draw_object(AR_image);

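The new branch above works around image streams (e.g. from some bag files) that tag grayscale frames with the raw "8UC1" encoding, which cv_bridge will not convert to mono8 directly; the data is copied into a fresh sensor_msgs::Image relabeled as "mono8" before conversion. A self-contained sketch of the same pattern, with a hypothetical helper name (to_mono8):

```cpp
// Sketch of the 8UC1 -> mono8 workaround: cv_bridge refuses to treat a raw
// "8UC1" image as grayscale, so relabel the encoding before converting.
#include <cv_bridge/cv_bridge.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/image_encodings.h>

cv_bridge::CvImageConstPtr to_mono8(const sensor_msgs::ImageConstPtr& img_msg)
{
    if (img_msg->encoding == "8UC1")
    {
        sensor_msgs::Image img = *img_msg;   // copy header, size, step and data
        img.encoding = "mono8";              // relabel so cv_bridge accepts it
        return cv_bridge::toCvCopy(img, sensor_msgs::image_encodings::MONO8);
    }
    return cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8);
}
```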
@@ -501,14 +519,14 @@ int main( int argc, char** argv )
sub_img = n.subscribe("image_raw", 100, img_callback);
}

Axis[0] = Vector3d(0, 2, -1.2);
Axis[0] = Vector3d(0, 1.5, -1.2);
Axis[1]= Vector3d(-10, 5, 0);
Axis[2] = Vector3d(3, 3, 3);
Axis[3] = Vector3d(-2, 2, 0);
Axis[4]= Vector3d(5, 10, -5);
Axis[5] = Vector3d(0, 10, -1);

Cube_center[0] = Vector3d(-2, 0, -1.2 + box_length / 2.0);
Cube_center[0] = Vector3d(0, 1.5, -1.2 + box_length / 2.0);
//Cube_center[0] = Vector3d(0, 3, -1.2 + box_length / 2.0);
Cube_center[1] = Vector3d(4, -2, -1.2 + box_length / 2.0);
Cube_center[2] = Vector3d(0, -2, -1.2 + box_length / 2.0);
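The relocated Axis and Cube_center values are fixed world-frame positions of the virtual AR objects; each frame they are projected into the image using the camera pose from /vins_estimator/camera_pose. A rough sketch of that projection under a plain pinhole model (the real node projects through the camera model loaded from calib_file; fx, fy, cx, cy below are placeholder intrinsics):

```cpp
// Sketch (assumption): project a world-frame point into pixel coordinates
// given the camera pose (t_wc, q_wc) taken from the odometry message.
#include <Eigen/Dense>

Eigen::Vector2d project_point(const Eigen::Vector3d& p_world,
                              const Eigen::Vector3d& t_wc,
                              const Eigen::Quaterniond& q_wc,
                              double fx, double fy, double cx, double cy)
{
    // transform into the camera frame: p_c = R_wc^T * (p_w - t_wc)
    Eigen::Vector3d p_cam = q_wc.inverse() * (p_world - t_wc);
    // pinhole projection (assumes p_cam.z() > 0, i.e. the point is in front)
    return Eigen::Vector2d(fx * p_cam.x() / p_cam.z() + cx,
                           fy * p_cam.y() / p_cam.z() + cy);
}
```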
4 changes: 2 additions & 2 deletions config/realsense/realsense_fisheye_config.yaml
@@ -41,7 +41,7 @@ extrinsicTranslation: !!opencv-matrix
data: [0.00188568, 0.00123801, 0.01044055]

#feature tracker parameters
max_cnt: 150 # max feature number in feature tracking
max_cnt: 120 # max feature number in feature tracking
min_dist: 30 # min distance between two features
freq: 10 # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
F_threshold: 1.0 # ransac threshold (pixel)
@@ -78,4 +78,4 @@ rolling_shutter_tr: 0 # unit: s. rolling shutter read out time per
#visualization parameters
save_image: 1 # save image in pose graph for visualization purpose; you can disable this function by setting 0
visualize_imu_forward: 0 # output IMU forward propagation to achieve low-latency and high-frequency results
visualize_camera_size: 0.4 # size of camera marker in RVIZ
visualize_camera_size: 0.4 # size of camera marker in RVIZ
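
These parameters (and the !!opencv-matrix extrinsics above) are stored in OpenCV-style YAML, so the tuned values such as max_cnt can be read back with cv::FileStorage. A minimal sketch of loading them, assuming the config path is passed in; the variable names are illustrative:

```cpp
// Sketch: reading the feature-tracker parameters from the OpenCV-style YAML.
#include <opencv2/core/core.hpp>
#include <string>
#include <cstdio>

bool load_tracker_params(const std::string& config_file)
{
    cv::FileStorage fs(config_file, cv::FileStorage::READ);
    if (!fs.isOpened())
        return false;

    int max_cnt     = fs["max_cnt"];      // max tracked features (now 120)
    int min_dist    = fs["min_dist"];     // min pixel distance between features
    int freq        = fs["freq"];         // publish frequency of tracking results
    double f_thresh = fs["F_threshold"];  // RANSAC threshold in pixels

    std::printf("max_cnt=%d min_dist=%d freq=%d F_threshold=%.1f\n",
                max_cnt, min_dist, freq, f_thresh);
    fs.release();
    return true;
}
```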
Empty file.
21 changes: 21 additions & 0 deletions support_files/paper_bib.txt
@@ -0,0 +1,21 @@

@inproceedings{qin2018online,
title={Online Temporal Calibration for Monocular Visual-Inertial Systems},
author={Qin, Tong and Shen, Shaojie},
booktitle={2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
pages={3662--3669},
year={2018},
organization={IEEE}
}



@article{qin2017vins,
title={VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator},
author={Qin, Tong and Li, Peiliang and Shen, Shaojie},
journal={IEEE Transactions on Robotics},
year={2018},
volume={34},
number={4},
pages={1004-1020}
}
