add UrbanNavDataset test
Chenc111 committed Dec 17, 2022
1 parent efb098a commit 6734075
Showing 8 changed files with 219 additions and 6 deletions.
27 changes: 23 additions & 4 deletions README.md
@@ -6,6 +6,7 @@ This repository contains modified code of [LVI-SAM](https://github.com/TixiaoSha
<img src="./doc/cover-ljj.gif" alt="drawing" width="800"/>
</p>


---


@@ -84,10 +85,10 @@ extrinsicTranslation: !!opencv-matrix
```

<p align='center'>
<img src="./doc/official-equipment.png" alt="drawing" width="800"/>
<img src="./doc/official-equipment.png" alt="drawing" width="600"/>
</p>

**However**, for most IMUs, the Euler angle coordinate system is the same as the acceleration and angular velocity coordinate systems. So the above parameters should be set as follows.
==**Note**==: **For most IMUs, the Euler angle coordinate system is the same as the acceleration and angular velocity coordinate systems**. So the above parameters should be set as follows.

```yaml
## For most IMUs, the following three values are "+z", "+y", and "+x" respectively
@@ -155,6 +156,24 @@ extrinsicTranslation: !!opencv-matrix
<img src="./doc/gate_01.png" alt="drawing" width="600"/>
</p>

3. [UrbanNavDataset](https://github.com/weisongwen/UrbanNavDataset)

- Run the launch file:

```
roslaunch lvi_sam UrbanNavDataset.launch
```

- Play an existing bag file. The params we provide are for [UrbanNav-HK-Data20200314](https://www.dropbox.com/s/3mtlncglrv7p39l/2020-03-14-16-45-35.bag.tar.gz?dl=0). If you use other UrbanNavDataset bags, please check whether the params need to be changed (a quick topic check is sketched at the end of this item).

```
rosbag play 2020-03-14-16-45-35.bag
```

- Results on UrbanNav-HK-Data20200314:

<p align='center'> <img src="./doc/urbannav.gif" alt="drawing" width="600"/></p>

3. [My test dataset](https://1drv.ms/u/s!AqYajE_ft9lwg0paJQu_DRzU-GQ5?e=A95yfn)

- Run the launch file:
@@ -175,14 +194,14 @@ extrinsicTranslation: !!opencv-matrix
<img src="./doc/backbag.png" alt="drawing" width="600"/>
</p>

- Results of our modified code on 0117-1525.bag (due to privacy issues, this data package is not open source):
- Results of our modified code on our own 0117-1525.bag (the device differs from the one used for backbag.bag, so it needs different params; unfortunately, due to privacy issues, this bag cannot be open-sourced):

```
roslaunch lvi_sam ljj.launch
rosbag play 0117-1525.bag
```

<p align='center'> <img src="./doc/ljj.gif" alt="drawing" width="800"/></p>
<p align='center'> <img src="./doc/ljj.gif" alt="drawing" width="600"/></p>

---

89 changes: 89 additions & 0 deletions config/UrbanNavDataset_camera.yaml
@@ -0,0 +1,89 @@
%YAML:1.0

# Project
project_name: "lvi_sam"

#common parameters
imu_topic: "/imu/data"
image_topic: "/camera/image_color"
point_cloud_topic: "lvi_sam/lidar/deskew/cloud_deskewed"

# Lidar Params
use_lidar: 1 # whether use depth info from lidar or not
lidar_skip: 3 # skip this amount of scans
align_camera_lidar_estimation: 1 # align camera and lidar estimation for visualization

#camera calibration
model_type: PINHOLE
camera_name: camera
image_width: 1920
image_height: 1200
distortion_parameters:
k1: -0.109203
k2: 0.063536
p1: -0.003427
p2: -0.000629
projection_parameters:
fx: 1086.160899
fy: 1090.242963
cx: 940.067502
cy: 586.740077

#imu parameters The more accurate parameters you provide, the worse performance
acc_n: 8.1330537434371481e-03 # accelerometer measurement noise standard deviation. #0.2 0.04
gyr_n: 7.4266825125507141e-03 # gyroscope measurement noise standard deviation. #0.05 0.004
acc_w: 1.2123362494392119e-04 # accelerometer bias random walk noise standard deviation. #0.02
gyr_w: 8.6572985145653080e-05 # gyroscope bias random walk noise standard deviation. #4.0e-5
g_norm: 9.78785 # gravity magnitude

# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 0 # 0 Have accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam; don't change them.
# 1 Have an initial guess about the extrinsic parameters. We will optimize around your initial guess.
# 2 Don't know anything about the extrinsic parameters. You don't need to give R, T. We will try to calibrate them. Do some rotation movement at the beginning.
#Rotation from camera frame to imu frame, imu^R_cam
extrinsicRotation: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [ 9.9934608718980233e-01, -1.5715484428488590e-02, -3.2564114721728155e-02,
3.2359037356803094e-02, -1.3131917124154624e-02, 9.9939003669937865e-01,
-1.6133527815482926e-02, -9.9979026615676858e-01, -1.2614792047622947e-02]

#Translation from camera frame to imu frame, imu^T_cam
extrinsicTranslation: !!opencv-matrix
rows: 3
cols: 1
dt: d
data: [-1.7434527332030676e-02, 1.7171139776467173e-01, -4.5251036141047592e-02]

#feature tracker parameters
max_cnt: 150 # max feature number in feature tracking
min_dist: 20 # min distance between two features
freq: 20 # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
F_threshold: 1.0 # ransac threshold (pixel)
show_track: 1 # publish tracking image as topic
equalize: 1 # if the image is too dark or too bright, turn on equalize to find enough features
fisheye: 0 # if using a fisheye camera, turn this on. A circle mask will be loaded to remove noisy edge points

#optimization parameters
max_solver_time: 0.035 # max solver iteration time (ms), to guarantee real time
max_num_iterations: 10 # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#unsynchronization parameters
estimate_td: 0 # online estimate time offset between camera and imu
td: 0 # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

#rolling shutter parameters
rolling_shutter: 0 # 0: global shutter camera, 1: rolling shutter camera
rolling_shutter_tr: 0 # unit: s. rolling shutter read out time per frame (from data sheet).

#loop closure parameters
# NOTE: if loop closure is enabled for UrbanNav-HK-Data20200314.bag, initialization fails, so it is disabled here.
loop_closure: 0 # start loop closure
skip_time: 0.0
skip_dist: 0.0
debug_image: 0 # save raw images in the loop detector for visualization purposes; set to 0 to disable
match_image_scale: 0.5
vocabulary_file: "/config/brief_k10L6.bin"
brief_pattern_file: "/config/brief_pattern.yml"
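
The `extrinsicRotation` entered above is a hand-typed 3x3 matrix, so a quick sanity check is worthwhile before running. A minimal numpy sketch (not part of the repository) that verifies the imu^R_cam matrix from this config is a proper rotation, i.e. orthonormal with determinant +1:

```python
# Sanity-check the imu^R_cam extrinsic from UrbanNavDataset_camera.yaml.
# A valid rotation matrix must satisfy R @ R.T == I and det(R) == +1;
# a sign error or a swapped row in the yaml usually fails this check.
import numpy as np

R_imu_cam = np.array([
    [ 9.9934608718980233e-01, -1.5715484428488590e-02, -3.2564114721728155e-02],
    [ 3.2359037356803094e-02, -1.3131917124154624e-02,  9.9939003669937865e-01],
    [-1.6133527815482926e-02, -9.9979026615676858e-01, -1.2614792047622947e-02],
])

print("orthonormal:", np.allclose(R_imu_cam @ R_imu_cam.T, np.eye(3), atol=1e-6))
print("determinant: %+0.6f" % np.linalg.det(R_imu_cam))  # should be close to +1
```
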
77 changes: 77 additions & 0 deletions config/UrbanNavDataset_lidar.yaml
@@ -0,0 +1,77 @@
# project name
PROJECT_NAME: "lvi_sam"

lvi_sam:

# Topics
pointCloudTopic: "/velodyne_points" # Point cloud data
imuTopic: "/imu/data" # IMU data

# Heading
useImuHeadingInitialization: false # if using GPS data, set to "true"

# Export settings
savePCD: false # https://github.com/TixiaoShan/LIO-SAM/issues/3
savePCDDirectory: "/Downloads/LOAM/" # in your home folder, starts and ends with "/". Warning: the code deletes "LOAM" folder then recreates it. See "mapOptimization" for implementation

# Sensor Settings
N_SCAN: 32 # number of lidar channel (i.e., 16, 32, 64, 128)
Horizon_SCAN: 1800 # lidar horizontal resolution (Velodyne:1800, Ouster:512,1024,2048)
timeField: "time" # point timestamp field, Velodyne - "time", Ouster - "t"
downsampleRate: 1 # default: 1. Downsample your data if too many points. i.e., 16 = 64 / 4, 16 = 16 / 1

# IMU Settings
imuAccNoise: 8.1330537434371481e-03
imuGyrNoise: 7.4266825125507141e-03
imuAccBiasN: 6.4356659353532566e-05
imuGyrBiasN: 3.5640318696367613e-05
imuGravity: 9.78785

###################### extrinsic between IMU and LiDAR ###########################
###################### T_IMU_LiDAR, LiDAR -> IMU ###########################
extrinsicTranslation: [0.0, 0.0, 0]
extrinsicRotation: [ 1, 0, 0,
0, 1, 0,
0, 0, 1]
# For most IMUs, the following three values are "+z", "+y", and "+x" respectively
yawAxis: "+z" # axis about which a counterclockwise rotation produces a positive yaw angle
pitchAxis: "+y" # axis about which a counterclockwise rotation produces a positive pitch angle
rollAxis: "+x" # axis about which a counterclockwise rotation produces a positive roll angle

# LOAM feature threshold
edgeThreshold: 1.0
surfThreshold: 0.1
edgeFeatureMinValidNum: 10
surfFeatureMinValidNum: 100

# voxel filter params
odometrySurfLeafSize: 0.4 # default: 0.4
mappingCornerLeafSize: 0.2 # default: 0.2
mappingSurfLeafSize: 0.4 # default: 0.4

# robot motion constraint (in case you are using a 2D robot)
z_tollerance: 1000 # meters
rotation_tollerance: 1000 # radians

# CPU Params
numberOfCores: 4 # number of cores for mapping optimization
mappingProcessInterval: 0.15 # seconds, regulate mapping frequency

# Surrounding map
surroundingkeyframeAddingDistThreshold: 1.0 # meters, regulate keyframe adding threshold
surroundingkeyframeAddingAngleThreshold: 0.2 # radians, regulate keyframe adding threshold
surroundingKeyframeDensity: 2.0 # meters, downsample surrounding keyframe poses
surroundingKeyframeSearchRadius: 50.0 # meters, within n meters scan-to-map optimization (when loop closure disabled)

# Loop closure
loopClosureEnableFlag: true
surroundingKeyframeSize: 25 # submap size (when loop closure enabled)
historyKeyframeSearchRadius: 20.0 # meters, a key frame within n meters of the current pose will be considered for loop closure
historyKeyframeSearchTimeDiff: 30.0 # seconds, a key frame that is n seconds older will be considered for loop closure
historyKeyframeSearchNum: 25 # number of history key frames that will be fused into a submap for loop closure
historyKeyframeFitnessScore: 0.3 # icp threshold, the smaller the better alignment

# Visualization
globalMapVisualizationSearchRadius: 1000.0 # meters, global map visualization radius
globalMapVisualizationPoseDensity: 10.0 # meters, global map visualization keyframe density
globalMapVisualizationLeafSize: 1.0 # meters, global map visualization cloud density
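
For UrbanNav-HK-Data20200314 the LiDAR-IMU extrinsic above is the identity rotation with zero translation, i.e. the two sensor frames are treated as co-aligned. A minimal sketch (illustration only) of what `extrinsicRotation`/`extrinsicTranslation` express, namely the LiDAR-to-IMU transform p_imu = R · p_lidar + t:

```python
# How T_IMU_LiDAR from UrbanNavDataset_lidar.yaml maps a LiDAR point into the
# IMU frame. With the identity rotation and zero translation configured for
# this dataset the transform is a pass-through; a real sensor offset would be
# entered in these two fields instead.
import numpy as np

R_imu_lidar = np.eye(3)    # extrinsicRotation (row-major 3x3)
t_imu_lidar = np.zeros(3)  # extrinsicTranslation, meters

def lidar_to_imu(p_lidar):
    """Transform a 3D point from the LiDAR frame into the IMU frame."""
    return R_imu_lidar @ np.asarray(p_lidar) + t_imu_lidar

print(lidar_to_imu([1.0, 2.0, 0.5]))  # -> [1.  2.  0.5] with the identity extrinsic
```
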
2 changes: 1 addition & 1 deletion config/backbag_camera.yaml
@@ -11,7 +11,7 @@ point_cloud_topic: "lvi_sam/lidar/deskew/cloud_deskewed"
# Lidar Params
use_lidar: 1 # whether use depth info from lidar or not
lidar_skip: 3 # skip this amount of scans
align_camera_lidar_estimation: 0 # align camera and lidar estimation for visualization
align_camera_lidar_estimation: 1 # align camera and lidar estimation for visualization

#camera calibration
model_type: PINHOLE
2 changes: 1 addition & 1 deletion config/ljj_camera.yaml
@@ -11,7 +11,7 @@ point_cloud_topic: "lvi_sam/lidar/deskew/cloud_deskewed"
# Lidar Params
use_lidar: 1 # whether use depth info from lidar or not
lidar_skip: 3 # skip this amount of scans
align_camera_lidar_estimation: 0 # align camera and lidar estimation for visualization
align_camera_lidar_estimation: 1 # align camera and lidar estimation for visualization

#camera calibration
model_type: PINHOLE
Binary file added doc/urbannav.gif
12 changes: 12 additions & 0 deletions launch/UrbanNavDataset.launch
@@ -0,0 +1,12 @@
<launch>

<!--- Robot State TF -->
<include file="$(find lvi_sam)/launch/include/module_robot_state_publisher.launch" />

<!--- Run Rviz-->
<include file="$(find lvi_sam)/launch/include/module_rviz.launch" />

<!--- SAM -->
<include file="$(find lvi_sam)/launch/include/UrbanNavDataset_sam.launch" />

</launch>
16 changes: 16 additions & 0 deletions launch/include/UrbanNavDataset_sam.launch
@@ -0,0 +1,16 @@
<launch>
<arg name="project" default="lvi_sam" />
<!-- Lidar odometry param -->
<rosparam file="$(find lvi_sam)/config/UrbanNavDataset_lidar.yaml" command="load" />
<!-- VINS config -->
<param name="vins_config_file" type="string" value="$(find lvi_sam)/config/UrbanNavDataset_camera.yaml" />
<!-- Lidar odometry -->
<node pkg="$(arg project)" type="$(arg project)_imuPreintegration" name="$(arg project)_imuPreintegration" output="screen" respawn="true" />
<node pkg="$(arg project)" type="$(arg project)_imageProjection" name="$(arg project)_imageProjection" output="screen" respawn="true" />
<node pkg="$(arg project)" type="$(arg project)_featureExtraction" name="$(arg project)_featureExtraction" output="screen" respawn="true" />
<node pkg="$(arg project)" type="$(arg project)_mapOptmization" name="$(arg project)_mapOptmization" output="screen" respawn="true" />
<!-- Visual feature and odometry -->
<node pkg="$(arg project)" type="$(arg project)_visual_feature" name="$(arg project)_visual_feature" output="screen" respawn="true" />
<node pkg="$(arg project)" type="$(arg project)_visual_odometry" name="$(arg project)_visual_odometry" output="screen" respawn="true" />
<node pkg="$(arg project)" type="$(arg project)_visual_loop" name="$(arg project)_visual_loop" output="screen" respawn="true" />
</launch>
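
After `roslaunch lvi_sam UrbanNavDataset.launch`, a quick way to confirm that the nodes started by the launch file above are actually up is to compare the live node list against the names derived from `$(arg project)`. A minimal sketch, assuming a running ROS1 master, the default `lvi_sam` project name, and no enclosing namespace (this check is an illustration, not part of the commit):

```python
# Check that the LVI-SAM nodes launched above are running. The expected names
# follow the "$(arg project)_*" pattern from UrbanNavDataset_sam.launch.
import rosnode

EXPECTED = {
    "/lvi_sam_imuPreintegration", "/lvi_sam_imageProjection",
    "/lvi_sam_featureExtraction", "/lvi_sam_mapOptmization",
    "/lvi_sam_visual_feature", "/lvi_sam_visual_odometry", "/lvi_sam_visual_loop",
}

missing = EXPECTED - set(rosnode.get_node_names())
print("all LVI-SAM nodes are up" if not missing else "missing: %s" % sorted(missing))
```
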
