
RFSift

This is the project page for the paper "Conquering Textureless with RF-referenced Monocular Vision for MAV State Estimation".

RFSift is a new state estimator that conquers the textureless challenge with RF-referenced monocular vision. It achieves centimeter-level accuracy in textureless scenes, e.g., dark corridors, white walls, and solid-color floors.

This project page contains:

  • Overview of our proposed method.
  • System implementation and real-world tests.

The proposed method

Our method consists of two components:

  • An RF-sifting algorithm that sifts the best visual features by leveraging 3D UWB measurements.
  • An RF-visual-inertial sensor fusion scheme that achieves robust state estimation by fusing measurements from multiple sensors with complementary advantages.

The system overview is illustrated in the following figure.

System implementation and real-world tests

Implementation

Prerequisites

  • Install Ceres Solver

    cd ~
    git clone https://ceres-solver.googlesource.com/ceres-solver
    sudo apt-get -y install cmake libgoogle-glog-dev libatlas-base-dev libeigen3-dev libsuitesparse-dev
    sudo add-apt-repository ppa:bzindovic/suitesparse-bugfix-1319687
    sudo apt-get update && sudo apt-get install libsuitesparse-dev
    mkdir ceres-bin
    cd ceres-bin
    cmake ../ceres-solver
    make -j3
    sudo make install
  • Install MYNTEYE SDK
    We use the left camera of the MYNT EYE S1040-120/Mono for monocular visual sensing and follow the installation process described in MYNT-EYE-S-SDK (see the sketch after this list).

  • UWB Node Configuration
    We use Nooploop LinkTrack AoA UWB nodes. The UWB AoA measurements need to be calibrated once; please find the calibration software and user manual on the Nooploop website.
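
A rough sketch of building the MYNT-EYE-S-SDK mentioned above, assuming that repository and its default Makefile targets; please follow the official MYNT-EYE-S-SDK guide for your platform:

    cd ~
    git clone https://github.com/slightech/MYNT-EYE-S-SDK.git
    cd MYNT-EYE-S-SDK
    make init       # install build dependencies
    make install    # build and install the SDK (may require sudo depending on the install prefix)
    make ros        # build the ROS wrapper launched later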

Compile the code

Clone the repository and build it with catkin_make:

    cd ~/catkin_ws/src
    git clone https://github.com/weisgroup/RFSift.git
    cd ../
    catkin_make
    source ~/catkin_ws/devel/setup.bash

Note: If this step fails, try a computer with a clean system, or reinstall Ubuntu and ROS.
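
After a successful build, you can quickly check that the package is visible to ROS (the package name vins is assumed from the launch command used below):

    source ~/catkin_ws/devel/setup.bash
    rospack find vins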

Configure and run RFSift

Hardware setup
Connect the MYNT EYE camera and UWB tag to your MAV's onboard computer. Connect the UWB anchor to a charger.
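
As an optional sanity check (assuming the MYNT EYE enumerates as a USB device and the LinkTrack tag as a USB serial device, which is typical for these sensors), you can verify the connections from the onboard computer:

    lsusb              # the MYNT EYE camera should appear in the USB device list
    ls /dev/ttyUSB*    # the UWB tag usually shows up as a USB serial port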

Run MYNT EYE SDK

    cd path/to/MYNT-EYE-S-SDK
    source wrappers/ros/devel/setup.bash
    roslaunch mynt_eye_ros_wrapper vins_fusion.launch
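
To confirm the camera driver is streaming, you can list its topics and check the frame rate in another terminal (the topic name below assumes the wrapper's default naming and may differ in your setup):

    rostopic list | grep mynteye
    rostopic hz /mynteye/left/image_raw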

Run UWB node

    cd ~/catkin_ws
    source devel/setup.bash
    roslaunch nlink_parser linktrack_aoa.launch

You can inspect the topic /nlink_linktrack_aoa_nodeframe0 with rostopic echo in another terminal, as shown below. If everything goes well, UWB measurements will be printed on the screen.
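
For example, in a new terminal:

    rostopic echo /nlink_linktrack_aoa_nodeframe0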

Run RFSift

    cd ~/catkin_ws
    source devel/setup.bash
    roslaunch vins mynteye-s-mono-imu.launch

Initialization is required: move the MAV around at the same height as the UWB anchor. This may take up to a minute. Once initialization finishes, you can view the MAV and the UWB nodes in RViz.
Note: the MAV should move in front of the UWB anchor; otherwise, the AoAs and ranges cannot be measured correctly.
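
To check that the estimator is publishing, you can inspect its output topics in another terminal. Since RFSift is built on VINS-Fusion, the sketch below assumes VINS-Fusion's default topic naming, which may differ in this repository:

    rostopic list | grep vins
    rostopic echo /vins_estimator/odometry    # estimated pose and velocity, if the default topic name is kept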

Customization

You can customize your own experiment by editing the config file /config/mynteye-s/mynt_mono_config.yaml. For example, you can set uwb_optimize: 0 to ignore UWB measurements in the optimization. For a more detailed description, please refer to the comments in the config file.
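
For instance, assuming the repository was cloned to ~/catkin_ws/src/RFSift as above, a vision-only run can be set up like this (the key name comes from the description above; check the file's comments before changing other parameters):

    # show the current setting
    grep uwb_optimize ~/catkin_ws/src/RFSift/config/mynteye-s/mynt_mono_config.yaml
    # disable UWB measurements in the optimization
    sed -i 's/^uwb_optimize: .*/uwb_optimize: 0/' ~/catkin_ws/src/RFSift/config/mynteye-s/mynt_mono_config.yaml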

Run on DJI M100

We use djiros and n3ctrl to test RFSift on the DJI M100. Some of the code is modified to work with the N1 flight controller on the M100. Please refer to those packages if you are interested; a sketch of adding them to the workspace follows.
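
A minimal sketch of adding those packages to the same workspace (the repository URLs are placeholders; use the versions linked from this project or the modified copies mentioned above):

    cd ~/catkin_ws/src
    git clone <djiros-repo-url>
    git clone <n3ctrl-repo-url>
    cd ../
    catkin_make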

Demo video

[Demo video: Sift]

Acknowledgement

This code borrows heavily from VINS-Fusion.

License

The source code is released under the GPLv3 license.

For any technical issues, please contact Sheyang Tang <sheyangtangATgmail.com> or Shengkai Zhang <shengkai.zhangATgmail.com>.
