Jingao Xu, Danyang Li, Zheng Yang, Yishujie Zhao, Hao Cao, Yunhao Liu, Longfei Shangguan. 2023. Taming Event Cameras with Bio-Inspired Architecture and Algorithm: A Case for Drone Obstacle Avoidance. In the 29th Annual International Conference on Mobile Computing and Networking (ACM MobiCom '23).
If you use BioDrone in academic work, please cite:
@inproceedings{xu2023biodrone,
title={Taming Event Cameras with Bio-Inspired Architecture and Algorithm: A Case for Drone Obstacle Avoidance},
author={Xu, Jingao and Li, Danyang and Yang, Zheng and Zhao, Yishujie and Cao, Hao and Liu, Yunhao and Shangguan, Longfei},
booktitle={29th Annual International Conference on Mobile Computing and Networking (ACM MobiCom)},
year={2023}
}
(A brief introduction to event cameras has already been provided in our submission.)
The event camera, a cutting-edge biosensor, operates via a mechanism distinct from that of traditional frame-based cameras. Its pixels resemble the photoreceptor cells of a biological retina, each capable of triggering events independently. Unlike conventional cameras that capture images at fixed intervals, this sensor detects per-pixel luminosity changes asynchronously, producing a continuous stream of events with microsecond resolution.
The working principle of an event camera is illustrated in the above figure (left). Let L(x, y, t) denote the log-intensity at pixel (x, y) at time t. An event e = (x, y, t, p) is triggered as soon as the log-intensity change since the last event at that pixel exceeds a contrast threshold C, i.e., |L(x, y, t) − L(x, y, t_last)| ≥ C, where the polarity p ∈ {+1, −1} encodes whether the brightness increased or decreased.
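To make this triggering rule concrete, here is a minimal, self-contained sketch of the per-pixel model described above. The struct layout, class name, and the 0.2 contrast threshold are illustrative assumptions, not BioDrone's actual implementation.

```cpp
#include <cmath>
#include <cstdint>

// An event: pixel location, timestamp (microseconds), and polarity.
struct Event {
    uint16_t x, y;
    int64_t  t_us;
    int8_t   polarity;  // +1 = brightness increase, -1 = decrease
};

// Per-pixel state: the log-intensity at which the last event fired.
// A new event is emitted whenever the current log-intensity deviates
// from that reference by at least the contrast threshold C.
class PixelModel {
public:
    explicit PixelModel(float initial_log_intensity, float contrast = 0.2f)
        : ref_(initial_log_intensity), C_(contrast) {}

    // Returns true (and fills `out`) if the new intensity triggers an event.
    bool update(uint16_t x, uint16_t y, int64_t t_us, float intensity, Event& out) {
        const float log_i = std::log(intensity);
        const float d_l   = log_i - ref_;
        if (std::fabs(d_l) < C_) return false;   // below threshold: no event
        ref_ = log_i;                            // reset reference at event time
        out  = Event{x, y, t_us, static_cast<int8_t>(d_l > 0 ? 1 : -1)};
        return true;
    }

private:
    float ref_;  // log-intensity at the last emitted event
    float C_;    // contrast threshold
};
```

Because each pixel keeps only its own reference value, events fire asynchronously and independently across the array, which is what yields the microsecond-resolution stream described above.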
The above figure (right) compares an event camera with a frame-based camera. The frame-based camera captures frames at a fixed rate, resulting in apparent motion blur, whereas the event camera operates continuously, producing a spatiotemporal spiral of brightness polarity changes.
The above figure presents the implementation of BioDrone on a drone platform for obstacle avoidance. The arrows indicate data flow.
We deploy BioDrone on an AMOVLAB P450-NX drone testbed. The drone is equipped with two on-board computational units:
- a Qualcomm Snapdragon Flight, which performs monocular visual-inertial odometry (VIO) based pose estimation using the provided Machine Vision SDK and received GPS signals.
- a Xilinx Zynq-7020 chip or an Nvidia Jetson TX2 (with an AUVIDEA J90 carrier board), which runs our obstacle detection and localization software stack as well as the avoidance command planning task.
The output of BioDrone is low-level obstacle orientation and distance information, based on which an ArduPilot Mega 2.6 (APM) flight controller calculates single-rotor commands and feeds them to the motor controllers.
The drone is equipped with two front-facing DAVIS-346 event cameras, in a horizontal stereo setup, connected via USB 3.0 micro to the Jetson TX2.
The DAVIS-346 sensor provides both frames and events and has a 346 × 260 pixel resolution.
We develop the software stack in C++ and use the Robot Operating System (ROS Melodic) for communication among modules. We use the open-source event camera driver to stream events from the cameras to the Jetson TX2, and the open-source avoidance algorithm to plan avoidance commands based on the obstacle's location. To reduce latency, we implement the obstacle localization and avoidance algorithms within the same ROS module, so no message exchange is necessary between the camera drivers and the position controller. The module outputs a velocity command, which is fed to the APM flight controller; the APM translates it into low-level motor commands using a PID-like algorithm and communicates with the electronic speed controllers to generate the individual rotor thrusts.
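As a rough illustration of this single-module design, the sketch below wires the event subscriptions and the velocity publisher into one roscpp node. The class name, queue sizes, and the /biodrone/cmd_vel output topic are assumptions, and the actual localization and planning logic is elided.

```cpp
#include <ros/ros.h>
#include <dvs_msgs/EventArray.h>    // from rpg_dvs_ros
#include <geometry_msgs/Twist.h>

// Hypothetical single-node layout: obstacle localization and avoidance
// planning live in one process, so no inter-node messages are exchanged
// between the camera driver callbacks and the velocity output.
class BioDroneNode {
public:
    explicit BioDroneNode(ros::NodeHandle& nh) {
        left_sub_  = nh.subscribe("/davis/left/events",  10, &BioDroneNode::onLeft,  this);
        right_sub_ = nh.subscribe("/davis/right/events", 10, &BioDroneNode::onRight, this);
        cmd_pub_   = nh.advertise<geometry_msgs::Twist>("/biodrone/cmd_vel", 1);  // assumed topic
    }

private:
    void onLeft(const dvs_msgs::EventArray::ConstPtr& msg)  { process(*msg); }
    void onRight(const dvs_msgs::EventArray::ConstPtr& msg) { process(*msg); }

    void process(const dvs_msgs::EventArray& msg) {
        for (const dvs_msgs::Event& e : msg.events) {
            (void)e;  // e.x, e.y, e.ts, e.polarity feed the localization logic (elided)
        }
        // Localization and avoidance planning happen here, in the same
        // module, before a single velocity command goes out to the APM.
        geometry_msgs::Twist cmd;
        cmd_pub_.publish(cmd);
    }

    ros::Subscriber left_sub_, right_sub_;
    ros::Publisher  cmd_pub_;
};

int main(int argc, char** argv) {
    ros::init(argc, argv, "biodrone");
    ros::NodeHandle nh;
    BioDroneNode node(nh);
    ros::spin();
    return 0;
}
```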
We have tested BioDrone under Ubuntu 18.04 with ROS Melodic, using a DAVIS346 camera. We recommend a reasonably powerful computer to ensure real-time performance.
- ROS: runtime platform for BioDrone
- rpg_dvs_ros: camera driver and utilities
- Eigen3: camera-related transformation and computation
- OpenCV (3.4.16 tested)
- Prepare a catkin workspace
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
catkin_init_workspace
- Clone the repository into the `src` folder of the workspace
git clone https://github.com/Event4Drone/BioDrone
- Build the project
cd ~/catkin_ws
catkin_make
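Before running BioDrone, source the workspace in each new shell so ROS can locate the package (a standard catkin step):
source ~/catkin_ws/devel/setup.bash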
BioDrone requires your own camera intrinsics and transformation matrices to be configured.
- First, locate the folder containing `CameraConfig.h` and open the file:
cd BioDrone/include/BioDrone
vim CameraConfig.h
- Configure your camera. Locate the following constants in the config file and replace them with your camera setup (a hypothetical filled-in example is sketched after this list):
  - `const Eigen::Matrix3f Cam::K`: the camera intrinsic matrix.
  - `const Eigen::Matrix3f Cam::T_ci`: the transformation matrix from the IMU reference frame to the camera reference frame.
  - `const double Cam::baseline`: the baseline distance of the stereo camera pair.
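For illustration, a filled-in configuration might look like the following. All numeric values are hypothetical placeholders, and the exact declaration style in `CameraConfig.h` may differ; substitute the results of your own calibration.

```cpp
#include <Eigen/Dense>

// Hypothetical declarations mirroring CameraConfig.h.
struct Cam {
    static const Eigen::Matrix3f K;        // camera intrinsic matrix
    static const Eigen::Matrix3f T_ci;     // IMU-to-camera transformation
    static const double baseline;          // stereo baseline (meters)
};

// Placeholder intrinsics: focal lengths fx, fy in pixels and a principal
// point (cx, cy) near the center of a 346 x 260 image.
const Eigen::Matrix3f Cam::K = (Eigen::Matrix3f() <<
    335.0f,   0.0f, 173.0f,
      0.0f, 335.0f, 130.0f,
      0.0f,   0.0f,   1.0f).finished();

// Identity if the IMU axes happen to be aligned with the camera axes.
const Eigen::Matrix3f Cam::T_ci = Eigen::Matrix3f::Identity();

// Distance between the two camera optical centers, in meters.
const double Cam::baseline = 0.10;
```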
- Recompile the code to apply the configuration.
cd ~/catkin_ws
catkin_make
BioDrone subscribes to the following ROS topics to receive data; each topic is published at 1,000 Hz:
- `/davis/left/events`: the event stream from the left camera.
- `/davis/right/events`: the event stream from the right camera.
- `/davis/left/imu`: the IMU data stream from the left camera.
- `/davis/right/imu`: the IMU data stream from the right camera.
You can use ROS utilities (e.g., rosbag record) to record a dataset containing the above data.
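For example, the following standard rosbag invocation records all four topics into a single bag file (the output filename is arbitrary):
rosbag record -O your_bag_file.bag /davis/left/events /davis/right/events /davis/left/imu /davis/right/imu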
If the topic names in the recorded dataset do not match the above names, or the publish rate is not 1,000 Hz, you can correct them manually by following this documentation.
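Alternatively, mismatched topic names can be fixed at playback time with standard ROS remapping arguments, without editing the bag itself; the left-hand topic below is a placeholder for whatever name your recording used:
rosbag play your_bag_file.bag /dvs/events:=/davis/left/events
Note that remapping only renames topics; a wrong publish rate still requires reprocessing the bag.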
BioDrone currently supports dataset playback on the ROS platform.
- Check and verify dataset information.
rosbag info your_bag_file.bag
Executing the above command displays the timestamps, topics, and number of messages in the recorded data, so you can verify that they conform to the above requirements. Also note down the start timestamp of the bag file; it will be needed as an argument later.
- Start ROS core and BioDrone.
roscore
rosrun BioDrone main TIMESTAMP
Replace `TIMESTAMP` with the number you just noted down. You may need to run the two commands above in separate shells.
- Dataset playback. In another shell, execute:
rosbag play your_bag_file.bag
This causes BioDrone to receive data from the bag file and start computation. The localization results are printed to the shell.