Experience the seamless integration of cooperative perception and realistic communication models in autonomous driving simulation.
Looking for an advanced autonomous driving platform that seamlessly simulates cooperative perception and realistic communication models? Choose EI-Drive!
- Integrating cooperative perception with realistic communication models, EI-Drive allows exploration of how communication latency and errors affect not only cooperative perception but also the overall performance of autonomous vehicles.
- A streamlined implementation with an AD pipeline and built-in scenarios. Highly customizable components allow you to tailor your experiments.
- Clear visualization makes the results highly visible and easy to interpret.
An open-source platform that provides joint simulation of realistic communication models and cooperative perception, aimed at safe and efficient cooperative driving automation.
- 🌞 AD pipeline: An AD pipeline encompassing environment, sensing, perception, planning, and control.
- 📸 Cooperative perception: Flexible cooperative perception with customizable agents, methods, tasks, and visualization.
- 📡 Realistic communication models: Simulate the key characteristics of data communication between edge agents, communication latency and errors, interacting seamlessly with the perception module.
Documentation: EI-Drive API Documents (coming soon)
Looking for more technical details? Check our report here! Paper link
EI-Drive has various built-in scenarios tailored for cooperative perception experiments, where the spectator vehicles and RSUs share perception information with the ego vehicle. Multiple cooperative perception tasks, including collision avoidance and traffic flow detection, enable extensive research with different goals.
The perception results have a significant influence on the ego vehicle's behavior. The ego vehicle benefits from cooperative perception with fewer blind spots (rows 1 & 2) and a wider detection range (row 3).
| Ego vehicle | ➕ | Spectator vehicle | ➡️ | Cooperative perception |
|---|---|---|---|---|
| Ego vehicle | ➕ | RSU | ➡️ | Cooperative perception |
| Ego vehicle | ➕ | RSU | ➡️ | Cooperative perception |
EI-Drive makes it simple to apply communication latency and errors to any perception process, enabling research across both communication and autonomous driving. Communication latency and errors not only impair the performance of cooperative perception, but also negatively influence the behavior of the ego vehicle.
| Cooperative perception ✅ | Cooperative perception + communication errors ❌ | Cooperative perception + communication latency ❌ |
|---|---|---|
EI-Drive supports multiple perception methods and multi-modal sensor inputs, greatly enriching the experiment settings.
| Multi-modal inputs | Multiple perception methods |
|---|---|
First, we need to install CARLA. Download the CARLA 0.9.14 release, as our experiments used this version.
Set the following environment variables:
```shell
export CARLA_ROOT=/path/to/carla
export PYTHONPATH="$CARLA_ROOT/PythonAPI/carla/":"$CARLA_ROOT/PythonAPI/carla/dist/carla-0.9.14-py3.7-linux-x86_64.egg":${PYTHONPATH}
```
To verify that CARLA has been correctly installed, run the command:
```shell
cd carla/
./CarlaUE4.sh
```
Second, set up the environment for EI-Drive. Clone the repository:

```shell
git clone https://github.com/ucd-dare/EI-Drive
cd EI-Drive
```
Create the EI-Drive environment using conda:
```shell
conda env create -f environment.yml
conda activate EI-Drive
cd EI-Drive
python setup.py develop
.setup.sh
```
To run EI-Drive, make sure CARLA is running at the same time. You may use two terminals:

```shell
cd carla/
./CarlaUE4.sh
```

```shell
cd EI-Drive/
python EI_Drive.py
```
To run a specific scenario, use
```shell
python EI_Drive.py test_scenario=coop_perception_1
```
The command runs the script `coop_perception_1.py`, following the configuration file `scenario_testing/config_yaml/config.yaml`. The config files in EI-Drive are structured hierarchically:
```
config.yaml/
├── test_scenario/ (Designate the scenario)
│   ├── common_params
│   ├── vehicle_perception (Perception method)
│   ├── vehicle_localization (Localization method)
│   ├── game_map
│   ├── behavior
│   ├── controller
│   ├── traffic_manager
│   └── scenario
└── world/ (Designate the weather)
    ├── sunny.yaml
    └── ...
```
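The `test_scenario=...` override syntax used in the commands above suggests a Hydra-style configuration. Under that assumption, the top-level `config.yaml` would compose its sub-configs through a defaults list; the exact keys here are illustrative, not taken from the repository:

```yaml
# Illustrative sketch only; the real config.yaml may differ.
defaults:
  - test_scenario: coop_perception_1   # which scenario config to load
  - world: sunny                       # which weather config to load
```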
The default perception method is oracle, shown with blue bounding boxes. To enable object detection by YOLOv5, run:

```shell
python EI_Drive.py test_scenario=coop_perception_1 test_scenario.vehicle_perception.perception.activate=true test_scenario.vehicle_perception.perception.model=yolo
```
To simplify the usage of lengthy commands, we have packaged common configurations as modules. It is recommended to utilize these modules in the configuration file for a specific scenario. For instance, to achieve the same outcome as the command mentioned above, you can set `vehicle_perception: perception_true` in the config file `coop_perception_1.yaml`, where the config module `perception_true.yaml` is applied.
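As a sketch, the relevant line in `coop_perception_1.yaml` would look like this (only the `vehicle_perception` key comes from the text above; any surrounding keys are omitted):

```yaml
# In test_scenario/coop_perception_1.yaml:
# select the packaged perception module instead of
# passing overrides on the command line.
vehicle_perception: perception_true
```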
To enable cooperative perception, open the config file `test_scenario/scenario/coop_perception_1.yaml`, which defines the details of the scenario. Set `coop_perception: true` for the ego vehicle with `id=0` and for the participant (the RSU in this scenario) with `id=-1`. To disable it, set `coop_perception: false` for the ego vehicle.
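A minimal sketch of the relevant entries, assuming the scenario file lists its participants by `id` (the list structure is illustrative; the `coop_perception` flags and ids follow the text above):

```yaml
# Sketch of test_scenario/scenario/coop_perception_1.yaml (structure assumed)
scenario:
  - id: 0                  # ego vehicle
    coop_perception: true
  - id: -1                 # RSU sharing its perception with the ego vehicle
    coop_perception: true
```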
Then run the simulation:

```shell
python EI_Drive.py test_scenario=coop_perception_1
```
💡 Please note that this config file is different from the file `test_scenario/coop_perception_1.yaml` mentioned above, even though they share the same name.
To enable latency and errors, open the config file `test_scenario/scenario/coop_perception_1.yaml`, which defines the details of the scenario. Set `transmission_latency: true` and `errors: true` for the ego vehicle with `id=0` and for the participant (the RSU in this scenario) with `id=-1`.
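A minimal sketch of the combined flags, assuming the scenario file lists its participants by `id` (flag names and ids follow the text; the nesting is illustrative):

```yaml
# Sketch: latency and errors on top of cooperative perception (structure assumed)
scenario:
  - id: 0                      # ego vehicle
    coop_perception: true      # required: latency/errors act on communicated data
    transmission_latency: true
    errors: true
  - id: -1                     # RSU
    coop_perception: true
    transmission_latency: true
    errors: true
```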
Then run the simulation:

```shell
python EI_Drive.py test_scenario=coop_perception_1
```
💡 Please ensure `coop_perception: true` has been set for both the RSU and the ego vehicle, since latency and errors only take effect when data communication exists.
If you find this repository useful, please cite this paper:
```
@article{
}
```
Special thanks to the community for your valuable contributions and support in making EI-Drive better for everyone!
- Hanchu Zhou
- GaoDechen