# Introduction

This repository describes and presents the datasets associated with our paper "The Autonomous Platform Inertial Dataset".\
**Paper**: Available soon.\
**Datasets**: Available soon. \
**Video**: Available soon.

# The Autonomous Platform Inertial Dataset
One of the critical tasks required for fully autonomous functionality is the ability to achieve an accurate navigation solution; that is, to determine the platform's position, velocity, and orientation. Various sensors, depending on the vehicle environment (air, sea, or land), are employed to achieve this goal. In parallel to the development of novel navigation and sensor fusion algorithms, machine learning-based algorithms are penetrating the navigation and sensor fusion fields. An excellent example of this trend is pedestrian dead reckoning, used for indoor navigation, where both classical and machine learning approaches are applied to improve navigation accuracy. To facilitate the derivation and validation of machine learning algorithms for autonomous platforms, a large quantity of recorded sensor data is needed.\
Unfortunately, in many situations, such datasets are not easy to collect or are not publicly available.\
To advance the development of accurate autonomous navigation, we present the autonomous platform inertial dataset. It contains inertial sensor raw data and corresponding ground truth trajectories. The dataset was collected using a variety of platforms, including a quadrotor, two autonomous underwater vehicles, a land vehicle, a remote-controlled electric car, and a boat. A total of 805.5 minutes of recordings were made using different types of inertial sensors, global navigation satellite system receivers, and Doppler velocity logs.
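
As a minimal sketch of how the raw inertial recordings and ground truth trajectories might be used together, the snippet below loads a single recording and runs a naive dead-reckoning integration. The file names, column layout, and units are assumptions for illustration only; they are not the documented dataset format.

```python
import numpy as np
import pandas as pd

# Hypothetical layout -- the actual file names and columns are not documented
# in this README, so everything below is illustrative only.
imu = pd.read_csv("quadrotor/imu_raw.csv")       # assumed columns: t [s], ax, ay, az [m/s^2]
gt = pd.read_csv("quadrotor/ground_truth.csv")   # assumed columns: t [s], x, y, z [m]

t = imu["t"].to_numpy()
acc = imu[["ax", "ay", "az"]].to_numpy()
dt = np.diff(t)

# Naive dead reckoning: double-integrate the accelerometer readings while
# ignoring orientation, gravity compensation, and sensor biases -- enough to
# illustrate how quickly an inertial-only solution drifts from the ground truth.
vel = np.zeros_like(acc)
pos = np.zeros_like(acc)
for k in range(1, len(acc)):
    vel[k] = vel[k - 1] + acc[k - 1] * dt[k - 1]
    pos[k] = pos[k - 1] + vel[k - 1] * dt[k - 1]

print("dead-reckoned final position:", pos[-1])
print("ground-truth final position: ", gt[["x", "y", "z"]].to_numpy()[-1])
```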

![plot](./images/ship3.JPG)
# Our Platforms
![Picture1](https://user-images.githubusercontent.com/93155156/143598729-49d08d5b-3712-4dd3-bdfe-7eb3508fc83c.png)
Snapir - autonomous underwater vehicle\
Alice - autonomous underwater vehicle\
Suzuki Jimny - land vehicle\
Shikmona - marine vessel\
DJI Matrice 300 RTK - quadrotor\
Electric 4WD climbing car - remote-controlled car

# Summary of Collected Datasets
![plot](./images/table.JPG)

# The Autonomous Navigation and Sensor Fusion Lab
The Autonomous Navigation and Sensor Fusion Lab (ANSFL) researches questions and challenges in the fields of autonomous navigation, inertial navigation systems, and estimation theory, as well as in related fields. Our research goals are to devise innovative inertial and sensor fusion algorithms, to develop novel inertial navigation system architectures, and to pioneer deep-learning-based navigation approaches.\
**Lab website**: http://marsci.haifa.ac.il/labs/ansfl/
![Picture1](https://user-images.githubusercontent.com/93155156/143600162-787b7824-a863-46e2-ac19-ad6292a7c006.png)
