
Smart Tracking Camera using YOLOv5

Data Scientist | Anass MAJJI


🧐 Description

  • In this project, we built a smart human tracking camera using an Arduino board and the YOLOv5 model.

🚀 Repository Structure

The repository contains the following files & directories:

  • .github/workflows: contains the .yml file that defines the instructions for our automated tests and deployment process.

  • images: contains all the images used in the README file.

  • src: this folder contains:

    • app: code of the FastAPI webapp.
    • test: the different unit tests.
    • yolov5: the deep learning model used for human detection.
  • requirements.txt: all the packages used in this project.

📈 Demonstration

In this section, we walk through building and deploying a real-time human detection and tracking system using the YOLOv5 model and an Arduino UNO board. The project can be split into two parts:

1. Software section:

1.1 FastAPI webapp:

Before deploying the model on the Arduino board, we built a FastAPI webapp using HTML, CSS and JS. For the client/server connection, we used the WebSocket protocol to send YOLOv5's output as a real-time video stream; a sketch of such a streaming endpoint is shown after the second option. The home page of the webapp is shown below. As we can see, there are two main options:

First option:

It consists in detecting humans in images. The user uploads an image ("Click to upload" button) and then clicks "Analyze" to get the output of the YOLOv5 model. Once the output image is generated, the user can download it by clicking "Download".

Below is an example of an input image and the image generated by the YOLOv5 model.
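
To make this concrete, here is a minimal sketch of the flow using YOLOv5's Python API; the model weights, file names and save directory are illustrative, not the webapp's actual paths:

```python
import torch

# Hypothetical model load via torch.hub; the repository vendors YOLOv5 under src/yolov5
model = torch.hub.load("ultralytics/yolov5", "yolov5s")
model.classes = [0]                  # COCO class 0 = "person": keep only human detections

results = model("input.jpg")         # run inference on the uploaded image
results.save(save_dir="output")      # write the annotated image to disk for download
print(results.pandas().xyxy[0])      # bounding boxes as a pandas DataFrame
```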

Second option:

With the second option, we use the YOLOv5 model to detect and track humans with a camera. The video stream starts after clicking the "Start" button. Here we have two choices: we can use either the built-in webcam or an external USB camera.

The video stream is stopped by clicking the "Stop" button, or the "Exit Webcam" button to shut down the WebSocket connection.
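
Below is a minimal sketch of such a WebSocket streaming endpoint, assuming YOLOv5 is loaded via torch.hub and annotated frames are sent to the browser as JPEG bytes; the route, model weights and camera index are illustrative, not the repository's exact code:

```python
import cv2
import torch
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()
model = torch.hub.load("ultralytics/yolov5", "yolov5s")  # pretrained weights (assumption)
model.classes = [0]  # COCO class 0 = "person": keep only human detections

@app.websocket("/ws")
async def video_stream(websocket: WebSocket):
    await websocket.accept()
    # 0 is usually the built-in webcam; 1 (or higher) an external USB camera
    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            results = model(frame[..., ::-1])  # OpenCV is BGR, YOLOv5 expects RGB
            annotated = cv2.cvtColor(results.render()[0], cv2.COLOR_RGB2BGR)
            _, jpeg = cv2.imencode(".jpg", annotated)
            await websocket.send_bytes(jpeg.tobytes())  # stream the frame to the browser
    except WebSocketDisconnect:
        pass  # "Exit Webcam" on the client closes the connection
    finally:
        cap.release()
```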

1.2 Deployment using CI/CD:

  • CI/CD: Finally, to deploy the project we use CI/CD, one of the best practices in software development, as it automates the repetitive process of running unit tests and deploying the software. To that end, the src/test_app.py script tests each function of the FastAPI webapp. All of this is achieved with GitHub Actions: to create the CI/CD workflow, we add a .yml file under .github/workflows containing the instructions for our automated tests and deployment process.
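
As an illustration, a test in the spirit of src/test_app.py might use FastAPI's TestClient; the import path and routes below are assumptions rather than the repository's exact code:

```python
from fastapi.testclient import TestClient
from app.main import app  # hypothetical import path for the FastAPI app

client = TestClient(app)

def test_home_page():
    # The home page should be served successfully
    response = client.get("/")
    assert response.status_code == 200

def test_analyze_image():
    # Uploading an image should return an annotated result
    with open("sample.jpg", "rb") as f:
        response = client.post("/analyze", files={"file": ("sample.jpg", f, "image/jpeg")})
    assert response.status_code == 200
```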

2. Hardware section:

2.1 Build a human tracking camera from scratch:

We deploy the YOLOv5 model using an Arduino UNO board. For that, we need:

  • 2 servo motors: used for vertical and horizontal rotation, each with a 120° rotation range.

  • Arduino UNO board: a microcontroller board mainly used to interact with and control electronic components through Arduino code.

  • 1080p USB camera: running at 30 FPS (frames per second).

  • Connecting cables.

  • Camera shell.

  • To control the 2 servo motors with the Arduino board, we first need to download and install the Arduino IDE and then upload the Arduino code to the Arduino UNO board (you can find my code in src/arduino/arduino_2_motors.ino).

Once done, we set up the configuration (shown below) to connect all the components mentioned above to the laptop; a minimal sketch of how the laptop can then drive the servos over serial follows.
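
As an illustration, the laptop side might send pan/tilt angles over the serial port, assuming the Arduino sketch in src/arduino/arduino_2_motors.ino reads "pan,tilt" angle pairs; the serial protocol, port name and baud rate below are assumptions:

```python
import serial  # pyserial

# Port name varies by OS: e.g. /dev/ttyACM0 or /dev/ttyUSB0 on Linux, COM3 on Windows
arduino = serial.Serial("/dev/ttyACM0", baudrate=9600, timeout=1)

def set_angles(pan: int, tilt: int) -> None:
    """Send a pan/tilt angle pair to the Arduino driving the two servos."""
    pan = max(0, min(120, pan))    # clamp to the servos' 120-degree range
    tilt = max(0, min(120, tilt))
    arduino.write(f"{pan},{tilt}\n".encode())

set_angles(60, 60)  # roughly center both servos
```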

2.2 Vertical & horizontal rotations:

In order to track a human with our camera, we need to define some rules based on the person's position so that the camera can turn vertically and horizontally. Let's assume that the person is to the left of the camera and that the box around them (in red below) has coordinates ((X1, Y1), (X2, Y2)). We call it the "main_box":

We then compute a centred version of this box, which we call the "centred_box", with coordinates ((X1_center, Y1_center), (X2_center, Y2_center)), as shown in the figure below:

For each vertical and horizontal rotation, we therefore have 3 cases, depending on the position of the "main_box" relative to its "centred_box":

As the formulas show, we defined a 10% tolerance along the X and Y axes to better stabilize the camera.
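
Since the formulas themselves appear as images, here is a hedged Python sketch of the horizontal rule with the 10% tolerance (the vertical rule is symmetric along Y); the box layout and function names are illustrative:

```python
TOL = 0.10  # 10% tolerance along each axis to stabilize the camera

def horizontal_step(main_box, centred_box, frame_width):
    """Return -1 (rotate left), +1 (rotate right) or 0 (hold still)."""
    x1, _, x2, _ = main_box         # (X1, Y1, X2, Y2) of the detected person
    cx1, _, cx2, _ = centred_box    # (X1_center, Y1_center, X2_center, Y2_center)
    margin = TOL * frame_width
    if x1 < cx1 - margin:           # person sticks out to the left of the centred box
        return -1
    if x2 > cx2 + margin:           # person sticks out to the right of the centred box
        return +1
    return 0                        # within tolerance: the camera stays still
```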

📈 Performance & results


📪 Contact

For any information, feedback or questions, please contact me.
