Merge pull request stereolabs#553 from stereolabs/update-sample-readmes
Update sample readmes
mattrouss authored Apr 24, 2023
2 parents 8d26fda + 901cda7 commit 2e0bb8c
Showing 36 changed files with 238 additions and 166 deletions.
40 changes: 11 additions & 29 deletions body tracking/README.md
@@ -1,37 +1,19 @@
-# ZED SDK Body Tracking Samples
+# Body Tracking Samples
 
-This repository contains samples demonstrating how to use the ZED stereoscopic camera's body tracking features using the ZED SDK. The samples show how to fuse the data from several cameras, export the data in JSON format, and how to integrate the SDK with external tools like Unity and Unreal.
+This repository contains samples demonstrating how to use the [ZED](https://www.stereolabs.com/store/) camera's **Body Tracking** features using the ZED SDK. You can find additional information on the Body Tracking module in our [Documentation](https://www.stereolabs.com/docs/body-tracking/) and [API Reference](https://www.stereolabs.com/docs/api/group__Body__group.html).
 
-## Requirements
-
-ZED Camera
-ZED SDK
-CUDA 10.2 or later
-Python 3.6 or later (for Python samples)
-C++11 or later (for C++ samples)
-Unity or Unreal Engine (for integration samples)
-
-## Installation
-
-Install the ZED SDK following the instructions on the official website.
-Clone this repository to your local machine.
-For Python samples, install the required Python packages by running pip install -r requirements.txt in the root directory of the repository.
-Build the C++ samples using the CMake build system. Refer to the individual sample README files for build instructions.
-
-## Samples
-### Overview
-This samples demonstrates how to build simple body tracking app. It provide :
-- Single camera body tracking
-- Whole hand **fingertracking**
-- A simple 3D display
-
-### Fusion
-This sample demonstrate how to fuse the Body tracking data from several camera to track an entire space with a much higher quality. That can be done on one single machine, or on a network.
-The camera must be calibrated first with ZED 360.
-If your camera are distributed over a local network, you'll need to use ZED Hub. [Subscribe for free.](https://hub.stereolabs.com)
-
-### Export
-This sample shows you how to export the data into a JSON format. you can adapt the code to fit your needs.
-
-### Integrations
-This folder contains links to other repositories that provide Body Tracking integrations examples and tutorials with Unreal, Unity, Livelink.
+<p align="center">
+  <img src="https://user-images.githubusercontent.com/32394882/230631989-24dd2b58-2c85-451b-a4ed-558d74d1b922.gif" />
+</p>
+
+## Overview
+
+This section contains the following code samples:
+
+- [Body Tracking](./body%20tracking/): This sample shows how to use the Body Tracking module, using a single camera and a simple 3D display.
+
+- [Tracking Data Export](./export/): This sample shows how to export **human body tracking data** into a JSON format. You can adapt the code to fit your needs.
+
+- [Integrations](./integrations) This folder contains links to other repositories that provide Body Tracking **integration examples** and tutorials with Unreal Engine 5, Unity, and Livelink.
+
+- [Multi Camera Fusion](./multi-camera): This sample demonstrates how to use the ZED SDK **Fusion API** to track people in an entire space, with data from multiple cameras which produces higher quality results than using a single camera. The sample goes through the full process of setting up your cameras, calibrating your system with ZED360, fusing and visualizing the data.
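The Tracking Data Export sample described above writes body tracking data to JSON. A minimal, SDK-free sketch of that idea (the field names and layout here are invented for illustration, not the sample's actual schema):

```python
import json

def bodies_to_json(frame_index, bodies):
    """Serialize a list of tracked bodies (id + 3D keypoints) to a JSON string.

    `bodies` is a list of (body_id, keypoints) pairs; each keypoint is an
    (x, y, z) tuple. The schema is illustrative only.
    """
    record = {
        "frame": frame_index,
        "bodies": [
            {
                "id": body_id,
                "keypoints_3d": [list(pt) for pt in keypoints],
            }
            for body_id, keypoints in bodies
        ],
    }
    return json.dumps(record)

# Example: one body (id 7) with two keypoints, coordinates in millimeters.
doc = bodies_to_json(0, [(7, [(0.0, 1700.0, 250.0), (0.0, 1550.0, 260.0)])])
```

One record per grabbed frame can then be appended to a file, which is roughly the structure a downstream tool would parse back.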
4 changes: 2 additions & 2 deletions body tracking/body tracking/cpp/src/main.cpp
@@ -77,9 +77,9 @@ int main(int argc, char **argv) {
 
     // Enable the Body tracking module
     BodyTrackingParameters body_tracker_params;
-    body_tracker_params.enable_tracking = false; // track people across images flow
+    body_tracker_params.enable_tracking = true; // track people across images flow
     body_tracker_params.enable_body_fitting = false; // smooth skeletons moves
-    body_tracker_params.body_format = sl::BODY_FORMAT::BODY_38;
+    body_tracker_params.body_format = sl::BODY_FORMAT::BODY_18;
     body_tracker_params.detection_model = isJetson ? BODY_TRACKING_MODEL::HUMAN_BODY_FAST : BODY_TRACKING_MODEL::HUMAN_BODY_ACCURATE;
     //body_tracker_params.allow_reduced_precision_inference = true;
2 changes: 1 addition & 1 deletion body tracking/body tracking/csharp/CMakeLists.txt
@@ -38,7 +38,7 @@ set_property(TARGET ${PROJECT_NAME} PROPERTY VS_DOTNET_REFERENCES
 
 set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
-    "Stereolabs.zed_3.7.*"
+    "Stereolabs.zed_4.*"
     "OpenGL.Net_0.8.4"
     "OpenGL.Net.CoreUI_0.8.4"
     "OpenCvSharp4.Windows_4.5.0.20201013"
25 changes: 10 additions & 15 deletions body tracking/integrations/README.md
@@ -1,19 +1,14 @@
-## Using body tracking with external softwares
+# Using Body Tracking with external software
 
-We provide integrations of most ZED Features, and especially Body Tracking, for several tools/
+The **Body Tracking module** of the ZED SDK provides real-time human body tracking capabilities, allowing developers to create immersive applications that can track and interact with human body movements in 3D space. The ZED SDK provides integrations for popular game engines such as **Unity** and **Unreal Engine 5**, as well as additional integrations with **Unity using LiveLink** and **Unreal Engine with LiveLink**.
 
-### Unity
-The unity integration can be found [in this repository](https://github.com/stereolabs/zed-unity)
-It allows you to use all ZED Features into Unity. Body tracking samples and tutorials are provided.
+Each integration is available in its own repository, containing samples and tutorials to get started integrating the ZED SDK into your new or existing project.
 
-### UnrealEngine
-The Unreal Engine integration can be found [in this repository](https://github.com/stereolabs/zed-unreal-plugin)
-It allows you to use all ZED Features into Unreal Engine. Body tracking samples and tutorials are provided.
+## Integrations Github Repositories
 
-### Unreal LiveLink
-The LiveLink integration can be found [in this repository](https://github.com/stereolabs/zed-livelink-plugin)
-It allows you to stream Body Tracking data from the ZED into LiveLink. You can then retrieve the data in UnrealEngine.
-
-### Unity Livelink
-The Unity LiveLink integration can be found [in this repository](https://github.com/stereolabs/zed-unity-livelink)
-It allows you to stream Body Tracking data from the ZED into LiveLink. You can then retrieve the data in Unity.
+| Integration | Description |
+| :------: | ------ |
+| [Unity](https://github.com/stereolabs/zed-unity) | Seamless integration of body tracking capabilities into Unity projects for animation and interactive experiences |
+| [Unity with Livelink](https://github.com/stereolabs/zed-unity-livelink) | Real-time streaming of body tracking data directly into Unity for animation and interactive experiences |
+| [Unreal Engine 5](https://github.com/stereolabs/zed-UE5) | Incorporate ZED SDK real-time human body tracking data into Unreal Engine 5 seamlessly |
+| [Unreal Engine 5 with Livelink](https://github.com/stereolabs/zed-UE5) | Real-time streaming of body tracking data directly into Unreal Engine 5 for animation and interactive experiences |
36 changes: 21 additions & 15 deletions camera control/README.md
@@ -1,19 +1,25 @@
 # ZED SDK - Camera Control
 
-## This sample shows how to capture images with the ZED SDK and adjust camera settings.
+This sample shows how to capture images with the ZED SDK video module and adjust camera settings. You can find additional information on the Video module in our [Documentation](https://www.stereolabs.com/docs/video/camera-controls/) and [API Reference](https://www.stereolabs.com/docs/api/group__Video__group.html).
 
-### Features
-- Camera images are displayed on an OpenCV windows
+<p align="center">
+  <img src="https://user-images.githubusercontent.com/32394882/230602616-6b57c351-09c4-4aba-bdec-842afcc3b2ea.gif" />
+</p>
 
-The following parameters can be changed:
-- Brightness
-- Contrast
-- Saturation
-- Hue
-- Sharpness
-- Gamma
-- Gain
-- Exposure
-- Automatic GAIN_EXP behavior
-- White Balance
-- LED state
+## Overview
+
+This repository demonstrates how to change the **ZED camera's video settings** and display the camera's image using OpenCV windows. The sample provides examples of how to change the following settings:
+
+- Brightness
+- Contrast
+- Saturation
+- Hue
+- Sharpness
+- Gamma
+- Gain
+- Exposure
+- Automatic GAIN_EXP behavior
+- White balance
+- LED state
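The settings listed in the updated README share a common pattern: each setting has a valid range, and the automatic GAIN_EXP mode supersedes manual gain/exposure until a manual value is set. A minimal sketch of that behavior, with illustrative ranges (not the SDK's actual limits or API):

```python
# Hypothetical ranges for a few of the listed settings; the real limits
# come from the SDK, these values are for illustration only.
SETTINGS_RANGES = {
    "BRIGHTNESS": (0, 8),
    "CONTRAST": (0, 8),
    "GAIN": (0, 100),
    "EXPOSURE": (0, 100),
}

class CameraSettingsSketch:
    """Toy model of the video-settings behavior the sample exposes."""

    def __init__(self):
        # Start every setting at the low end of its range.
        self.values = {name: lo for name, (lo, _) in SETTINGS_RANGES.items()}
        self.aec_agc = True  # automatic gain/exposure on by default

    def set_setting(self, name, value):
        # Clamp the requested value into the setting's valid range.
        lo, hi = SETTINGS_RANGES[name]
        clamped = max(lo, min(hi, value))
        self.values[name] = clamped
        if name in ("GAIN", "EXPOSURE"):
            # Setting a manual gain/exposure value switches auto mode off.
            self.aec_agc = False
        return clamped

cam = CameraSettingsSketch()
cam.set_setting("GAIN", 150)  # out-of-range request gets clamped to 100
```

The clamp-and-override shape is the point here; the actual sample drives the equivalent behavior through the SDK's setting calls.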
2 changes: 1 addition & 1 deletion camera control/csharp/CMakeLists.txt
@@ -29,6 +29,6 @@ set_property(TARGET ${PROJECT_NAME} PROPERTY VS_DOTNET_REFERENCES
 
 set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
-    "Stereolabs.zed_3.7.*"
+    "Stereolabs.zed_4.*"
     "OpenCvSharp4.Windows_4.5.0.20201013"
 )
14 changes: 11 additions & 3 deletions camera streaming/README.md
@@ -1,4 +1,12 @@
-# ZED SDK - Streaming
+# Streaming
 
-- **Sender**: physically open the camera and broadcasts its images on the network.
-- **Reciever**: Connects to a broadcasting device to get the ZED images and process them.
+This sample shows how to create a **video stream** from a ZED camera which can be read by a remote application for viewing or further processing. You can find additional information on the Video Streaming module in our [Documentation](https://www.stereolabs.com/docs/video/streaming/) and [API Reference](https://www.stereolabs.com/docs/api/structsl_1_1StreamingParameters.html).
+
+## Overview
+
+This repository contains the following code samples:
+
+- [ZED Stream Sender](./sender): This sample demonstrates how to use the ZED SDK to establish a network connection, encode a live video stream, and transmit it to a remote client. The ZED SDK handles all aspects of network communication, video encoding, and transmission, making it easy to integrate the ZED camera into applications that require **remote video capture and processing**, such as computer vision, remote monitoring, or teleoperation.
+
+- [ZED Stream Receiver](./receiver/): This sample demonstrates how to use the ZED SDK to **receive and decode** video data sent from a remote ZED stream.
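Conceptually, a sender/receiver pair like the one described above needs some way to delimit frames on the wire. The ZED SDK handles this internally with its own protocol; purely as an illustration, a length-prefixed framing scheme can be sketched with the stdlib:

```python
import struct

def encode_frame(payload: bytes) -> bytes:
    """Prefix a frame payload with a 4-byte big-endian length header."""
    return struct.pack(">I", len(payload)) + payload

def decode_frames(stream: bytes):
    """Split a received byte stream back into the payloads the sender encoded."""
    frames, offset = [], 0
    while offset < len(stream):
        # Read the 4-byte header, then slice out exactly that many bytes.
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        frames.append(stream[offset:offset + length])
        offset += length
    return frames

# Two encoded frames concatenated, as they would appear on a TCP stream.
wire = encode_frame(b"frame-0") + encode_frame(b"frame-1")
```

Because TCP delivers a byte stream rather than discrete messages, some framing convention like this is what lets a receiver recover frame boundaries.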
18 changes: 17 additions & 1 deletion depth sensing/README.md
@@ -1 +1,17 @@
-# Depth Sensing Samples
+# Depth Sensing Samples
+
+This repository contains samples demonstrating how to use the [ZED]("https://www.stereolabs.com/store/") camera's **Depth Sensing features** using the ZED SDK. You can find additional information on the Depth Sensing module in our [Documentation](https://www.stereolabs.com/docs/depth-sensing/) and [API Reference](https://www.stereolabs.com/docs/api/group__Depth__group.html).
+
+<p align="center">
+  <img src="https://user-images.githubusercontent.com/32394882/230639409-356b8dfa-df66-4bc2-84d8-a25fd0229779.gif" />
+</p>
+
+## Overview
+
+This section contains the following code samples:
+
+- [Depth Sensing sample](./depth%20sensing): This sample demonstrates how to extract depth information from a single ZED camera and visualize it in an OpenGL window.
+- [Depth Sensing with multiple cameras](./multi%20camera): This sample provides an example of how to design an app that use **multiple ZED cameras** in separated threads and displays RGB and depth maps in OpenCV.
+- [Region Of Interest](./region%20of%20interest) This sample showcases how to define a Region of Interest (ROI), pixels outside of this area are discard from the all modules (depth, positional tracking, detections ...).
+- [Image Refocus](./image%20refocus) This sample illustrates how to apply depth-dependent blur to an image, allowing users to adjust the focal point after the image has been taken.
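The Region Of Interest sample described above discards pixels outside a given area before the modules use them. A minimal sketch of that masking step on a plain row-major depth map (a list of lists stands in for the SDK's matrix type, and None marks an invalidated pixel):

```python
def apply_roi(depth_map, roi):
    """Keep depth values inside roi = (x, y, width, height); invalidate the rest."""
    x0, y0, w, h = roi
    return [
        [
            # A pixel survives only if both coordinates fall inside the ROI.
            value if (x0 <= x < x0 + w and y0 <= y < y0 + h) else None
            for x, value in enumerate(row)
        ]
        for y, row in enumerate(depth_map)
    ]

# A 2x3 depth map in millimeters; keep only the two rightmost columns.
depth = [[100, 200, 300],
         [400, 500, 600]]
masked = apply_roi(depth, (1, 0, 2, 2))
```

In the actual sample the same idea applies to every module (depth, positional tracking, detections), not just the depth map.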
2 changes: 1 addition & 1 deletion depth sensing/depth sensing/csharp/CMakeLists.txt
@@ -40,7 +40,7 @@ set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
     "OpenGL.Net_0.8.4"
     "OpenGL.Net.CoreUI_0.8.4"
-    "Stereolabs.zed_3.7.*"
+    "Stereolabs.zed_4.*"
 )
5 changes: 2 additions & 3 deletions depth sensing/image refocus/cpp/src/main.cpp
@@ -140,10 +140,9 @@ int main(int argc, char **argv) {
     glewInit();
 
     InitParameters init_parameters;
-    init_parameters.depth_mode = DEPTH_MODE::ULTRA;
+    init_parameters.camera_resolution = RESOLUTION::AUTO;
+    init_parameters.depth_mode = DEPTH_MODE::NEURAL;
     init_parameters.coordinate_units = UNIT::MILLIMETER;
-    init_parameters.depth_minimum_distance = 400.0f;
+    init_parameters.depth_minimum_distance = 100.;
 
     // Open the camera
     ERROR_CODE zed_open_state = zed.open(init_parameters);
26 changes: 23 additions & 3 deletions fusion/README.md
@@ -1,7 +1,27 @@
 # Fusion Samples
 
-The fusion API is designed to aggregate the data coming from multiple camera to improve them.
+The ZED SDK's Fusion API is designed to combine data from multiple cameras, resulting in higher quality data. The API can fuse data from several cameras to improve the accuracy and robustness of tracking systems.
 
-____________
+For instance, the Fusion API can be used in outdoor robot tracking with GNSS to provide real-time fusion of the 3D position and orientation of the robot, even in challenging environments. Additionally, the API can be used with the ZED Camera's body tracking feature to fuse data from multiple cameras to track an entire space with much higher quality. This capability enables a range of applications that require accurate spatial tracking, such as robotics, autonomous vehicles, augmented reality, and virtual reality.
 
-[Here](/body%20tracking/multi-camera/) , you can find a sample that shows how to combine multiple body detections from an array of camera to create a better representation.
+## Overview
+
+This section lists the available modules available in the **Fusion API**. It provides a convenient way to discover and access additional resources related to the Fusion API, including examples, tutorials, and integrations with other software platforms. These resources can be used to further explore the capabilities of the Fusion API and to build more sophisticated applications that leverage the data fusion capabilities of the ZED Camera.
+
+## Body tracking
+
+<p align="center">
+  <img src="https://user-images.githubusercontent.com/32394882/230631989-24dd2b58-2c85-451b-a4ed-558d74d1b922.gif" />
+</p>
+
+The [Multi camera Body Tracking sample](/body%20tracking/multi-camera/) demonstrates how to combine multiple body detections from an array of cameras to create a more accurate and robust representation of the detected bodies. By fusing data from multiple cameras, the sample can improve the accuracy and robustness of the body tracking system, especially in challenging environments with occlusions or complex motions. The sample showcases the capabilities of the ZED SDK's Fusion API and provides a starting point for building more sophisticated applications that require multi-camera body tracking.
+
+## GeoTracking
+
+<p align="center">
+  <img src="https://user-images.githubusercontent.com/32394882/230602944-ed61e6dd-e485-4911-8a4c-d6c9e4fab0fd.gif" />
+</p>
+
+The [GeoTracking sample](/geotracking/) demonstrates how to combine data from the ZED Camera and a Global Navigation Satellite System (GNSS) receiver for outdoor tracking applications. The sample showcases the Fusion API of the ZED SDK and provides an example of how to use it to integrate data from multiple sources, such as the camera and GNSS receiver. By fusing data from these sources, the sample can improve the accuracy and robustness of the tracking system, especially in challenging outdoor environments. The sample provides a starting point for building more sophisticated applications that require outdoor tracking with the ZED Camera and GNSS.
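A toy illustration of the fusion idea described above: camera odometry is locally smooth but drifts over time, while GNSS fixes are absolute but sparse and noisy. A 1D complementary filter captures the shape of the correction (the blend factor and the 1D state are illustrative only; the actual Fusion API is far more sophisticated):

```python
def fuse(odometry_deltas, gnss_fixes, blend=0.1):
    """Fuse per-step odometry displacements with occasional absolute fixes.

    odometry_deltas: displacement at each step (dead reckoning input).
    gnss_fixes: {step_index: absolute position} for steps where a fix arrived.
    blend: how strongly each fix pulls the estimate (illustrative value).
    """
    position = 0.0
    track = []
    for step, delta in enumerate(odometry_deltas):
        position += delta          # dead-reckoned update; drifts over time
        if step in gnss_fixes:     # absolute correction when a fix arrives
            position += blend * (gnss_fixes[step] - position)
        track.append(position)
    return track

# Three unit steps of odometry; a GNSS fix at step 2 says the true
# position is 10.0, so the estimate gets pulled toward it.
track = fuse([1.0, 1.0, 1.0], {2: 10.0})
```

The same pull-toward-the-absolute-measurement structure, generalized to 3D pose and proper uncertainty weighting, is what camera/GNSS fusion provides.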
26 changes: 15 additions & 11 deletions geotracking/README.md
@@ -1,16 +1,20 @@
-# ZED SDK - Geotracking for global scale localization on real-world map
+# Geotracking
 
-## Sample Structure:
-The samples provided with the geotracking API are organized as follows:
+These samples show how to use the ZED SDK Geotracking module for **global scale localization on a real-world map**.
 
-### Recording Sample
-The Recording sample demonstrates how to record data from both a ZED camera and an external GNSS sensor. The recorded data is saved in an SVO file and a JSON file, respectively. This sample provides the necessary data for the Playback sample.
-
-### Playback Sample
-The Playback sample shows how to use the geotracking API for global scale localization on a real-world map. It takes the data generated by the Recording sample and uses it to display geo-positions on a real-world map.
-
-### GeoTracking Sample
-The GeoTracking sample demonstrates how to use the geotracking API for global scale localization on a real-world map using both the ZED camera and an external GNSS sensor. It displays the corrected positional tracking in the ZED reference frame on an OpenGL window. It also displays the geo-position on a real-world map on [ZED Hub](https://hub.stereolabs.com).
-
-By utilizing these samples, developers can quickly and easily incorporate geotracking capabilities into their projects using the ZED SDK.
+<p align="center">
+  <img src="https://user-images.githubusercontent.com/32394882/230602944-ed61e6dd-e485-4911-8a4c-d6c9e4fab0fd.gif" />
+</p>
+
+## Overview
+
+The samples provided using the Geotracking API are organized as follows:
+
+- [Live Geotracking](./live%20geotracking/) The Live Geotracking sample demonstrates how to use the Geotracking API using both the ZED camera and an external GNSS sensor. It displays the corrected positional tracking in the ZED reference frame on an OpenGL window and the geo-position on a real-world map in ZEDHub.
+
+- [Geotracking Data Recording](./recording/): The Recording sample demonstrates how to **record data** from both a ZED camera and an external GNSS sensor. The recorded data is saved in an SVO file and a JSON file, respectively. This sample provides the necessary data to be used by the Playback sample.
+
+- [Geotracking Data Playback](./playback/): The Playback sample shows how to use the geotracking API for global scale localization on a real-world map. It takes the data generated by the Recording sample and uses it to display geo-positions on a real-world map.
+
+Developers can take advantage of the benefits of the ZED SDK to quickly and easily incorporate geotracking capabilities into their projects.
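To draw a recorded GNSS track on a map the way the playback sample does, geo-positions must be mapped to local metric offsets at some point. A sketch using an equirectangular approximation, which is adequate over short distances (the SDK's own conversion is assumed to be more precise):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, meters

def local_offset_m(ref_lat, ref_lon, lat, lon):
    """Approximate (east, north) offset in meters from a reference GNSS fix.

    Equirectangular approximation: scale longitude differences by the
    cosine of the reference latitude, then convert angles to arc length.
    """
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(ref_lat))
    north = EARTH_RADIUS_M * d_lat
    return east, north

# A fix 0.001 degrees of latitude north of the reference: roughly 111 m.
east, north = local_offset_m(48.8566, 2.3522, 48.8576, 2.3522)
```

Each fix from the recorded JSON file can be converted this way relative to the first fix, giving a track in meters that is straightforward to plot.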