diff --git a/body tracking/README.md b/body tracking/README.md
index 17b639d4..ea2d5317 100644
--- a/body tracking/README.md
+++ b/body tracking/README.md
@@ -1,37 +1,19 @@
-# ZED SDK Body Tracking Samples
+# Body Tracking Samples
-This repository contains samples demonstrating how to use the ZED stereoscopic camera's body tracking features using the ZED SDK. The samples show how to fuse the data from several cameras, export the data in JSON format, and how to integrate the SDK with external tools like Unity and Unreal.
+This repository contains samples demonstrating how to use the [ZED](https://www.stereolabs.com/store/) camera's **Body Tracking** features using the ZED SDK. You can find additional information on the Body Tracking module in our [Documentation](https://www.stereolabs.com/docs/body-tracking/) and [API Reference](https://www.stereolabs.com/docs/api/group__Body__group.html).
-## Requirements
- ZED Camera
- ZED SDK
- CUDA 10.2 or later
- Python 3.6 or later (for Python samples)
- C++11 or later (for C++ samples)
- Unity or Unreal Engine (for integration samples)
+## Overview
-## Installation
+This section contains the following code samples:
- Install the ZED SDK following the instructions on the official website.
- Clone this repository to your local machine.
- For Python samples, install the required Python packages by running pip install -r requirements.txt in the root directory of the repository.
- Build the C++ samples using the CMake build system. Refer to the individual sample README files for build instructions.
+- [Body Tracking](./body%20tracking/): This sample shows how to use the Body Tracking module, using a single camera and a simple 3D display.
-## Samples
-### Overview
-This samples demonstrates how to build simple body tracking app. It provide :
-- Single camera body tracking
-- Whole hand **fingertracking**
-- A simple 3D display
+- [Tracking Data Export](./export/): This sample shows how to export **human body tracking data** into a JSON format. You can adapt the code to fit your needs.
-### Fusion
-This sample demonstrate how to fuse the Body tracking data from several camera to track an entire space with a much higher quality. That can be done on one single machine, or on a network.
-The camera must be calibrated first with ZED 360.
-If your camera are distributed over a local network, you'll need to use ZED Hub. [Subscribe for free.](https://hub.stereolabs.com)
+- [Integrations](./integrations/): This folder contains links to other repositories that provide Body Tracking **integration examples** and tutorials with Unreal Engine 5, Unity, and LiveLink.
-### Export
-This sample shows you how to export the data into a JSON format. you can adapt the code to fit your needs.
-
-### Integrations
-This folder contains links to other repositories that provide Body Tracking integrations examples and tutorials with Unreal, Unity, Livelink.
\ No newline at end of file
+- [Multi Camera Fusion](./multi-camera): This sample demonstrates how to use the ZED SDK **Fusion API** to track people in an entire space, using data from multiple cameras to produce higher quality results than a single camera can. The sample walks through the full process of setting up your cameras, calibrating your system with ZED360, and fusing and visualizing the data.
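+
+To give a quick feel for the API, here is a minimal C++ sketch of enabling the module and polling detections (a sketch only, adapted from the `main.cpp` change below; assumes ZED SDK 4.x, see the samples for complete code):
+
+```cpp
+#include <sl/Camera.hpp>
+
+int main() {
+    sl::Camera zed;
+    sl::InitParameters init_parameters;
+    init_parameters.depth_mode = sl::DEPTH_MODE::ULTRA;
+    if (zed.open(init_parameters) != sl::ERROR_CODE::SUCCESS) return 1;
+
+    zed.enablePositionalTracking();     // required when enable_tracking is true
+
+    sl::BodyTrackingParameters body_params;
+    body_params.enable_tracking = true; // keep person ids consistent across frames
+    body_params.body_format = sl::BODY_FORMAT::BODY_18;
+    zed.enableBodyTracking(body_params);
+
+    sl::Bodies bodies;
+    while (zed.grab() == sl::ERROR_CODE::SUCCESS) {
+        zed.retrieveBodies(bodies);     // bodies.body_list: one sl::BodyData per person
+    }
+    zed.close();
+    return 0;
+}
+```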
diff --git a/body tracking/body tracking/cpp/src/main.cpp b/body tracking/body tracking/cpp/src/main.cpp
index 6e41fe27..559b21f7 100644
--- a/body tracking/body tracking/cpp/src/main.cpp
+++ b/body tracking/body tracking/cpp/src/main.cpp
@@ -77,9 +77,9 @@ int main(int argc, char **argv) {
     // Enable the Body tracking module
     BodyTrackingParameters body_tracker_params;
-    body_tracker_params.enable_tracking = false; // track people across images flow
+    body_tracker_params.enable_tracking = true; // track people across frames
     body_tracker_params.enable_body_fitting = false; // smooth skeletons moves
-    body_tracker_params.body_format = sl::BODY_FORMAT::BODY_38;
+    body_tracker_params.body_format = sl::BODY_FORMAT::BODY_18;
     body_tracker_params.detection_model = isJetson ? BODY_TRACKING_MODEL::HUMAN_BODY_FAST : BODY_TRACKING_MODEL::HUMAN_BODY_ACCURATE;
     //body_tracker_params.allow_reduced_precision_inference = true;
diff --git a/body tracking/body tracking/csharp/CMakeLists.txt b/body tracking/body tracking/csharp/CMakeLists.txt
index 324e8aca..733d29aa 100644
--- a/body tracking/body tracking/csharp/CMakeLists.txt
+++ b/body tracking/body tracking/csharp/CMakeLists.txt
@@ -38,7 +38,7 @@ set_property(TARGET ${PROJECT_NAME} PROPERTY VS_DOTNET_REFERENCES
 set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
   "OpenGL.Net_0.8.4"
   "OpenGL.Net.CoreUI_0.8.4"
   "OpenCvSharp4.Windows_4.5.0.20201013"
diff --git a/body tracking/integrations/README.md b/body tracking/integrations/README.md
index cf61d8df..64295638 100644
--- a/body tracking/integrations/README.md
+++ b/body tracking/integrations/README.md
@@ -1,19 +1,14 @@
-## Using body tracking with external softwares
+# Using Body Tracking with external software
-We provide integrations of most ZED Features, and especially Body Tracking, for several tools/
+The **Body Tracking module** of the ZED SDK provides real-time human body tracking capabilities, allowing developers to create immersive applications that can track and interact with human body movements in 3D space. The ZED SDK provides integrations for popular game engines such as **Unity** and **Unreal Engine 5**, as well as additional integrations with **Unity using LiveLink** and **Unreal Engine with LiveLink**.
-### Unity
-The unity integration can be found [in this repository](https://github.com/stereolabs/zed-unity)
-It allows you to use all ZED Features into Unity. Body tracking samples and tutorials are provided.
+Each integration is available in its own repository, containing samples and tutorials to get started integrating the ZED SDK into your new or existing project.
-### UnrealEngine
-The Unreal Engine integration can be found [in this repository](https://github.com/stereolabs/zed-unreal-plugin)
-It allows you to use all ZED Features into Unreal Engine. Body tracking samples and tutorials are provided.
+## Integrations GitHub Repositories
-### Unreal LiveLink
-The LiveLink integration can be found [in this repository](https://github.com/stereolabs/zed-livelink-plugin)
-It allows you to stream Body Tracking data from the ZED into LiveLink. You can then retrieve the data in UnrealEngine.
-
-### Unity Livelink
-The Unity LiveLink integration can be found [in this repository](https://github.com/stereolabs/zed-unity-livelink)
-It allows you to stream Body Tracking data from the ZED into LiveLink. You can then retrieve the data in Unity.
\ No newline at end of file
+| Integration | Description |
+| :------: | ------ |
+| [Unity](https://github.com/stereolabs/zed-unity) | Seamless integration of body tracking capabilities into Unity projects for animation and interactive experiences |
+| [Unity with LiveLink](https://github.com/stereolabs/zed-unity-livelink) | Real-time streaming of body tracking data directly into Unity for animation and interactive experiences |
+| [Unreal Engine 5](https://github.com/stereolabs/zed-UE5) | Seamless incorporation of ZED SDK real-time human body tracking data into Unreal Engine 5 |
+| [Unreal Engine 5 with LiveLink](https://github.com/stereolabs/zed-UE5) | Real-time streaming of body tracking data directly into Unreal Engine 5 for animation and interactive experiences |
diff --git a/camera control/README.md b/camera control/README.md
index 3474e430..8ec62ebb 100644
--- a/camera control/README.md
+++ b/camera control/README.md
@@ -1,19 +1,25 @@
 # ZED SDK - Camera Control
-## This sample shows how to capture images with the ZED SDK and adjust camera settings.
+This sample shows how to capture images with the ZED SDK video module and adjust camera settings. You can find additional information on the Video module in our [Documentation](https://www.stereolabs.com/docs/video/camera-controls/) and [API Reference](https://www.stereolabs.com/docs/api/group__Video__group.html).
-### Features
- - Camera images are displayed on an OpenCV windows
+## Overview
+
+This repository demonstrates how to change the **ZED camera's video settings** and display the camera's image using OpenCV windows. The sample provides examples of how to change the following settings (a minimal code sketch follows the list):
+
+- Brightness
+- Contrast
+- Saturation
+- Hue
+- Sharpness
+- Gamma
+- Gain
+- Exposure
+- Automatic GAIN_EXP behavior
+- White balance
+- LED state
-The following parameters can be changed:
- - Brightness
- - Contrast
- - Saturation
- - Hue
- - Sharpness
- - Gamma
- - Gain
- - Exposure
- - Automatic GAIN_EXP behavior
- - White Balance
- - LED state
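+
+As a quick illustration (a minimal sketch, not the full sample; assumes ZED SDK 4.x, where `getCameraSettings` returns the value through an output parameter):
+
+```cpp
+#include <sl/Camera.hpp>
+
+int main() {
+    sl::Camera zed;
+    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;
+
+    // Set a fixed value (see the API reference for each setting's valid range)
+    zed.setCameraSettings(sl::VIDEO_SETTINGS::BRIGHTNESS, 4);
+
+    // Read a value back
+    int gain = 0;
+    zed.getCameraSettings(sl::VIDEO_SETTINGS::GAIN, gain);
+
+    // Re-enable automatic exposure / gain
+    zed.setCameraSettings(sl::VIDEO_SETTINGS::AEC_AGC, 1);
+
+    zed.close();
+    return 0;
+}
+```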
diff --git a/camera control/csharp/CMakeLists.txt b/camera control/csharp/CMakeLists.txt
index 367b3707..af33b6fd 100644
--- a/camera control/csharp/CMakeLists.txt
+++ b/camera control/csharp/CMakeLists.txt
@@ -29,6 +29,6 @@ set_property(TARGET ${PROJECT_NAME} PROPERTY VS_DOTNET_REFERENCES
 set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
   "OpenCvSharp4.Windows_4.5.0.20201013"
 )
diff --git a/camera streaming/README.md b/camera streaming/README.md
index c8a3546c..0018d8b1 100644
--- a/camera streaming/README.md
+++ b/camera streaming/README.md
@@ -1,4 +1,12 @@
-# ZED SDK - Streaming
+# Streaming
+
+This sample shows how to create a **video stream** from a ZED camera which can be read by a remote application for viewing or further processing. You can find additional information on the Video Streaming module in our [Documentation](https://www.stereolabs.com/docs/video/streaming/) and [API Reference](https://www.stereolabs.com/docs/api/structsl_1_1StreamingParameters.html).
+
+## Overview
+
+This repository contains the following code samples:
+
+- [ZED Stream Sender](./sender/): This sample demonstrates how to use the ZED SDK to establish a network connection, encode a live video stream, and transmit it to a remote client. The ZED SDK handles all aspects of network communication, video encoding, and transmission, making it easy to integrate the ZED camera into applications that require **remote video capture and processing**, such as computer vision, remote monitoring, or teleoperation.
+
+- [ZED Stream Receiver](./receiver/): This sample demonstrates how to use the ZED SDK to **receive and decode** video data sent from a remote ZED stream.
-- **Sender**: physically open the camera and broadcasts its images on the network.
-- **Reciever**: Connects to a broadcasting device to get the ZED images and process them.
\ No newline at end of file
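+
+The two roles mirror each other (a minimal sketch; the IP address and port are example values):
+
+```cpp
+#include <sl/Camera.hpp>
+
+// Sender: open the local camera and broadcast its feed on the network.
+void run_sender() {
+    sl::Camera zed;
+    if (zed.open() != sl::ERROR_CODE::SUCCESS) return;
+    sl::StreamingParameters stream_params;
+    stream_params.codec = sl::STREAMING_CODEC::H264;
+    stream_params.port = 30000;
+    zed.enableStreaming(stream_params);
+    while (zed.grab() == sl::ERROR_CODE::SUCCESS) {
+        // each grabbed frame is encoded and sent automatically
+    }
+}
+
+// Receiver: open the remote stream exactly like a local camera.
+void run_receiver() {
+    sl::Camera zed;
+    sl::InitParameters init_parameters;
+    init_parameters.input.setFromStream("192.168.1.10", 30000); // sender's IP (example value)
+    if (zed.open(init_parameters) != sl::ERROR_CODE::SUCCESS) return;
+    sl::Mat image;
+    while (zed.grab() == sl::ERROR_CODE::SUCCESS)
+        zed.retrieveImage(image, sl::VIEW::LEFT);
+}
+
+int main() { run_sender(); } // or run_receiver() on the client machine
+```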
diff --git a/depth sensing/README.md b/depth sensing/README.md
index dfacffca..19a8b4a9 100644
--- a/depth sensing/README.md
+++ b/depth sensing/README.md
@@ -1 +1,17 @@
-# Depth Sensing Samples
\ No newline at end of file
+# Depth Sensing Samples
+
+This repository contains samples demonstrating how to use the [ZED](https://www.stereolabs.com/store/) camera's **Depth Sensing features** using the ZED SDK. You can find additional information on the Depth Sensing module in our [Documentation](https://www.stereolabs.com/docs/depth-sensing/) and [API Reference](https://www.stereolabs.com/docs/api/group__Depth__group.html).
+
+## Overview
+
+This section contains the following code samples:
+
+- [Depth Sensing sample](./depth%20sensing): This sample demonstrates how to extract depth information from a single ZED camera and visualize it in an OpenGL window.
+- [Depth Sensing with multiple cameras](./multi%20camera): This sample provides an example of how to design an app that uses **multiple ZED cameras** in separate threads and displays RGB and depth maps in OpenCV.
+- [Region Of Interest](./region%20of%20interest): This sample showcases how to define a Region of Interest (ROI); pixels outside this area are discarded by all modules (depth, positional tracking, detections, ...).
+- [Image Refocus](./image%20refocus): This sample illustrates how to apply depth-dependent blur to an image, allowing users to adjust the focal point after the image has been taken.
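+
+The central call shared by these samples is `retrieveMeasure()` (a minimal sketch, assuming ZED SDK 4.x):
+
+```cpp
+#include <sl/Camera.hpp>
+
+int main() {
+    sl::Camera zed;
+    sl::InitParameters init_parameters;
+    init_parameters.depth_mode = sl::DEPTH_MODE::ULTRA;
+    init_parameters.coordinate_units = sl::UNIT::MILLIMETER;
+    if (zed.open(init_parameters) != sl::ERROR_CODE::SUCCESS) return 1;
+
+    sl::Mat depth, point_cloud;
+    if (zed.grab() == sl::ERROR_CODE::SUCCESS) {
+        zed.retrieveMeasure(depth, sl::MEASURE::DEPTH);         // 32-bit depth map
+        zed.retrieveMeasure(point_cloud, sl::MEASURE::XYZRGBA); // colored 3D point cloud
+
+        float distance = 0.f;
+        depth.getValue(depth.getWidth() / 2, depth.getHeight() / 2, &distance);
+        // distance: depth at the image center, in millimeters
+    }
+    zed.close();
+    return 0;
+}
+```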
diff --git a/depth sensing/depth sensing/csharp/CMakeLists.txt b/depth sensing/depth sensing/csharp/CMakeLists.txt
index e53701a4..fa8667e5 100644
--- a/depth sensing/depth sensing/csharp/CMakeLists.txt
+++ b/depth sensing/depth sensing/csharp/CMakeLists.txt
@@ -40,7 +40,7 @@ set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
   "OpenGL.Net_0.8.4"
   "OpenGL.Net.CoreUI_0.8.4"
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
 )
\ No newline at end of file
diff --git a/depth sensing/image refocus/cpp/src/main.cpp b/depth sensing/image refocus/cpp/src/main.cpp
index 4c47e3a8..bd7b3238 100644
--- a/depth sensing/image refocus/cpp/src/main.cpp
+++ b/depth sensing/image refocus/cpp/src/main.cpp
@@ -140,10 +140,9 @@ int main(int argc, char **argv) {
     glewInit();
 
     InitParameters init_parameters;
-    init_parameters.depth_mode = DEPTH_MODE::ULTRA;
-    init_parameters.camera_resolution = RESOLUTION::AUTO;
+    init_parameters.depth_mode = DEPTH_MODE::NEURAL;
     init_parameters.coordinate_units = UNIT::MILLIMETER;
-    init_parameters.depth_minimum_distance = 400.0f;
+    init_parameters.depth_minimum_distance = 100.0f;
 
     // Open the camera
     ERROR_CODE zed_open_state = zed.open(init_parameters);
diff --git a/fusion/README.md b/fusion/README.md
index ca29dbdc..f62b569e 100644
--- a/fusion/README.md
+++ b/fusion/README.md
@@ -1,7 +1,27 @@
 # Fusion Samples
-The fusion API is designed to aggregate the data coming from multiple camera to improve them.
+The ZED SDK's Fusion API is designed to combine data from multiple cameras, resulting in higher quality data. The API can fuse data from several cameras to improve the accuracy and robustness of tracking systems.
-____________
+For instance, the Fusion API can be used in outdoor robot tracking with GNSS to provide real-time fusion of the 3D position and orientation of the robot, even in challenging environments. Additionally, the API can be used with the ZED camera's body tracking feature to fuse data from multiple cameras to track an entire space with much higher quality. This capability enables a range of applications that require accurate spatial tracking, such as robotics, autonomous vehicles, augmented reality, and virtual reality.
-[Here](/body%20tracking/multi-camera/) , you can find a sample that shows how to combine multiple body detections from an array of camera to create a better representation.
\ No newline at end of file
+
+## Overview
+
+This section lists the modules available in the **Fusion API**. It provides a convenient way to discover and access additional resources related to the Fusion API, including examples, tutorials, and integrations with other software platforms. These resources can be used to further explore the capabilities of the Fusion API and to build more sophisticated applications that leverage the data fusion capabilities of the ZED camera.
+
+## Body tracking
+
+The [Multi camera Body Tracking sample](/body%20tracking/multi-camera/) demonstrates how to combine multiple body detections from an array of cameras to create a more accurate and robust representation of the detected bodies. By fusing data from multiple cameras, the sample can improve the accuracy and robustness of the body tracking system, especially in challenging environments with occlusions or complex motions. The sample showcases the capabilities of the ZED SDK's Fusion API and provides a starting point for building more sophisticated applications that require multi-camera body tracking.
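+
+The typical call flow looks roughly like this (a hedged sketch only; exact signatures are in the Fusion API reference, and the serial number, communication settings, and pose below are placeholder values that normally come from a ZED360 calibration file):
+
+```cpp
+#include <sl/Fusion.hpp>
+
+int main() {
+    sl::Fusion fusion;
+    sl::InitFusionParameters init_params;
+    fusion.init(init_params);
+
+    // One subscription per camera sender
+    sl::CameraIdentifier uuid(12345);              // placeholder serial number
+    sl::CommunicationParameters comm;              // intra-process or network communication
+    fusion.subscribe(uuid, comm, sl::Transform()); // placeholder pose from ZED360
+
+    sl::BodyTrackingFusionParameters bt_params;
+    fusion.enableBodyTracking(bt_params);
+
+    sl::Bodies fused_bodies;
+    while (fusion.process() == sl::FUSION_ERROR_CODE::SUCCESS) {
+        fusion.retrieveBodies(fused_bodies);       // skeletons merged from all subscribed cameras
+    }
+    return 0;
+}
+```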
+## GeoTracking
+
+The [GeoTracking sample](/geotracking/) demonstrates how to combine data from the ZED Camera and a Global Navigation Satellite System (GNSS) receiver for outdoor tracking applications. The sample showcases the Fusion API of the ZED SDK and provides an example of how to use it to integrate data from multiple sources, such as the camera and GNSS receiver. By fusing data from these sources, the sample can improve the accuracy and robustness of the tracking system, especially in challenging outdoor environments. The sample provides a starting point for building more sophisticated applications that require outdoor tracking with the ZED Camera and GNSS.
diff --git a/geotracking/README.md b/geotracking/README.md
index 1b23a90f..0e8c74a4 100644
--- a/geotracking/README.md
+++ b/geotracking/README.md
@@ -1,16 +1,20 @@
-# ZED SDK - Geotracking for global scale localization on real-world map
+# Geotracking
-## Sample Structure:
-The samples provided with the geotracking API are organized as follows:
+These samples show how to use the ZED SDK Geotracking module for **global scale localization on a real-world map**.
-### Recording Sample
-The Recording sample demonstrates how to record data from both a ZED camera and an external GNSS sensor. The recorded data is saved in an SVO file and a JSON file, respectively. This sample provides the necessary data for the Playback sample.
+
+## Overview
+
+The samples provided with the Geotracking API are organized as follows:
+
+- [Live Geotracking](./live%20geotracking/): The Live Geotracking sample demonstrates how to use the Geotracking API with both the ZED camera and an external GNSS sensor. It displays the corrected positional tracking in the ZED reference frame in an OpenGL window and the geo-position on a real-world map on ZED Hub.
+
+- [Geotracking Data Recording](./recording/): The Recording sample demonstrates how to **record data** from both a ZED camera and an external GNSS sensor. The recorded data is saved in an SVO file and a JSON file, respectively. This sample provides the necessary data to be used by the Playback sample.
+
+- [Geotracking Data Playback](./playback/): The Playback sample shows how to use the Geotracking API for global scale localization on a real-world map. It takes the data generated by the Recording sample and uses it to display geo-positions on a real-world map.
-### Playback Sample
-The Playback sample shows how to use the geotracking API for global scale localization on a real-world map. It takes the data generated by the Recording sample and uses it to display geo-positions on a real-world map.
-### GeoTracking Sample
-The GeoTracking sample demonstrates how to use the geotracking API for global scale localization on a real-world map using both the ZED camera and an external GNSS sensor. It displays the corrected positional tracking in the ZED reference frame on an OpenGL window. It also displays the geo-position on a real-world map on [ZED Hub](https://hub.stereolabs.com).
-By utilizing these samples, developers can quickly and easily incorporate geotracking capabilities into their projects using the ZED SDK.
-Developers can take advantage of the benefits of the ZED SDK to quickly and easily incorporate geotracking capabilities into their projects.
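+
+The core loop shared by these samples looks roughly like this (a simplified sketch; the GNSS reading code and exact return types depend on your receiver setup and SDK version, and `fusion` is assumed to be already initialized as in the samples):
+
+```cpp
+#include <sl/Fusion.hpp>
+
+// One fusion step: feed the latest GNSS fix, then read back both poses.
+void geotracking_step(sl::Fusion &fusion, sl::GNSSData gnss_data) {
+    fusion.ingestGNSSData(gnss_data);  // gnss_data comes from your receiver (e.g. via libgps)
+
+    sl::Pose fused_position;           // pose in the ZED reference frame
+    fusion.getPosition(fused_position);
+
+    sl::GeoPose geopose;               // global pose, valid once VIO/GNSS calibration converges
+    if (fusion.getGeoPose(geopose) == sl::POSITIONAL_TRACKING_STATE::OK) {
+        // use geopose for map display or KML export
+    }
+}
+```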
diff --git a/geotracking/live geotracking/README.md b/geotracking/live geotracking/README.md
index db71e2e6..25ce846f 100644
--- a/geotracking/live geotracking/README.md
+++ b/geotracking/live geotracking/README.md
@@ -1,33 +1,29 @@
-# ZED SDK - Live Geotracking for Global Scale Localization on Real-World Map
+# Live Geotracking Sample
 
 ## Overview
 
-This sample demonstrates how to use geotracking to achieve global scale localization on a real-world map using the ZED camera. The ZED SDK Geotracking sample fuses visual odometry from the ZED SDK with external GNSS data in real-time, making it a valuable resource for applications such as autonomous robotics and drone navigation.
+This sample demonstrates how to use the ZED SDK Geotracking module to achieve **global scale localization** on a real-world map using the ZED camera. The ZED SDK Live Geotracking sample fuses visual odometry from the ZED SDK with external GNSS data in real-time, making it a valuable resource for applications such as autonomous robotics and drone navigation.
 
-### Features
+## Features
 - Displays the camera's path in an OpenGL window in 3D
 - Displays path data, including translation and rotation
-- Displays the fused path on a map on ZedHub
+- Displays the fused path on a map on ZED Hub
 - Exports KML files for the fused trajectory and raw GNSS data
 
-### Dependencies
+## Dependencies
 
 Before using this sample, ensure that you have the following dependencies installed on your system:
 
-- ZEDHub edge-cli: required for displaying the computed trajectory on a real-world map.
+- ZED Hub Edge Agent: to be able to display the computed trajectory on a real-world map, connect your device to [ZED Hub](https://hub.stereolabs.com/). Detailed tutorials can be found [here](https://www.stereolabs.com/docs/cloud/overview/setup-device/).
 - libgps-dev: required to use an external GNSS sensor.
 
-### Installation and Usage
+## Installation and Usage
 
 To use the ZED SDK Geotracking sample, follow these steps:
 
-1. Download and install the ZED SDK on your system from the official Stereolabs website.
-2. Install the ZEDHub edge-cli from the ZEDHub website and the libgps-dev dependency using your operating system's package manager.
+1. Download and install the ZED SDK on your system from the official [Stereolabs website](https://www.stereolabs.com/developers/release/).
+2. Install Edge Agent from [ZED Hub](https://hub.stereolabs.com/) and the libgps-dev dependency using your operating system's package manager.
 3. Connect your ZED camera and GNSS sensor to your computer.
 4. Open a terminal and navigate to the zed-geotracking sample directory.
 5. Compile the sample.
 6. Run the zed-geotracking executable.
-7. The sample will display the camera's path and path data in a 3D window. The fused path will be displayed on a map on ZedHub, and KML files will be generated for the fused trajectory and raw GNSS data.
-
-### Support and Resources
-
-If you have any questions or encounter any issues while using the ZED SDK, please visit the official Stereolabs support forums. Here, you can find helpful resources such as tutorials, documentation, and a community of developers to assist you in troubleshooting any problems you may encounter.
+7. The sample will display the camera's path and path data in a 3D window. The fused path will be displayed on ZED Hub's maps page, and KML files will be generated for the fused trajectory and raw GNSS data.
\ No newline at end of file
diff --git a/geotracking/live geotracking/cpp/include/display/GLViewer.hpp b/geotracking/live geotracking/cpp/include/display/GLViewer.hpp
index 19283de9..6067666f 100644
--- a/geotracking/live geotracking/cpp/include/display/GLViewer.hpp
+++ b/geotracking/live geotracking/cpp/include/display/GLViewer.hpp
@@ -157,7 +157,7 @@ class GLViewer {
     void exit();
     bool isAvailable();
     void init(int argc, char **argv);
-    void updateData(sl::Transform zed_rt, std::string str_t, std::string str_r, sl::POSITIONAL_TRACKING_STATE state);
+    void updateData(sl::Transform zed_rt, sl::POSITIONAL_TRACKING_STATE state);
 
 private:
     // Rendering loop method called each frame by glutDisplayFunc
diff --git a/geotracking/live geotracking/cpp/include/display/GenericDisplay.h b/geotracking/live geotracking/cpp/include/display/GenericDisplay.h
index 4fc787c5..1ee28962 100644
--- a/geotracking/live geotracking/cpp/include/display/GenericDisplay.h
+++ b/geotracking/live geotracking/cpp/include/display/GenericDisplay.h
@@ -36,11 +36,9 @@ class GenericDisplay
      * @brief Update the OpenGL view with last pose data
      *
      * @param zed_rt last pose data
-     * @param str_t std::string that represents current translations
-     * @param str_r std::string that represents current rotations
      * @param state current tracking state
      */
-    void updatePoseData(sl::Transform zed_rt, std::string str_t, std::string str_r, sl::POSITIONAL_TRACKING_STATE state);
+    void updatePoseData(sl::Transform zed_rt, sl::POSITIONAL_TRACKING_STATE state);
     /**
      * @brief Display current fused pose either in KML file or in ZEDHub depending compilation options
      *
diff --git a/geotracking/live geotracking/cpp/src/display/GLViewer.cpp b/geotracking/live geotracking/cpp/src/display/GLViewer.cpp
index c98a9ef2..02a8b07a 100644
--- a/geotracking/live geotracking/cpp/src/display/GLViewer.cpp
+++ b/geotracking/live geotracking/cpp/src/display/GLViewer.cpp
@@ -256,13 +256,19 @@ void GLViewer::draw() {
     glUseProgram(0);
 }
 
-void GLViewer::updateData(sl::Transform zed_rt, std::string str_t, std::string str_r, sl::POSITIONAL_TRACKING_STATE state) {
+void GLViewer::updateData(sl::Transform zed_rt, sl::POSITIONAL_TRACKING_STATE state) {
     mtx.lock();
     vecPath.push_back(zed_rt.getTranslation());
     zedModel.setRT(zed_rt);
     updateZEDposition = true;
-    txtT = str_t;
-    txtR = str_r;
+
+    std::stringstream ss;
+    ss << zed_rt.getTranslation();
+    txtT = ss.str();
+    ss.str(""); // reset the buffer; clear() alone only resets the stream's error flags
+    ss << zed_rt.getEulerAngles();
+    txtR = ss.str();
+
     trackState = state;
     mtx.unlock();
 }
diff --git a/geotracking/live geotracking/cpp/src/display/GenericDisplay.cpp b/geotracking/live geotracking/cpp/src/display/GenericDisplay.cpp
index 285e4a2b..93f47e39 100644
--- a/geotracking/live geotracking/cpp/src/display/GenericDisplay.cpp
+++ b/geotracking/live geotracking/cpp/src/display/GenericDisplay.cpp
@@ -34,9 +34,9 @@ void GenericDisplay::init(int argc, char **argv)
 #endif
 }
 
-void GenericDisplay::updatePoseData(sl::Transform zed_rt, std::string str_t, std::string str_r, sl::POSITIONAL_TRACKING_STATE state)
+void GenericDisplay::updatePoseData(sl::Transform zed_rt, sl::POSITIONAL_TRACKING_STATE state)
 {
-    opengl_viewer.updateData(zed_rt, str_t, str_r, state);
+    opengl_viewer.updateData(zed_rt, state);
 }
 
 bool GenericDisplay::isAvailable(){
diff --git a/geotracking/live geotracking/cpp/src/main.cpp b/geotracking/live geotracking/cpp/src/main.cpp
index 1ce8e885..8e103c1e 100644
--- a/geotracking/live geotracking/cpp/src/main.cpp
+++ b/geotracking/live geotracking/cpp/src/main.cpp
@@ -97,17 +97,10 @@ int main(int argc, char **argv)
         sl::Pose fused_position;
         // Get position into the ZED CAMERA coordinate system:
         sl::POSITIONAL_TRACKING_STATE current_state = fusion.getPosition(fused_position);
-        if (current_state == sl::POSITIONAL_TRACKING_STATE::OK)
-        {
-            std::stringstream ss;
-            ss << fused_position.pose_data.getTranslation();
-            std::string translation_message = ss.str();
-            ss.clear();
-            ss << fused_position.pose_data.getEulerAngles();
-            std::string rotation_message = ss.str();
-            // Display it on OpenGL:
-            viewer.updatePoseData(fused_position.pose_data, translation_message, rotation_message, current_state);
-        }
+
+        // Display it on OpenGL:
+        viewer.updatePoseData(fused_position.pose_data, current_state);
+
         // Get position into the GNSS coordinate system - this needs a initialization between CAMERA
         // and GNSS. When the initialization is finish the getGeoPose will return sl::POSITIONAL_TRACKING_STATE::OK
         sl::GeoPose current_geopose;
diff --git a/geotracking/playback/README.md b/geotracking/playback/README.md
index 561888d1..761bf5fb 100644
--- a/geotracking/playback/README.md
+++ b/geotracking/playback/README.md
@@ -1,32 +1,28 @@
-# ZED SDK - Geotracking Playback for Global Scale Localization on Real-World Map
+# Geotracking Data Playback
 
 ## Overview
 
 The ZED SDK Geotracking Playback sample demonstrates how to fuse pre-recorded GNSS data (saved in a JSON file) and pre-recorded camera data (saved into an SVO file) for achieving global scale localization on a real-world map. This sample is useful for applications such as offline analysis of sensor data or simulation / testing.
 
-### Features
+## Features
 - Displays the camera's path in an OpenGL window.
 - Displays path data, including translation and rotation.
 - Displays the fused path on a map on ZedHub.
 - Exports KML files for the fused trajectory and raw GNSS data.
 
-### Dependencies
+## Dependencies
 
 Before using this sample, ensure that you have the following dependencies installed on your system:
 
 - ZED SDK: download and install from the official Stereolabs website (https://www.stereolabs.com/developers/release/).
-- ZEDHub edge-cli: required for displaying the computed trajectory on a real-world map.
+- ZED Hub Edge Agent: to be able to display the computed trajectory on a real-world map, connect your device to [ZED Hub](https://hub.stereolabs.com/). Detailed tutorials can be found [here](https://www.stereolabs.com/docs/cloud/overview/setup-device/).
 
-### Installation and Usage
+## Installation and Usage
 
 To use the ZED SDK Geotracking Playback sample, follow these steps:
 
 1. Download and install the ZED SDK on your system from the official Stereolabs website (https://www.stereolabs.com/developers/release/).
-2. Install the ZEDHub edge-cli from the ZEDHub website.
+2. Install the ZED Hub Edge Agent from the ZED Hub website.
 3. Open a terminal and navigate to the zed-geotracking-playback sample directory.
 4. Compile it.
 5. Run the zed-geotracking-playback executable, passing the path to the SVO file as the first input argument of the command line and the path to gnss file as second argument.
-6. The sample will playback the SVO file and display the camera's path and path data in a 3D window. The fused path will be displayed on a map on ZedHub, and KML files will be generated for the fused trajectory and raw GNSS data.
-
-### Support and Resources
-
-If you have any questions or encounter any issues while using the ZED SDK, please visit the official Stereolabs support forums. Here, you can find helpful resources such as tutorials, documentation, and a community of developers to assist you in troubleshooting any problems you may encounter.
+6. The sample will play back the SVO file and display the camera's path and path data in a 3D window. The fused path will be displayed on a map on ZED Hub, and KML files will be generated for the fused trajectory and raw GNSS data.
\ No newline at end of file
diff --git a/geotracking/recording/README.md b/geotracking/recording/README.md
index 35231ff0..34c50e45 100644
--- a/geotracking/recording/README.md
+++ b/geotracking/recording/README.md
@@ -1,4 +1,4 @@
-# ZED SDK - Geotracking Data Recording Sample
+# Geotracking Data Recording Sample
 
 ## Overview
 The Geotracking Data Recording sample demonstrates how to record data for geotracking localization on real-world maps using the ZED camera. The sample generates data in the form of an SVO file, which contains camera data, and a JSON file, which contains pre-recorded GNSS data for use in the playback sample. This sample is a useful resource for developers working on autonomous driving, robotics, and drone navigation applications.
@@ -26,9 +26,4 @@ To use the Geotracking Data Recording sample, follow these steps:
 4. Open a terminal and navigate to the Geotracking Data Recording sample directory.
 5. Compile the sample.
 6. Run the Geotracking Data Recording executable.
-7. The sample will display the camera's path and path data in a 3D window. KML files will be generated for displaying the raw GNSS data and fused position on a real-world map like google maps after capture. Additionally, an SVO file corresponding to camera data and a JSON file corresponding to recorded GNSS data will be generated.
-
-## Support and Resources
-
-If you have any questions or encounter any issues while using the ZED SDK, please visit the official Stereolabs support forums. Here, you can find helpful resources such as tutorials, documentation, and a community of developers to assist you in troubleshooting any problems you may encounter.
-
+7. The sample will display the camera's path and path data in a 3D window. KML files will be generated for displaying the raw GNSS data and fused position on a real-world map like Google Maps after capture. Additionally, an SVO file corresponding to camera data and a JSON file corresponding to recorded GNSS data will be generated.
\ No newline at end of file
diff --git a/object detection/README.md b/object detection/README.md
index b3f2f77f..13e1a3ce 100644
--- a/object detection/README.md
+++ b/object detection/README.md
@@ -1,5 +1,19 @@
-# ZED SDK - Object Detection
+# Object Detection
- - **Birds eye viewer**: Deteted objects are displayed in a **3D** view with the current point cloud.
- - **Image viewer**: Detected objects are displayed in a **2D** view with the current image.
- - **Custom detector**: Sample with external detectors to use the DETECTION_MODEL::CUSTOM_BOX_OBJECTS mode
\ No newline at end of file
+These samples show how to use the ZED SDK for object detection. You can find additional information on the Object Detection module in our [Documentation](https://www.stereolabs.com/docs/object-detection/) and [API Reference](https://www.stereolabs.com/docs/api/group__Object__group.html).
+
+## Overview
+
+This section contains the following code samples:
+
+- [Birds Eye Viewer](./birds%20eye%20viewer/): Detected objects are presented in a top-down view alongside the 3D point cloud, providing an intuitive perspective on object placement.
+
+- [Concurrent Detections](./concurrent%20detections/): This sample demonstrates how to run the **Object Detection** and **Body Tracking** modules simultaneously, allowing you to use both detectors in a single application.
+
+- [Image Viewer](./image%20viewer/): Detected objects are displayed in a 2D view, making it simple to identify and track objects of interest.
+
+- [Custom Detector](./custom%20detector/): This sample shows how to use a custom object detector on ZED images. The ZED SDK then computes 3D information and performs object tracking on the detected objects.
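+
+The heart of the custom detector workflow is feeding your own 2D boxes to the SDK (a hedged sketch; your inference code replaces the detection values below, and the camera is assumed to be opened with `CUSTOM_BOX_OBJECTS` enabled as in the diffs that follow):
+
+```cpp
+#include <sl/Camera.hpp>
+#include <vector>
+
+// One frame of the detect-then-ingest loop: hand 2D boxes from an external
+// detector to the SDK, which adds depth and tracking on top of them.
+void ingest_detections(sl::Camera &zed) {
+    std::vector<sl::CustomBoxObjectData> boxes;
+
+    sl::CustomBoxObjectData det;          // one entry per detection from your model
+    det.unique_object_id = sl::generate_unique_id();
+    det.label = 0;                        // your detector's class id
+    det.probability = 0.9f;               // example confidence
+    det.is_grounded = true;               // object constrained to the floor plane
+    det.bounding_box_2d = {{100, 100}, {200, 100}, {200, 300}, {100, 300}}; // example pixels
+    boxes.push_back(det);
+
+    zed.ingestCustomBoxObjects(boxes);
+
+    sl::Objects objects;
+    zed.retrieveObjects(objects);         // 3D boxes with tracked ids
+}
+```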
diff --git a/object detection/birds eye viewer/csharp/CMakeLists.txt b/object detection/birds eye viewer/csharp/CMakeLists.txt
index 047cf4d2..a5ec3fd7 100644
--- a/object detection/birds eye viewer/csharp/CMakeLists.txt
+++ b/object detection/birds eye viewer/csharp/CMakeLists.txt
@@ -45,7 +45,7 @@ set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
   "OpenGL.Net_0.8.4"
   "OpenGL.Net.CoreUI_0.8.4"
   "OpenCvSharp4.Windows_4.5.0.20201013"
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
 )
\ No newline at end of file
diff --git a/object detection/custom detector/cpp/opencv_dnn_yolov4/src/main.cpp b/object detection/custom detector/cpp/opencv_dnn_yolov4/src/main.cpp
index 1d4b0c13..f66c266e 100644
--- a/object detection/custom detector/cpp/opencv_dnn_yolov4/src/main.cpp
+++ b/object detection/custom detector/cpp/opencv_dnn_yolov4/src/main.cpp
@@ -71,7 +71,6 @@ int main(int argc, char** argv) {
     /// Opening the ZED camera before the model deserialization to avoid cuda context issue
     sl::Camera zed;
     sl::InitParameters init_parameters;
-    init_parameters.camera_resolution = sl::RESOLUTION::HD1080;
     init_parameters.depth_mode = sl::DEPTH_MODE::ULTRA;
     init_parameters.coordinate_system = sl::COORDINATE_SYSTEM::RIGHT_HANDED_Y_UP; // OpenGL's coordinate system is right_handed
@@ -91,7 +90,7 @@ int main(int argc, char** argv) {
     sl::ObjectDetectionParameters detection_parameters;
     detection_parameters.enable_tracking = true;
     // Let's define the model as custom box object to specify that the inference is done externally
-    detection_parameters.detection_model = sl::DETECTION_MODEL::CUSTOM_BOX_OBJECTS;
+    detection_parameters.detection_model = sl::OBJECT_DETECTION_MODEL::CUSTOM_BOX_OBJECTS;
     returned_state = zed.enableObjectDetection(detection_parameters);
     if (returned_state != sl::ERROR_CODE::SUCCESS) {
         print("enableObjectDetection", returned_state, "\nExit program.");
diff --git a/object detection/custom detector/cpp/tensorrt_yolov5_v5.0/src/yolov5.cpp b/object detection/custom detector/cpp/tensorrt_yolov5_v5.0/src/yolov5.cpp
index a4d830cf..2871fc42 100644
--- a/object detection/custom detector/cpp/tensorrt_yolov5_v5.0/src/yolov5.cpp
+++ b/object detection/custom detector/cpp/tensorrt_yolov5_v5.0/src/yolov5.cpp
@@ -358,7 +358,6 @@ int main(int argc, char** argv) {
     /// Opening the ZED camera before the model deserialization to avoid cuda context issue
     sl::Camera zed;
     sl::InitParameters init_parameters;
-    init_parameters.camera_resolution = sl::RESOLUTION::HD1080;
     init_parameters.sdk_verbose = true;
     init_parameters.depth_mode = sl::DEPTH_MODE::ULTRA;
     init_parameters.coordinate_system = sl::COORDINATE_SYSTEM::RIGHT_HANDED_Y_UP; // OpenGL's coordinate system is right_handed
@@ -378,8 +377,8 @@ int main(int argc, char** argv) {
     // Custom OD
     sl::ObjectDetectionParameters detection_parameters;
     detection_parameters.enable_tracking = true;
-    detection_parameters.enable_mask_output = false; // designed to give person pixel mask
-    detection_parameters.detection_model = sl::DETECTION_MODEL::CUSTOM_BOX_OBJECTS;
+    detection_parameters.enable_segmentation = false; // designed to give person pixel mask
+    detection_parameters.detection_model = sl::OBJECT_DETECTION_MODEL::CUSTOM_BOX_OBJECTS;
     returned_state = zed.enableObjectDetection(detection_parameters);
     if (returned_state != sl::ERROR_CODE::SUCCESS) {
         print("enableObjectDetection", returned_state, "\nExit program.");
diff --git a/object detection/custom detector/cpp/tensorrt_yolov5_v6.0/src/yolov5.cpp b/object detection/custom detector/cpp/tensorrt_yolov5_v6.0/src/yolov5.cpp
index c42d00e8..874e2812 100644
--- a/object detection/custom detector/cpp/tensorrt_yolov5_v6.0/src/yolov5.cpp
+++ b/object detection/custom detector/cpp/tensorrt_yolov5_v6.0/src/yolov5.cpp
@@ -364,7 +364,6 @@ int main(int argc, char** argv) {
     /// Opening the ZED camera before the model deserialization to avoid cuda context issue
     sl::Camera zed;
     sl::InitParameters init_parameters;
-    init_parameters.camera_resolution = sl::RESOLUTION::HD1080;
    init_parameters.sdk_verbose = true;
     init_parameters.depth_mode = sl::DEPTH_MODE::ULTRA;
     init_parameters.coordinate_system = sl::COORDINATE_SYSTEM::RIGHT_HANDED_Y_UP; // OpenGL's coordinate system is right_handed
@@ -384,8 +383,8 @@ int main(int argc, char** argv) {
     // Custom OD
     sl::ObjectDetectionParameters detection_parameters;
     detection_parameters.enable_tracking = true;
-    detection_parameters.enable_mask_output = false; // designed to give person pixel mask
-    detection_parameters.detection_model = sl::DETECTION_MODEL::CUSTOM_BOX_OBJECTS;
+    detection_parameters.enable_segmentation = false; // designed to give person pixel mask
+    detection_parameters.detection_model = sl::OBJECT_DETECTION_MODEL::CUSTOM_BOX_OBJECTS;
     returned_state = zed.enableObjectDetection(detection_parameters);
     if (returned_state != sl::ERROR_CODE::SUCCESS) {
         print("enableObjectDetection", returned_state, "\nExit program.");
diff --git a/object detection/image viewer/csharp/CMakeLists.txt b/object detection/image viewer/csharp/CMakeLists.txt
index 9e8c3f7e..38f84baf 100644
--- a/object detection/image viewer/csharp/CMakeLists.txt
+++ b/object detection/image viewer/csharp/CMakeLists.txt
@@ -41,6 +41,6 @@ set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
   "OpenGL.Net_0.8.4"
   "OpenGL.Net.CoreUI_0.8.4"
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
 )
\ No newline at end of file
diff --git a/plane detection/README.md b/plane detection/README.md
index badf578c..3e2ac351 100644
--- a/plane detection/README.md
+++ b/plane detection/README.md
@@ -1 +1,16 @@
-# Plane Detection Samples
\ No newline at end of file
+# Plane Detection Samples
+
+These samples show how to use the ZED SDK for **plane detection**. You can find additional information on the Plane Detection features in our [Documentation](https://www.stereolabs.com/docs/spatial-mapping/plane-detection/).
+
+## Overview
+
+This section contains the following code samples:
+
+- [Floor Plane Sample](./floor%20plane/): This sample shows two ways to automatically detect the floor plane and set it as the origin of a virtual view.
+
+- [Plane Detection Sample](./plane%20detection/): This sample shows how to perform plane detection with a ZED camera, using either the plane-at-hit functionality or the automatic floor plane detection.
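+
+Both detection modes boil down to a few calls (a minimal sketch, assuming ZED SDK 4.x; see each sample for the complete code):
+
+```cpp
+#include <sl/Camera.hpp>
+
+int main() {
+    sl::Camera zed;
+    sl::InitParameters init_parameters;
+    init_parameters.coordinate_units = sl::UNIT::METER;
+    if (zed.open(init_parameters) != sl::ERROR_CODE::SUCCESS) return 1;
+    zed.enablePositionalTracking();    // floor plane detection relies on tracking
+
+    sl::Plane plane;
+    sl::Transform reset_transform;     // receives the suggested new tracking origin
+    while (zed.grab() == sl::ERROR_CODE::SUCCESS) {
+        // Automatic floor plane detection...
+        if (zed.findFloorPlane(plane, reset_transform) == sl::ERROR_CODE::SUCCESS)
+            break;                     // plane.getNormal() / getCenter() / getPose() are now valid
+        // ...or "plane at hit" at a given image pixel:
+        // zed.findPlaneAtHit(sl::uint2(640, 360), plane);
+    }
+    zed.close();
+    return 0;
+}
+```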
diff --git a/plane detection/plane detection/csharp/CMakeLists.txt b/plane detection/plane detection/csharp/CMakeLists.txt
index 99299a47..14d07647 100644
--- a/plane detection/plane detection/csharp/CMakeLists.txt
+++ b/plane detection/plane detection/csharp/CMakeLists.txt
@@ -41,5 +41,5 @@ set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
   "OpenGL.Net_0.8.4"
   "OpenGL.Net.CoreUI_0.8.4"
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
 )
\ No newline at end of file
diff --git a/positional tracking/README.md b/positional tracking/README.md
index ed265356..0689c24e 100644
--- a/positional tracking/README.md
+++ b/positional tracking/README.md
@@ -1 +1,16 @@
-# Positional Tracking Samples
\ No newline at end of file
+# Positional Tracking
+
+These samples show how to use the ZED SDK for positional tracking. You can find additional information on the Positional Tracking module in our [Documentation](https://www.stereolabs.com/docs/positional-tracking/) and [API Reference](https://www.stereolabs.com/docs/api/group__PositionalTracking__group.html).
+
+## Overview
+
+This section contains the following code samples:
+
+- [FBX Export](./export/): This sample shows how to **export tracking data** generated by the ZED SDK tracking module in the **FBX format**.
+
+- [Positional Tracking](./positional%20tracking/): This sample shows how to use the ZED SDK's powerful **Positional Tracking module**. With this sample, users can gain a solid understanding of how to leverage the ZED stereo camera for accurate and reliable tracking of the device's position and orientation in 3D space.
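+
+At its core, the module takes only a few calls to use (a minimal sketch, assuming ZED SDK 4.x):
+
+```cpp
+#include <sl/Camera.hpp>
+
+int main() {
+    sl::Camera zed;
+    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;
+
+    sl::PositionalTrackingParameters tracking_params;
+    zed.enablePositionalTracking(tracking_params);
+
+    sl::Pose pose;
+    while (zed.grab() == sl::ERROR_CODE::SUCCESS) {
+        // Camera pose relative to the world frame (fixed at start-up)
+        if (zed.getPosition(pose, sl::REFERENCE_FRAME::WORLD) == sl::POSITIONAL_TRACKING_STATE::OK) {
+            sl::Translation t = pose.getTranslation();   // in the configured coordinate units
+            sl::Orientation o = pose.getOrientation();   // quaternion
+        }
+    }
+    zed.close();
+    return 0;
+}
+```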
diff --git a/positional tracking/positional tracking/csharp/CMakeLists.txt b/positional tracking/positional tracking/csharp/CMakeLists.txt
index 42feb822..08686427 100644
--- a/positional tracking/positional tracking/csharp/CMakeLists.txt
+++ b/positional tracking/positional tracking/csharp/CMakeLists.txt
@@ -42,5 +42,5 @@ set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
   "OpenGL.Net_0.8.4"
   "OpenGL.Net.CoreUI_0.8.4"
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
 )
\ No newline at end of file
diff --git a/recording/README.md b/recording/README.md
index da76765f..13e1964f 100644
--- a/recording/README.md
+++ b/recording/README.md
@@ -1,7 +1,13 @@
-# ZED SDK - SVO
-- **Recording**: Shows how to record a svo to be played later with the ZED SDK.
+# SVO
-- **Playback**: Shows how to read a recorded '.svo' file and how it can be controlled.
+These samples show how to use the ZED SDK to record and play back SVO files. SVO files are a proprietary format developed by Stereolabs, designed to facilitate the storage and replay of data captured by the ZED camera. You can find additional information on the **video recording features** in our [Documentation](https://www.stereolabs.com/docs/video/recording/).
+
+## Overview
+
+This section contains the following code samples:
+
+- [SVO Recording](./recording/): Shows how to **record** an SVO file to be played back later with the ZED SDK.
+- [SVO Playback](./playback/): Shows how to **read** a recorded '.svo' file and how playback can be controlled.
+- [SVO Export](./export/): Contains a list of samples that allow you to **export** ZED SDK data into different common file formats.
-- **Export**: Shows how to read a recorded '.svo' file to exports its data into different common formats.
\ No newline at end of file
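+
+Recording and playback are symmetric operations (a minimal sketch; `myVideo.svo` is a placeholder file name):
+
+```cpp
+#include <sl/Camera.hpp>
+
+int main() {
+    sl::Camera zed;
+    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;
+
+    // Record: every grabbed frame is appended to the SVO file
+    sl::RecordingParameters rec_params("myVideo.svo", sl::SVO_COMPRESSION_MODE::H264);
+    zed.enableRecording(rec_params);
+    for (int i = 0; i < 100 && zed.grab() == sl::ERROR_CODE::SUCCESS; ++i) {}
+    zed.disableRecording();
+    zed.close();
+
+    // Playback: open the recorded file as if it were a live camera
+    sl::InitParameters init_parameters;
+    init_parameters.input.setFromSVOFile("myVideo.svo");
+    return zed.open(init_parameters) == sl::ERROR_CODE::SUCCESS ? 0 : 1;
+}
+```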
diff --git a/recording/export/svo/csharp/CMakeLists.txt b/recording/export/svo/csharp/CMakeLists.txt
index 6bba660c..96075008 100644
--- a/recording/export/svo/csharp/CMakeLists.txt
+++ b/recording/export/svo/csharp/CMakeLists.txt
@@ -30,6 +30,6 @@ set_property(TARGET ${PROJECT_NAME} PROPERTY VS_DOTNET_REFERENCES
 set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
   "OpenCvSharp4.Windows_4.5.0.20201013"
 )
\ No newline at end of file
diff --git a/recording/playback/csharp/CMakeLists.txt b/recording/playback/csharp/CMakeLists.txt
index c7e7d055..a9eb2946 100644
--- a/recording/playback/csharp/CMakeLists.txt
+++ b/recording/playback/csharp/CMakeLists.txt
@@ -29,6 +29,6 @@ set_property(TARGET ${PROJECT_NAME} PROPERTY VS_DOTNET_REFERENCES
 set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
   "OpenCvSharp4.Windows_4.5.0.20201013"
 )
diff --git a/recording/recording/mono/csharp/CMakeLists.txt b/recording/recording/mono/csharp/CMakeLists.txt
index 3be7d198..06594b4f 100644
--- a/recording/recording/mono/csharp/CMakeLists.txt
+++ b/recording/recording/mono/csharp/CMakeLists.txt
@@ -30,5 +30,5 @@ set_property(TARGET ${PROJECT_NAME} PROPERTY VS_DOTNET_REFERENCES
 set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
 )
\ No newline at end of file
diff --git a/spatial mapping/README.md b/spatial mapping/README.md
index b0885793..8fffc86c 100644
--- a/spatial mapping/README.md
+++ b/spatial mapping/README.md
@@ -1,4 +1,15 @@
-# ZED SDK - Spatial Mapping
+# Spatial Mapping
-- **Basic**: A full presentation of the spatial mapping module, can create Mesh or Fused Point Cloud, all parameters are exposed. Mesh is overlaid to the current image.
-- **advanced point cloud mapping**: Perform only Fused Point Cloud mapping, displays the current result in a 3D OpenGL window.
+These samples provide a comprehensive overview of how to leverage the ZED SDK for effective spatial mapping using a single ZED camera. With these samples, users can gain a solid understanding of how to utilize the ZED stereo camera to create detailed 3D maps of indoor and outdoor environments, all without requiring the use of additional sensors or equipment. You can find additional information on the Spatial Mapping module in our [Documentation](https://www.stereolabs.com/docs/spatial-mapping/) and [API Reference](https://www.stereolabs.com/docs/api/group__SpatialMapping__group.html).
+
+## Overview
+
+This section contains the following code samples:
+
+- [Basic Spatial Mapping Sample](./spatial%20mapping/): A full presentation of the spatial mapping module, which can create meshes or fused point clouds; all parameters are exposed. The mesh is overlaid on the current image.
+
+- [Advanced Point Cloud Mapping](./advanced%20point%20cloud%20mapping/): Performs fused point cloud mapping only and displays the current result in a 3D OpenGL window.
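+
+The basic flow is the same in both samples (a minimal sketch, assuming ZED SDK 4.x; `map.obj` is a placeholder output path):
+
+```cpp
+#include <sl/Camera.hpp>
+
+int main() {
+    sl::Camera zed;
+    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;
+    zed.enablePositionalTracking();   // mapping requires positional tracking
+
+    sl::SpatialMappingParameters mapping_params;
+    mapping_params.map_type = sl::SpatialMappingParameters::SPATIAL_MAP_TYPE::MESH;
+    zed.enableSpatialMapping(mapping_params);
+
+    // Scan: the map is built incrementally while frames are grabbed
+    for (int i = 0; i < 500 && zed.grab() == sl::ERROR_CODE::SUCCESS; ++i) {}
+
+    sl::Mesh mesh;
+    zed.extractWholeSpatialMap(mesh); // blocking call at the end of the scan
+    mesh.save("map.obj", sl::MESH_FILE_FORMAT::OBJ);
+
+    zed.disableSpatialMapping();
+    zed.close();
+    return 0;
+}
+```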
diff --git a/spatial mapping/spatial mapping/csharp/CMakeLists.txt b/spatial mapping/spatial mapping/csharp/CMakeLists.txt
index 928d41a5..55725b98 100644
--- a/spatial mapping/spatial mapping/csharp/CMakeLists.txt
+++ b/spatial mapping/spatial mapping/csharp/CMakeLists.txt
@@ -40,5 +40,5 @@ set(CMAKE_SUPPRESS_REGENERATION true)
 set_property(TARGET ${PROJECT_NAME} PROPERTY VS_PACKAGE_REFERENCES
   "OpenGL.Net_0.8.4"
   "OpenGL.Net.CoreUI_0.8.4"
-  "Stereolabs.zed_3.7.*"
+  "Stereolabs.zed_4.*"
 )