- Sweden, Colombia
- www.luisqtr.com
- @luisqtr0
Lists (6)
- 👓 Applications: Interesting open-source apps ready to test (Unity or Python)
- 📚 Datasets: Open datasets for future research
- ⚙ Frameworks: Software frameworks for Unity, Python, ML, or VR
- 🌟 Inspiration/Random
- 🧻 Papers: Scientific papers
- 📂 Research Projects
Stars
Visualize streams of multimodal data. Free, fast, easy to use, and simple to integrate. Built in Rust.
The Locomotion Evaluation Testbed VR (or LET-VR) is a tool that helps select the most suitable locomotion technique to adopt in a given VR application scenario.
Statistical package in Python based on Pandas
VR Builder lets you create better VR experiences faster. Our code is open source and designed to be extended. Plus, our GUI empowers everyone to create fully functional VR apps - without writing code.
This tool finds unreferenced assets by scanning all files in your Unity project.
Ocean is the in-house framework for Computer Vision (CV) and Augmented Reality (AR) applications at Meta. It is platform independent and is mainly implemented in C/C++.
PLUME is a software toolbox that allows for the exhaustive record of XR behavioral data (including synchronous physiological signals), their offline interactive replay and analysis.
Open-source scientific and technical publishing system built on Pandoc.
Unity FreeD allows you to move a GameObject based on incoming data in the FreeD format.
The Virtual Reality Scientific Toolkit facilitates the creation and execution of experiments in VR environments by making object tracking and data collection easier.
A modular high-level library to train embodied AI agents across a variety of tasks and environments.
A simple Sphinx extension to include information about multiple documentation versions.
An integration of the LabStreamingLayer framework for Unity3D
Phanto is a showcase of the Meta Quest Mixed Reality APIs. This project demonstrates how to use Meshes.
Unique identification of 50,000+ virtual reality users from their head and hand motion data
Source code for "Is that my Heartbeat? Measuring and Understanding Modality-dependent Cardiac Interoception in Virtual Reality" in IEEE TVCG (ISMAR 2023)
Exploring how to infer VR users' affective states from their EEG activity in real-time. We used supervised learning and conducted the experiment in Virtual Reality. Two feature selection methods ar…
This dataset contains over 110 hours of motion, eye-tracking and physiological data from 71 players of the virtual reality game “Half-Life: Alyx”. Each player played the game on two separate days f…
Exploring unprecedented avenues for data harvesting in the metaverse
[WACV 2024] LibreFace: An Open-Source Toolkit for Deep Facial Expression Analysis
PhysioKit: Open-source, accessible Physiological Computing Toolkit [Sensors 2023]
Scripts repository for analysis of DRAP database
WebXR immersive gardening experience.
Oculus Interaction SDK showcase demonstrating the use of Interaction SDK in Unity with hand tracking. This project contains the interactions used in the "First Hand" demo available on App Lab. The …