An Interactive (Proof of Concept) Web Demo showcasing Touchless Interactions using the MediaPipe Machine Learning Library.
This project tests and demonstrates the new MediaPipe Hand Landmarker task from MediaPipe Solutions, which detects 21 hand landmarks (key points) on each hand with high precision. These landmarks power an interactive web app that lets users perform contactless interactions with the interface using simple hand gestures. Best experienced in well-lit environments and on larger screens.
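As a rough illustration of how the 21 landmarks can drive touchless interactions, the sketch below derives a "pinch" gesture from the normalized distance between the thumb tip (landmark 4) and index fingertip (landmark 8). The landmark indices follow the standard MediaPipe hand model; the function name and threshold are illustrative assumptions, not code from this project.

```javascript
// Hedged sketch: the Hand Landmarker returns 21 normalized landmarks per hand
// (0 = wrist, 4 = thumb tip, 8 = index fingertip, per the MediaPipe hand model).
// A simple "pinch" gesture can be derived from the thumb-index distance.
function isPinching(landmarks, threshold = 0.05) {
  const thumbTip = landmarks[4];
  const indexTip = landmarks[8];
  const dx = thumbTip.x - indexTip.x;
  const dy = thumbTip.y - indexTip.y;
  // Landmarks are normalized to [0, 1], so the threshold is resolution-independent.
  return Math.hypot(dx, dy) < threshold;
}

// Example with synthetic landmark data (only indices 4 and 8 matter here):
const landmarks = Array.from({ length: 21 }, () => ({ x: 0.5, y: 0.5 }));
landmarks[4] = { x: 0.5, y: 0.5 };
landmarks[8] = { x: 0.52, y: 0.51 };
console.log(isPinching(landmarks)); // true — fingertips are close together
```

In the real app the landmark array would come from the Hand Landmarker's per-frame detection results rather than synthetic data.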
ⓘ All video input is processed directly on the client side, and frame data is discarded as soon as inference returns, making the app GDPR compliant.
This project requires Node.js to be installed on your local machine.
⚠️ Webcam required for hand detection and gesture recognition. Please ensure your device has a functioning webcam.
- Clone the repository on your local machine:
  git clone https://github.com/googlesamples/mediapipe.git
- Navigate into the project directory:
  cd mediapipe/tutorials/atm_playground
- Install the necessary dependencies:
  npm install
- Start the development server:
  npm start
- Open http://localhost:3000 in your browser to view the app.
🚀 View a live demo in your browser here.
This project was created using:
- React
- Tailwind CSS
- MediaPipe Hand Landmarker
- Redux
- React Redux
- PostCSS
- React Toastify
- React Confetti
- Figma
Precaching is enabled, so the app will work offline after the first load. 🎉
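For reference, offline precaching in a React project is typically wired up via the Create React App service worker template. The snippet below is a minimal sketch of that standard pattern; the file path and the `serviceWorkerRegistration` module name are assumptions from the CRA template, not verified against this repository.

```javascript
// src/index.js (sketch of the standard CRA PWA template — assumed, not verified)
import * as serviceWorkerRegistration from "./serviceWorkerRegistration";

// Calling register() (instead of the default unregister()) opts the app into
// precaching: the generated service worker caches the build assets on first
// load, so subsequent visits work offline.
serviceWorkerRegistration.register();
```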
Distributed under the Apache License 2.0. See LICENSE for more information.