This project is a Python implementation of a pose-graph neural network for hand gesture recognition, built on the MediaPipe Hand Landmarks model and a graph neural network (GNN).
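To make the "pose graph" idea concrete, here is a small sketch (an illustration, not code from this repository) of how the 21 MediaPipe hand landmarks can be turned into a graph for a GNN. The edge list follows MediaPipe's standard hand-skeleton connectivity:

```python
import numpy as np

# MediaPipe's hand model outputs 21 landmarks per hand; landmark 0 is the
# wrist. The edge list below mirrors MediaPipe's HAND_CONNECTIONS skeleton.
HAND_EDGES = [
    (0, 1), (1, 2), (2, 3), (3, 4),         # thumb
    (0, 5), (5, 6), (6, 7), (7, 8),         # index finger
    (5, 9), (9, 10), (10, 11), (11, 12),    # middle finger
    (9, 13), (13, 14), (14, 15), (15, 16),  # ring finger
    (13, 17), (17, 18), (18, 19), (19, 20), # pinky
    (0, 17),                                # wrist to pinky base
]

def build_adjacency(num_nodes=21, edges=HAND_EDGES):
    """Symmetric adjacency matrix with self-loops, as a GNN layer consumes it."""
    A = np.eye(num_nodes)
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    return A

A = build_adjacency()
print(A.shape)  # (21, 21)
```

Each landmark becomes a graph node (with its x, y, z coordinates as features), and the skeleton edges define the message-passing structure.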
- Clone this repository:
git clone https://github.com/RenzoTsai/PoseGNN.git
- Navigate to the project directory:
cd PoseGNN
- Create a virtual environment:
python3 -m venv venv
- Activate the virtual environment:
source venv/bin/activate
- Install the required packages:
pip install -r requirements.txt
- Run the main Jupyter notebook: `jupyter notebook main.ipynb`. The notebook contains the code for training and testing the model.
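As a rough orientation before opening the notebook, the core GNN operation on a hand graph can be sketched as a single graph-convolution layer. This is an illustrative NumPy example under assumed dimensions, not the notebook's actual architecture:

```python
import numpy as np

# One graph-convolution layer: H' = ReLU(D^-1/2 A D^-1/2 H W), applied to
# 21 hand landmarks with (x, y, z) coordinates as node features.
# Dimensions and weights here are illustrative assumptions.
rng = np.random.default_rng(0)

num_nodes, in_dim, hidden_dim = 21, 3, 16
A = np.eye(num_nodes)                       # adjacency with self-loops
                                            # (skeleton edges omitted here)
H = rng.normal(size=(num_nodes, in_dim))    # landmark coordinates
W = rng.normal(size=(in_dim, hidden_dim))   # learnable weights

deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt         # symmetric normalization

H_next = np.maximum(A_hat @ H @ W, 0.0)     # ReLU activation
print(H_next.shape)  # (21, 16)
```

Stacking a few such layers and pooling the node features into a single vector yields a per-gesture classification head.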
The dataset used in this project is the ASL Alphabet dataset. It consists of images of people making sign language gestures for the 26 letters of the alphabet and the digits 0-9.
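Given that description, the label set would comprise 36 gesture classes. A minimal sketch of a label-to-index mapping, assuming class names are the uppercase letters and digit characters (an assumption about this dataset's folder naming):

```python
import string

# Hypothetical label set inferred from the dataset description:
# 26 letters plus the digits 0-9, i.e. 36 gesture classes.
CLASSES = list(string.ascii_uppercase) + [str(d) for d in range(10)]
LABEL_TO_INDEX = {name: i for i, name in enumerate(CLASSES)}

print(len(CLASSES))          # 36
print(LABEL_TO_INDEX["A"])   # 0
print(LABEL_TO_INDEX["0"])   # 26
```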