PiGaze is a lightweight, real-time eye-movement tracking system built on a Raspberry Pi, a Pi Camera, and PyTorch-based deep learning models. Two models, a CNN and an FCNN, handle gaze estimation and direction prediction respectively, making the system suitable for a variety of interactive applications.
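The actual architectures live in the training scripts; as a rough, hypothetical sketch of what a lightweight CNN gaze-estimation model might look like in PyTorch (layer sizes and the grayscale 64x64 input are illustrative choices, not the project's actual ones):

```python
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    """Illustrative lightweight CNN: grayscale eye crop -> (pitch, yaw) gaze angles."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64x64 -> 32x32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32x32 -> 16x16
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, 2),  # two regression outputs: pitch, yaw
        )

    def forward(self, x):
        return self.head(self.features(x))

model = GazeCNN()
out = model(torch.zeros(1, 1, 64, 64))  # one dummy 64x64 grayscale eye image
```

Keeping the network this shallow is what makes inference feasible on a Pi's CPU.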
- Real-Time Gaze Tracking: Tracks eye movement in real-time.
- Direction Mapping: Detects gaze directions (Left, Right, Up, Down, Center).
- Lightweight Models: Optimized for Raspberry Pi's limited computational resources.
- Video Recording: Saves live-feed eye tracking output in video format.
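Direction mapping can be done by thresholding the predicted gaze angles. A minimal sketch (the 0.15 rad dead zone and the (pitch, yaw) sign convention here are assumptions for illustration, not the project's actual values):

```python
def map_direction(pitch, yaw, threshold=0.15):
    """Map gaze angles (radians) to a coarse direction label.

    Assumed convention: positive yaw = looking right, positive pitch = looking up.
    The 0.15 rad (~8.6 degree) dead zone for "Center" is an illustrative choice.
    """
    if abs(pitch) <= threshold and abs(yaw) <= threshold:
        return "Center"
    # Pick whichever axis deviates more from straight ahead.
    if abs(yaw) >= abs(pitch):
        return "Right" if yaw > 0 else "Left"
    return "Up" if pitch > 0 else "Down"

print(map_direction(0.0, 0.0))   # Center
print(map_direction(0.05, 0.4))  # Right
```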
- Raspberry Pi 3/4
- Pi Camera Module
- Raspberry Pi OS Bullseye or later, either 32-bit or 64-bit
- Python 3.7+
- PyTorch, OpenCV, Dlib, Picamera2, Numpy
PiGaze uses the MPIIFaceGaze Dataset for training and evaluation. This dataset provides extensive facial images annotated with gaze information, making it suitable for robust gaze estimation models.
- Dataset Name: MPIIFaceGaze
- Annotations: Includes gaze directions and facial landmarks for multiple participants.
- Applications: Gaze estimation, eye-tracking, and head-pose analysis.
- Visit the MPIIFaceGaze Dataset page.
- Click "Access Dataset" and then "Download ZIP."
- Accept the dataset terms and conditions, then download the file.
- Extract the dataset into the `data/` directory of this repository.
- Facial Landmarks Model: Download `shape_predictor_68_face_landmarks.dat` from this GitHub link and place it in the working directory.
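Once extracted, the per-participant annotation files in MPIIFaceGaze (`pXX/pXX.txt`) are plain whitespace-separated text with the image path in the first column. A minimal parsing helper (the example line below is made up; consult the dataset's own readme for the meaning of each numeric column):

```python
def parse_annotation_line(line):
    """Split one whitespace-separated annotation line into (image_path, floats).

    MPIIFaceGaze annotation lines start with the image path; the numeric
    columns that follow are documented in the dataset's own readme, so this
    helper only separates the path from the values.
    """
    parts = line.strip().split()
    return parts[0], [float(v) for v in parts[1:]]

# Hypothetical annotation line, for illustration only.
path, values = parse_annotation_line("day01/0001.jpg 512.0 384.0 0.1 -0.2")
```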
- Ensure all dependencies are installed on a suitable training computer
- Ensure the dataset has been downloaded and extracted
- Run `MPII.py` to train the model and generate a `.pth` file (skip this step if using a pretrained model; just make sure the `.pth` file is in the same directory as `MPII.py`)
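The training step boils down to a standard PyTorch loop that ends with `torch.save`. A stripped-down sketch (the stand-in model, loss, data shapes, and file name are placeholders, not the ones used in `MPII.py`):

```python
import torch
import torch.nn as nn

def train(model, loader, epochs=1, lr=1e-3):
    """Minimal gaze-regression training loop; returns the last batch's loss."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    loss = torch.tensor(0.0)
    model.train()
    for _ in range(epochs):
        for images, gazes in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), gazes)
            loss.backward()
            opt.step()
    return loss.item()

# Tiny stand-in model and synthetic batch, just to show the shape of the loop.
model = nn.Sequential(nn.Flatten(), nn.Linear(36 * 60, 2))  # eye patch -> (pitch, yaw)
loader = [(torch.randn(4, 1, 36, 60), torch.randn(4, 2))]
final_loss = train(model, loader)
torch.save(model.state_dict(), "gaze_model.pth")  # the .pth file pi.py later loads
```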
- Ensure all dependencies are installed on the Raspberry Pi (3 or 4) and that the camera is connected
- Transfer the `.pth` and `pi.py` files to the Raspberry Pi
- Ensure `pi.py` uses the same model architecture that was used for training
- Run `pi.py` for real-time gaze tracking predictions!
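On the Pi, this step amounts to loading the `.pth` weights into the same architecture and running a capture-preprocess-predict loop. A hedged sketch using Picamera2 (the `GazeCNN` class, file name, and fixed eye crop are placeholders; the real script must define the architecture that was trained and locate the eyes with Dlib landmarks):

```python
import numpy as np

def preprocess(frame, mean=0.5, std=0.5):
    """Scale a uint8 image to [0, 1] and normalize; returns a float32 array."""
    return (frame.astype("float32") / 255.0 - mean) / std

def run_on_pi():
    """Capture/predict loop; only runnable on a Pi with a camera attached."""
    # Imports kept inside so the sketch can be read (and tested) off the Pi.
    import torch
    from picamera2 import Picamera2

    model = GazeCNN()  # placeholder: must match the trained architecture
    model.load_state_dict(torch.load("gaze_model.pth", map_location="cpu"))
    model.eval()

    cam = Picamera2()
    cam.configure(cam.create_preview_configuration(main={"size": (640, 480)}))
    cam.start()
    while True:
        frame = cam.capture_array()   # RGB frame as a numpy array
        eye = frame[:64, :64, 0]      # placeholder crop: use Dlib landmarks instead
        x = torch.from_numpy(preprocess(eye)).view(1, 1, 64, 64)
        with torch.no_grad():
            pitch, yaw = model(x)[0].tolist()
        print(f"pitch={pitch:.2f} yaw={yaw:.2f}")
```

Loading with `map_location="cpu"` matters when the `.pth` was trained on a GPU machine, since the Pi has no CUDA device.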
- Ensure all dependencies are installed on a suitable computer
- Ensure the dataset has been downloaded and extracted
- Run `PiGaze_model.py` to train the model and generate a `.pth` file (skip this step if using a pretrained model; just make sure the `.pth` file is in the same directory as `PiGaze_model.py`)
- Ensure `PiGaze_model_test.py` uses the same model architecture that was used for training
- Run `PiGaze_model_test.py` to perform gaze predictions!
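A common way to sanity-check the predictions is mean angular error between predicted and ground-truth gaze vectors. A small numpy sketch (assuming gaze is represented as 3D vectors; if the model outputs (pitch, yaw) angles, convert them to vectors first):

```python
import numpy as np

def angular_error_deg(pred, true):
    """Mean angle (degrees) between rows of two (N, 3) gaze-vector arrays."""
    pred = pred / np.linalg.norm(pred, axis=1, keepdims=True)
    true = true / np.linalg.norm(true, axis=1, keepdims=True)
    cos = np.clip(np.sum(pred * true, axis=1), -1.0, 1.0)  # clip guards arccos domain
    return float(np.degrees(np.arccos(cos)).mean())

# Identical vectors -> 0 degrees; orthogonal vectors -> 90 degrees.
err = angular_error_deg(np.array([[0.0, 0.0, 1.0]]), np.array([[0.0, 1.0, 0.0]]))
```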