PiGaze: Real-Time Eye Movement Tracking with Raspberry Pi

PiGaze is a lightweight, real-time eye movement tracking system built on a Raspberry Pi, a Pi Camera, and PyTorch-based deep learning models. It provides two models, a CNN and an FCNN, for accurate gaze estimation and direction prediction, making it suitable for a range of interactive applications.


Features

  • Real-Time Gaze Tracking: Tracks eye movement in real-time.
  • Direction Mapping: Detects gaze directions (Left, Right, Up, Down, Center).
  • Lightweight Models: Optimized for Raspberry Pi's limited computational resources.
  • Video Recording: Saves live-feed eye tracking output in video format.
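The direction-mapping feature can be illustrated with a small helper. This is a hypothetical sketch, not the project's actual mapping logic: it assumes the gaze estimate is reduced to a normalized offset from the frame center, and the `threshold` dead-zone value is an assumption.

```python
def gaze_direction(dx, dy, threshold=0.3):
    """Map a normalized gaze offset (dx, dy) to a direction label.

    dx, dy are offsets of the estimated gaze point from the frame center,
    scaled to [-1, 1]. `threshold` is a hypothetical dead-zone radius:
    offsets smaller than it in both axes count as looking at the center.
    """
    if abs(dx) < threshold and abs(dy) < threshold:
        return "Center"
    # Whichever axis dominates decides the reported direction.
    if abs(dx) >= abs(dy):
        return "Right" if dx > 0 else "Left"
    return "Down" if dy > 0 else "Up"
```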

Requirements

Hardware

  • Raspberry Pi 3/4
  • Pi Camera Module

Software

  • Raspberry Pi OS Bullseye or later, either 32-bit or 64-bit
  • Python 3.7+
  • PyTorch, OpenCV, dlib, Picamera2, NumPy
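The Python packages above could be captured in a `requirements.txt` along these lines. This is an illustrative sketch: the package names are the usual PyPI names, but exact versions (and whether Picamera2 comes from PyPI or from the `python3-picamera2` apt package on Raspberry Pi OS) depend on your setup.

```
# requirements.txt (illustrative; pin versions that match your Pi OS build)
torch
opencv-python
dlib
picamera2
numpy
```

On Raspberry Pi OS, installing Picamera2 via apt is often the more reliable route than pip.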

Dataset

PiGaze uses the MPIIFaceGaze Dataset for training and evaluation. This dataset provides extensive facial images annotated with gaze information, making it suitable for robust gaze estimation models.

Dataset Overview

  • Dataset Name: MPIIFaceGaze
  • Annotations: Includes gaze directions and facial landmarks for multiple participants.
  • Applications: Gaze estimation, eye-tracking, and head-pose analysis.

Download Instructions:

  1. Visit the MPIIFaceGaze Dataset page.
  2. Click "Access Dataset" and then "Download ZIP."
  3. Accept the dataset terms and conditions, then download the file.
  4. Extract the dataset into the data/ directory of this repository.
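Step 4 above can be scripted. This is a minimal sketch using only the standard library; the `data/` destination matches the instructions, while the function name is an illustrative assumption.

```python
import zipfile
from pathlib import Path

def extract_dataset(zip_path, dest="data"):
    """Extract the downloaded MPIIFaceGaze archive into the data/ directory."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest)
    return dest
```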

Additional Required Files:

  • Facial Landmarks Model:
    Download shape_predictor_68_face_landmarks.dat from this GitHub link and place it in the working directory.
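The 68-point landmark model locates the eyes at fixed indices (36–41 for the left eye, 42–47 for the right in dlib's 68-point scheme). A small helper like the following can crop an eye region from those points; the function name and `pad` margin are illustrative assumptions, not the project's actual code.

```python
# Indices of the eye landmarks in dlib's 68-point scheme.
LEFT_EYE = range(36, 42)
RIGHT_EYE = range(42, 48)

def eye_bbox(landmarks, indices, pad=5):
    """Bounding box (x0, y0, x1, y1) around one eye, with `pad` pixels margin.

    `landmarks` is a sequence of 68 (x, y) points, e.g. converted from the
    output of dlib's shape_predictor. `pad` is a hypothetical margin so the
    crop includes a little context around the eye.
    """
    xs = [landmarks[i][0] for i in indices]
    ys = [landmarks[i][1] for i in indices]
    return (min(xs) - pad, min(ys) - pad, max(xs) + pad, max(ys) + pad)
```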

Running The CNN Model

  1. Install the dependencies on a computer suitable for training.
  2. Confirm the dataset has been downloaded into the data/ directory.
  3. Run MPII.py to train the model and generate a .pth file. (Skip this step if using a pretrained model; just make sure the .pth file is in the same directory as MPII.py.)
  4. Install the dependencies on the Raspberry Pi and confirm the camera is connected.
  5. Transfer the .pth and pi.py files to the Raspberry Pi.
  6. Confirm that pi.py uses the same model architecture that was used for training.
  7. Run pi.py for real-time gaze-tracking predictions!
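Step 6 matters because `load_state_dict` fails unless the class definition in pi.py matches the one trained in MPII.py. The sketch below shows the save/reload round trip with a hypothetical minimal CNN; the actual architecture lives in MPII.py, and the layer sizes, 64×64 input, and `gaze_cnn.pth` filename here are assumptions.

```python
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    """Hypothetical minimal CNN. pi.py must define the same architecture
    that MPII.py used for training, or load_state_dict will fail."""
    def __init__(self, num_classes=5):  # Left, Right, Up, Down, Center
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(16 * 16 * 16, num_classes)  # assumes 64x64 input

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

# Training side (MPII.py): torch.save(model.state_dict(), "gaze_cnn.pth")
# Pi side (pi.py): reload with the identical class definition.
def load_model(path):
    model = GazeCNN()
    model.load_state_dict(torch.load(path, map_location="cpu"))
    model.eval()  # disable dropout/batch-norm training behavior
    return model
```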

Running The FCNN Model

  1. Install the dependencies on a computer suitable for training.
  2. Confirm the dataset has been downloaded into the data/ directory.
  3. Run PiGaze_model.py to train the model and generate a .pth file. (Skip this step if using a pretrained model; just make sure the .pth file is in the same directory as PiGaze_model.py.)
  4. Confirm that PiGaze_model_test.py uses the same model architecture that was used for training.
  5. Run PiGaze_model_test.py to perform gaze predictions!
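The prediction step can be sketched with a hypothetical fully connected network over a flattened eye crop. As with the CNN, PiGaze_model_test.py must mirror the architecture from PiGaze_model.py; the layer sizes, 64×64 input, and label order below are illustrative assumptions.

```python
import torch
import torch.nn as nn

DIRECTIONS = ["Left", "Right", "Up", "Down", "Center"]  # assumed label order

class GazeFCNN(nn.Module):
    """Hypothetical fully connected network over a flattened 64x64 eye crop."""
    def __init__(self, in_features=64 * 64, num_classes=len(DIRECTIONS)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_features, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        return self.net(x)

def predict_direction(model, eye_tensor):
    """Map the highest-scoring logit to a direction label."""
    with torch.no_grad():
        logits = model(eye_tensor)
    return DIRECTIONS[logits.argmax(dim=1).item()]
```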
