🤟 Sign Language Interpreter

Sign Language Interpreter is an AI-powered system that translates hand gestures into natural language text and speech. Using MediaPipe for hand detection, KNN for gesture classification, and Gemini AI for structuring sentences, it provides a user-friendly interface with Tkinter. The app supports webcam and file upload inputs, making sign language communication more accessible and interactive.



Example Interface

*(screenshot: Sign Language Interpreter GUI)*

Features

  • Real-time Gesture Detection: Uses MediaPipe to detect hand landmarks live from a webcam or from uploaded videos/images.
  • Accurate Gesture Classification: A KNN model trained on hand-gesture images recognizes the letters A–Z.
  • Prediction Buffering: Stabilizes predictions with a 15-frame buffer to avoid jittery results.
  • Natural Language Structuring: Sends recognized letters to Gemini AI to generate meaningful sentences.
  • GUI Display & Text-to-Speech: Displays recognized sentences in the Tkinter window and reads them aloud.
  • Multiple Input Modes: Supports both a live webcam feed and uploaded media files.
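The prediction buffering above can be sketched as a majority vote over the most recent per-frame predictions. This is a minimal illustration of the idea, not the repository's actual code; the class name, buffer size default, and agreement threshold are assumptions:

```python
from collections import Counter, deque

class PredictionBuffer:
    """Majority-vote smoothing over the last N per-frame letter predictions."""

    def __init__(self, size=15, min_agreement=0.6):
        self.frames = deque(maxlen=size)    # sliding window of raw predictions
        self.min_agreement = min_agreement  # fraction of frames that must agree

    def push(self, letter):
        """Add one per-frame prediction; return a stable letter or None."""
        self.frames.append(letter)
        if len(self.frames) < self.frames.maxlen:
            return None                     # window not yet full
        top, count = Counter(self.frames).most_common(1)[0]
        if count / len(self.frames) >= self.min_agreement:
            return top                      # stable, de-jittered prediction
        return None                         # frames still disagree too much

buf = PredictionBuffer(size=15)
stable = None
for raw in ["A"] * 12 + ["B"] * 3:          # 12 of 15 frames agree on "A"
    stable = buf.push(raw)
print(stable)  # "A"
```

Only letters that win a clear majority of the window are emitted, so a few misclassified frames do not corrupt the output sentence.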

Technologies Used

  • Frontend: Tkinter (Python GUI)
  • Backend: Python, OpenCV
  • Hand Detection: MediaPipe
  • Gesture Classification: scikit-learn (KNN)
  • Language Processing: Gemini AI
  • Text-to-Speech: pyttsx3
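As a rough illustration of how these pieces fit together: MediaPipe reports 21 hand landmarks, each with (x, y, z) coordinates, which can be flattened into a 63-value feature vector for scikit-learn's `KNeighborsClassifier`. The sketch below trains on synthetic vectors in place of real landmark data (the repository trains on its own gesture images), so the helper names and class labels are purely illustrative:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for MediaPipe output: 21 landmarks x (x, y, z) = 63 features.
def fake_landmarks(center):
    return center + rng.normal(scale=0.01, size=63)

# Two made-up gesture classes, 20 samples each, clustered far apart.
X = np.array([fake_landmarks(0.2) for _ in range(20)]
             + [fake_landmarks(0.8) for _ in range(20)])
y = ["A"] * 20 + ["B"] * 20

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X, y)

print(knn.predict([fake_landmarks(0.8)])[0])  # "B"
```

Using landmark coordinates rather than raw pixels keeps the feature vector small and largely invariant to background clutter, which is what makes a simple KNN workable here.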

Installation

Prerequisites

  • Python 3.8–3.10 installed
  • Basic command line knowledge

Steps

  1. Clone the repository:
    git clone https://github.com/Eshita-Badhe/Sign-Language-Interpreter.git
  2. Navigate to the project directory:
    cd Sign-Language-Interpreter
  3. Install required packages:
    pip install -r requirements.txt
  4. Run the application:
    python app.py

Usage

  1. Launch the app.
  2. Select input mode: Webcam or Upload a media file.
  3. Make hand gestures representing letters, then press 'q' when finished.
  4. Wait as the app detects, predicts letters, buffers results, and forms sentences.
  5. View the recognized sentence on the GUI and listen to the audio output.
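The sentence-forming step in the flow above can be sketched as a prompt sent to Gemini. The prompt wording, function names, and model name below are assumptions for illustration, not the repository's actual values; `structure_sentence` requires the `google-generativeai` package and a valid API key:

```python
def build_prompt(letters):
    """Turn buffered letters into a prompt asking Gemini to form a sentence."""
    stream = " ".join(letters)
    return ("The following letters were recognized from sign language, "
            "in order: " + stream + ". Rewrite them as a natural English "
            "sentence, fixing any obvious misdetections.")

def structure_sentence(letters, api_key):
    """Send the recognized letters to Gemini (needs a valid API key)."""
    import google.generativeai as genai           # imported lazily
    genai.configure(api_key=api_key)
    model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name
    return model.generate_content(build_prompt(letters)).text

print(build_prompt(["H", "E", "Y"]))
```

Delegating sentence construction to the language model means the recognizer only has to get individual letters mostly right; the model can smooth over occasional misread characters.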

Example

*(screenshot: Sign Language Interpreter in action)*

Contributing

Contributions are welcome! Feel free to fork the repo, create branches for your features, and submit pull requests with clear explanations.

Future Improvements

  • Replace KNN with CNN or other deep learning models for better accuracy.
  • Implement hand cropping and preprocessing to improve gesture focus.
  • Expand vocabulary to include numbers, special signs, and phrases.
  • Add GUI features like sentence editing and correction.
  • Enable multilingual support using Gemini AI capabilities.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

For feedback or collaboration, reach out to:
