A real-time computer vision project that allows users to control music playback and audio parameters (such as volume, pitch and speed) using hand gestures. This hands-free interface leverages hand tracking and gesture recognition to create an intuitive, touchless musical experience.
- 🖐️ Real-time hand detection and gesture recognition
- ✋ Left and Right hand classification
- 🤲 Multi-hand support (several hands can be detected independently)
- 🎚️ Control music volume, pitch, speed and play/pause with intuitive hand gestures
- 🎶 Real-time audio processing
- 🎯 Visual cursors and indicators
- ⚡ Smooth, real-time performance via multithreading
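The multithreaded design mentioned above typically separates frame capture from gesture processing so the camera loop never blocks on inference. Below is a minimal, hedged sketch of that producer–consumer pattern; the iterable frame source stands in for `cv2.VideoCapture`, and all names are illustrative rather than the project's actual code:

```python
import threading
import queue

def capture_frames(frame_source, frames, stop):
    """Producer: grab frames, dropping the oldest when the queue is full
    so processing lag never stalls the capture loop."""
    for frame in frame_source:
        if stop.is_set():
            break
        if frames.full():
            try:
                frames.get_nowait()  # discard a stale frame
            except queue.Empty:
                pass
        frames.put(frame)
    stop.set()  # signal the consumer that capture is done

def process_frames(frames, stop, results):
    """Consumer: run the (stand-in) gesture recognition off the camera thread."""
    while not stop.is_set() or not frames.empty():
        try:
            frame = frames.get(timeout=0.1)
        except queue.Empty:
            continue
        results.append(frame * 2)  # placeholder for hand tracking + audio update

frames = queue.Queue(maxsize=2)
stop = threading.Event()
results = []
producer = threading.Thread(target=capture_frames, args=(range(10), frames, stop))
consumer = threading.Thread(target=process_frames, args=(frames, stop, results))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(results)  # processed frames; some may have been dropped under load
```

Dropping stale frames (rather than letting the queue grow) is what keeps the on-screen cursors responsive even when recognition is momentarily slow.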
- Python 3.10+
- OpenCV
- Mediapipe
- Numpy
- Soundfile
- Sounddevice
- Pyrubberband
- Python 3.10+
- pyrubberband requires the system-level Rubber Band Library to be installed
- Webcam
- A decent CPU for real-time performance
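A common way to derive a continuous parameter such as volume from a hand pose is the distance between the thumb and index fingertips. The sketch below assumes MediaPipe-style normalized landmark coordinates in `[0, 1]`; the function name and the calibration bounds (`min_dist`, `max_dist`) are illustrative assumptions, not the project's actual values:

```python
import numpy as np

def pinch_to_volume(thumb_tip, index_tip, min_dist=0.03, max_dist=0.25):
    """Map the normalized thumb-to-index fingertip distance to a 0..1 gain.
    np.interp clamps values outside [min_dist, max_dist] to the endpoints."""
    dist = float(np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip)))
    return float(np.interp(dist, [min_dist, max_dist], [0.0, 1.0]))

print(pinch_to_volume((0.5, 0.5), (0.5, 0.5)))  # fingers touching → 0.0
print(pinch_to_volume((0.2, 0.2), (0.6, 0.5)))  # fully spread → 1.0
```

The same mapping scheme extends naturally to pitch or speed by changing the output range.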
- Make sure your camera has a good view of your hand.
- Good lighting improves hand detection accuracy.
- If only one hand is detected, it acts as the audio controller by default. When both hands are detected, the right hand is the audio controller and the left hand is used as the media player.
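The hand-role rule above (a single hand controls audio; with both hands present, the right hand controls audio and the left acts as the media player) can be sketched as a small selection function over MediaPipe-style handedness labels. The function and dictionary names are illustrative, not the project's actual code:

```python
def assign_roles(detected_hands):
    """detected_hands: list of 'Left'/'Right' labels, as reported by
    MediaPipe's handedness classification.
    Returns a dict mapping each role to a hand label (or None)."""
    roles = {"audio_controller": None, "media_player": None}
    if len(detected_hands) == 1:
        roles["audio_controller"] = detected_hands[0]  # single hand controls audio
    elif "Right" in detected_hands and "Left" in detected_hands:
        roles["audio_controller"] = "Right"
        roles["media_player"] = "Left"
    return roles

print(assign_roles(["Left"]))           # single hand → audio controller
print(assign_roles(["Right", "Left"]))  # right controls audio, left plays/pauses
```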
- Clone or download the repository
```bash
git clone https://github.com/Alicia105/AIRMusicController.git
```
- Navigate to the project directory
```bash
cd AIRMusicController
```
- Install needed dependencies
```bash
pip install -r requirements.txt
```
- Navigate to the project source code directory
```bash
cd src
```
- Launch the app
```bash
python main.py
```