FlexLingo is a final-year IoT project: a flex-sensor-based sign language translator. Random Forest and BiLSTM models interpret real-time hand gestures captured by an Arduino smart glove, translating them into text or spoken words through a React web interface.
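Before any model sees the data, the glove's per-finger readings have to reach the host as a structured stream. A minimal sketch of one way this could work, assuming a comma-separated serial frame format and a 10-bit Arduino ADC (the `FLEX` prefix and frame layout are hypothetical, not taken from the project):

```python
def parse_flex_frame(line: str, n_sensors: int = 5) -> list[float]:
    """Parse one serial frame like 'FLEX,512,430,610,388,501' into
    per-finger flex readings normalized to [0, 1].

    Assumes a 10-bit ADC (values 0..1023), one value per finger.
    """
    parts = line.strip().split(",")
    if parts[0] != "FLEX" or len(parts) != n_sensors + 1:
        raise ValueError(f"malformed frame: {line!r}")
    return [int(v) / 1023.0 for v in parts[1:]]

# One parsed frame becomes the feature vector fed to the classifier.
features = parse_flex_frame("FLEX,512,430,610,388,501")
```

Rejecting malformed frames at the parser keeps noisy serial glitches from ever reaching the classifier.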
An assistive-technology system that enables non-verbal individuals to communicate through hand gestures. Flex sensors on an Arduino glove capture finger movements, a KNN classifier performs real-time gesture recognition, and text-to-speech produces the voice output. Supports 11 predefined phrases with an expandable gesture library.
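The KNN step can be sketched in a few lines: each gesture is a vector of normalized flex readings, and classification is a majority vote among the k nearest calibration samples. The gesture labels and calibration values below are illustrative, not from the project:

```python
import math
from collections import Counter

def knn_classify(sample, train_X, train_y, k=3):
    """Classify one flex-sensor reading vector by majority vote of
    its k nearest calibration samples (Euclidean distance)."""
    dists = sorted(
        (math.dist(sample, x), label) for x, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy calibration set: two gestures, five normalized flex readings each.
train_X = [
    [0.10, 0.10, 0.10, 0.10, 0.10],  # open hand  -> "hello"
    [0.15, 0.10, 0.12, 0.10, 0.10],
    [0.90, 0.90, 0.85, 0.90, 0.90],  # closed fist -> "help"
    [0.88, 0.92, 0.90, 0.87, 0.91],
]
train_y = ["hello", "hello", "help", "help"]

phrase = knn_classify([0.12, 0.11, 0.10, 0.10, 0.12], train_X, train_y)
```

With only 11 phrases and small calibration sets, brute-force distance computation is fast enough for real-time use; the recognized phrase would then be handed to the text-to-speech stage.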
A real-time embedded system for intelligent sitting-posture detection and monitoring, using an ESP32, machine learning, and multimodal sensor fusion to promote ergonomic health and prevent musculoskeletal disorders.
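One common building block for posture sensor fusion is a complementary filter: the gyroscope tracks fast tilt changes but drifts, while the accelerometer gives a drift-free but noisy tilt angle, so blending them yields a stable trunk-tilt estimate. A minimal sketch under the assumption that the system carries an IMU (the function and parameter names are hypothetical):

```python
def complementary_filter(angle_prev: float, gyro_rate: float,
                         accel_angle: float, dt: float,
                         alpha: float = 0.98) -> float:
    """Fuse a gyro rate (deg/s, fast but drifting) with an
    accelerometer-derived angle (deg, noisy but drift-free)
    into one tilt estimate."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle

# With zero gyro motion, repeated updates converge toward the
# accelerometer's reading of the true tilt.
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=20.0, dt=0.01)
```

The fused angle (plus any other modalities, such as pressure sensors) would feed the ML posture classifier; thresholding the tilt alone already catches gross slouching.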