This project was approved and published on the official James Dyson Award website: https://www.jamesdysonaward.org/2025/project/chatterpalm

ChatterPalm is a wearable communication aid designed for individuals with speech or hearing impairments. Flex sensors embedded in a glove detect hand gestures, which an Arduino Uno interprets and translates into meaningful phrases such as:
- “I want food”
- “I want water”
- “I want to use the washroom”
- “I'm sleepy”
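The gesture-to-phrase mapping can be sketched in host-side Python. This is only an illustration: the threshold value and the assumption that each finger's flex sensor selects one phrase are placeholders, not the project's actual calibration.

```python
FLEX_THRESHOLD = 600  # assumed 10-bit ADC reading that counts as a "bent" finger

# One phrase per finger (index, middle, ring, little) -- an assumed layout.
PHRASES = [
    "I want food",
    "I want water",
    "I want to use the washroom",
    "I'm sleepy",
]

def classify(readings):
    """Return the phrase for the first bent finger, or None if no finger is bent."""
    for reading, phrase in zip(readings, PHRASES):
        if reading > FLEX_THRESHOLD:
            return phrase
    return None
```

On the real device the same comparison runs on the Arduino against `analogRead()` values from each flex-sensor voltage divider.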
The translated phrases are then:
- Displayed on an OLED screen for visual feedback
- Vocalized using a text-to-speech engine on a connected computer (via Python and pyttsx3)

## Tech Stack
- Hardware: Arduino Uno, flex sensors, resistors, OLED display (SH1106), wearable glove
- Software: Arduino IDE (C++), Python (serial + TTS)
- Libraries: SH1106.h, pyttsx3, serial
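A minimal sketch of the host-side serial-to-speech pipeline, assuming the Arduino prints one phrase per line at 9600 baud; the port name `COM3` is a placeholder you would replace for your machine:

```python
def decode_line(raw: bytes) -> str:
    """Decode one serial line from the Arduino into a clean phrase."""
    return raw.decode("utf-8", errors="ignore").strip()

def speak_forever(port: str = "COM3", baud: int = 9600) -> None:
    """Read phrases over serial and vocalize them with pyttsx3."""
    # Imports are deferred so the helper above stays usable even when
    # pyserial and pyttsx3 are not installed.
    import serial  # pyserial
    import pyttsx3

    engine = pyttsx3.init()
    with serial.Serial(port, baud, timeout=1) as link:
        while True:
            phrase = decode_line(link.readline())
            if phrase:
                engine.say(phrase)
                engine.runAndWait()

if __name__ == "__main__":
    speak_forever()
```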
## Goal

To bridge the communication gap for non-verbal users through an intuitive, real-time, low-cost gesture-translation system.
The system can also be especially helpful for patients affected by oral cancer, stroke, or paralysis.