Emotion-Aware Tactile Feedback for Deaf Users in Live Acoustic Environments
Biomedical Engineering
Prajwal Mohan
Abstract
Individuals with profound hearing impairment often face difficulty experiencing the emotional richness of live acoustic environments such as concerts. Although assistive technologies improve communication, they frequently fail to reproduce the subtle emotional characteristics of sound that are essential for a deeply immersive experience. Current haptic feedback devices typically convert sound into vibration through a unidirectional transformation of acoustic parameters, ignoring the user's real-time emotional state. This limitation creates a static, impersonal experience of sound, severing the dynamic relationship between the stimulus and the listener's individual emotional response. This project addresses this gap by developing and evaluating a closed-loop biofeedback mechanism in a wearable haptic system. By integrating a Muse EEG headband with the Music: Not Impossible (MINI) haptic vest, we developed a real-time feedback system that modulates vibrotactile intensity based on EEG-derived emotional valence. The system uses a custom Python and Max/MSP pipeline that extracts specific neurophysiological features from the Muse headband to compute an adaptive gain. Functional verification indicates that the system operates in a closed loop, dynamically adapting tactile feedback within safe, smoothed gain limits. Crucially, the system preserves the rhythmic integrity of the audio while modulating intensity according to the listener's affect: boosting intensity during states of negative valence (lower frontal alpha asymmetry, FAA) and softening it during positive states. This bioadaptive approach demonstrates that haptic feedback can be transformed from a passive, one-way channel into a dynamic, personalized feedback loop, aiming to significantly enhance emotional engagement and immersion for deaf users in live musical settings.
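The adaptive-gain behavior described above can be sketched in Python. This is a minimal illustration, not the project's actual pipeline: the gain limits, sensitivity, and smoothing coefficient below are hypothetical placeholders, and the FAA formula shown is the common log-power asymmetry definition, assumed rather than taken from the source.

```python
import math

# Hypothetical parameters -- the real pipeline's values are not given in the abstract.
GAIN_MIN, GAIN_MAX = 0.5, 1.5   # safe vibrotactile gain limits
SENSITIVITY = 2.0               # how strongly FAA shifts the gain
SMOOTHING = 0.1                 # EMA coefficient for gradual gain changes

def frontal_alpha_asymmetry(alpha_left: float, alpha_right: float) -> float:
    """FAA = ln(right alpha power) - ln(left alpha power).
    Lower FAA is commonly associated with more negative valence."""
    return math.log(alpha_right) - math.log(alpha_left)

def target_gain(faa: float) -> float:
    """Boost intensity for negative valence (low FAA), soften for positive,
    clamped to the safe gain range."""
    raw = 1.0 - SENSITIVITY * faa
    return max(GAIN_MIN, min(GAIN_MAX, raw))

def smooth_gain(prev_gain: float, faa: float) -> float:
    """Exponential moving average so the tactile gain adapts smoothly
    rather than jumping with each EEG sample."""
    return (1 - SMOOTHING) * prev_gain + SMOOTHING * target_gain(faa)
```

On each EEG update, the pipeline would compute FAA from the band powers, derive a clamped target gain, smooth it against the previous value, and scale the vest's vibrotactile amplitude by the result, leaving the audio's rhythmic structure untouched.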