Ultrasound-Based Bi-Manual Gesture Detection Using MVDR Beamforming and Machine Learning
gesture recognition, human-computer interaction, HCI, MVDR, beamforming, machine learning, bi-manual, two-handed, ultrasound, matched filter, linear frequency-modulated chirp, MEMS microphone, LRCN, ConvLSTM, gesture, gesture detection
This thesis presents the development and validation of an ultrasound-based bi-manual gesture detection system using beamforming and machine learning. Hardware consisting of an array of eight MEMS microphones and an ultrasonic transducer was designed and built. Software was developed to convert the hardware output into depth images of the user's gesturing hands. Depth images collected from multiple users were used to train a neural network, enabling classification of both one-handed and bi-manual gestures. Average gesture classification accuracy was 93\% when trained with a single user and 82\% when trained with multiple users.