Ultrasound-Based Bi-Manual Gesture Detection using MVDR Beamforming and Machine Learning

Authors

Radford, Nick

Type

thesis

Language

eng

Keyword

gesture recognition, human-computer interaction, HCI, MVDR, beamforming, machine learning, bi-manual, two-handed, ultrasound, matched filter, linear frequency-modulated chirp, MEMS microphone, LRCN, convLSTM, gesture, gesture detection

Abstract

This thesis presents the development and validation of an ultrasound-based bi-manual gesture detection system using beamforming and machine learning. Hardware consisting of an array of eight MEMS microphones and an ultrasonic transducer was designed and built. Software was developed to convert the hardware output into depth images of the user's gesturing hands. Depth images from multiple users were used to train a neural network, enabling classification of both one-handed and bi-manual gestures. Average gesture classification accuracy was 93% when trained with a single user and 82% when trained with multiple users.
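The abstract names MVDR beamforming as the technique used to image the gesturing hands from the eight-microphone array. As a rough illustration only, the Python sketch below computes classic MVDR (Capon) weights, w = R^{-1} a / (a^H R^{-1} a), for a uniform linear array and applies them to matched-filtered array snapshots. The element spacing, 40 kHz carrier, diagonal loading value, and all function names are illustrative assumptions, not values or code taken from the thesis.

    import numpy as np

    def steering_vector(theta, n_mics=8, spacing=0.005, freq=40e3, c=343.0):
        # Narrowband steering vector for a uniform linear array.
        # Spacing, carrier frequency, and speed of sound are illustrative assumptions.
        positions = np.arange(n_mics) * spacing
        delays = positions * np.sin(theta) / c
        return np.exp(-2j * np.pi * freq * delays)

    def mvdr_weights(snapshots, theta, diag_load=1e-3):
        # MVDR (Capon) weights: w = R^-1 a / (a^H R^-1 a).
        # snapshots: complex array of shape (n_mics, n_samples), e.g. matched-filtered echoes.
        n_mics, n_samples = snapshots.shape
        R = snapshots @ snapshots.conj().T / n_samples              # sample spatial covariance
        R += diag_load * np.trace(R).real / n_mics * np.eye(n_mics) # diagonal loading for stability
        a = steering_vector(theta, n_mics)
        Rinv_a = np.linalg.solve(R, a)
        return Rinv_a / (a.conj() @ Rinv_a)

    # Toy usage: steer an 8-channel recording toward 15 degrees.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((8, 1024)) + 1j * rng.standard_normal((8, 1024))
    w = mvdr_weights(x, np.deg2rad(15.0))
    y = w.conj() @ x  # beamformed output, y[t] = w^H x[t]

Sweeping the look angle and combining the beamformed echo delays across directions is one way such a system could assemble the per-frame depth images that are then fed to the LRCN/convLSTM classifier; the exact pipeline is described in the thesis itself.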

License

Queen's University's Thesis/Dissertation Non-Exclusive License for Deposit to QSpace and Library and Archives Canada
ProQuest PhD and Master's Theses International Dissemination Agreement
Intellectual Property Guidelines at Queen's University
Copying and Preserving Your Thesis
This publication is made available by the authority of the copyright owner solely for the purpose of private study and research and may not be copied or reproduced except as permitted by the copyright laws without written authority from the copyright owner.
Attribution-NonCommercial-ShareAlike 3.0 United States