Hands Free

Mobile gesture recognition

Building on our previous exploratory project, OnTheGo, we implemented Hands Free, an in-air gesture recognition system.

PROJECT DETAILS

About This Project

Hands Free uses a single RGB camera on an unmodified smartphone as its only sensor. A convolutional neural network classifier, combined with a novel pre-processing technique for color-invariant recognition, classifies hand shapes. Using this method, we built three demo applications: a message reader that lets the user trigger text-to-speech reading of incoming messages; a running timer that lets the user operate a stopwatch with gestures alone; and a music player that lets the user control music playback with gestures. In each of these use cases we found that the system performed equally well indoors and outdoors and was robust to the noise introduced by a moving camera and hand.
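The project's actual pre-processing method is its own novel contribution and is not detailed here, so the sketch below substitutes a simple stand-in (grayscale conversion with per-frame contrast normalization) in front of a small CNN, just to illustrate the overall pipeline of pre-processed frame in, hand-shape class out. The class name, layer sizes, and five-gesture count are illustrative assumptions, not the published design.

```python
# Minimal sketch of a color-invariant hand-shape classifier (PyTorch).
# The pre-processing here is a hypothetical stand-in, NOT the
# project's published technique.
import torch
import torch.nn as nn

class HandShapeCNN(nn.Module):
    """Small CNN mapping a pre-processed camera frame to one of
    `num_gestures` hand-shape classes (sizes are assumptions)."""
    def __init__(self, num_gestures: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            # Single-channel input: color is removed by pre-processing.
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_gestures)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

def preprocess(frame_rgb: torch.Tensor) -> torch.Tensor:
    """Stand-in color-invariant step: collapse RGB to luminance and
    normalize per frame so skin tone and lighting shifts matter less.
    (An assumption for illustration, not the published method.)"""
    gray = frame_rgb.mean(dim=0, keepdim=True)         # (1, H, W)
    gray = (gray - gray.mean()) / (gray.std() + 1e-6)  # per-frame normalization
    return gray.unsqueeze(0)                           # add batch dimension

# Usage: classify a single 128x128 RGB frame (dummy data here).
model = HandShapeCNN(num_gestures=5)
frame = torch.rand(3, 128, 128)
logits = model(preprocess(frame))
gesture_id = logits.argmax(dim=1).item()
```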

Project Data

Misha Sra and Matthew Chang. MEng Thesis.
Keywords: mobile, hand gesture recognition, CNN


