OnTheGo

Gesture watch

Gestural input on a wrist-worn smartwatch.

PROJECT DETAILS



About This Project

As mobile devices like smartwatches and head-mounted displays such as Google Glass continue to shrink, touch-based interaction with their screens becomes harder. With OnTheGo, our goal is to complement touch- and voice-based input on these devices by enabling interaction through in-air gestures around the device. Gestural interaction is not only intuitive in situations where touch can be cumbersome, such as running, skiing, or cooking, but is also convenient for quick application and task management, certain types of navigation and interaction, and simple inputs to applications.

Project Data

People: Misha Sra and Matthew Chang
Keywords: gesture input, smartwatch
