Gestures

Hand gestures. Photo by Charles Haynes: CC BY-SA.

I want to make gesture interaction (interacting with computers through hand movements in mid-air) easier and more enjoyable to use. My PhD research focused on helping users address gesture systems, especially when those systems have only limited means of providing feedback. I’ve written a lot about gestures and the problems with gesture interaction, so this page brings that information together to give an overview of why gestures are difficult and how we might make them better.

Addressing Gesture Systems

When users want to interact with an in-air gesture system, they must first address it. This involves finding where to perform gestures, so that they can be sensed, and finding out how to direct input towards the system they want to interact with. During my PhD, I developed and evaluated interaction techniques for addressing in-air gesture systems. You can read more about this here. A related challenge is finding where to put your hands for mid-air interfaces, especially when mid-air haptics are used. I developed a system called HaptiGlow that helped users find a good hand position for mid-air input.
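To make the idea of addressing a little more concrete, here is a small Python sketch of the kind of logic such a system might use: it checks how far a tracked hand is from the sensor’s “sweet spot” and maps that distance to a simple feedback level. This is only an illustration with hypothetical names and thresholds, not the actual HaptiGlow implementation.

```python
# Illustrative sketch: guiding a hand toward a sensor's "sweet spot".
# All names, positions and thresholds are hypothetical, not from HaptiGlow.

from dataclasses import dataclass
import math


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def distance_to(self, other: "Vec3") -> float:
        return math.dist((self.x, self.y, self.z), (other.x, other.y, other.z))


# Centre of the region where the sensor tracks hands reliably (metres).
SWEET_SPOT = Vec3(0.0, 0.2, 0.0)
GOOD_RADIUS = 0.05      # close enough for reliable input
SENSED_RADIUS = 0.25    # still sensed, but the user should move closer


def feedback_for_hand(hand: Vec3) -> str:
    """Map the hand's distance from the sweet spot to a feedback level."""
    d = hand.distance_to(SWEET_SPOT)
    if d <= GOOD_RADIUS:
        return "good position: ready for input"
    if d <= SENSED_RADIUS:
        return "sensed: move closer to the sweet spot"
    return "not sensed: bring your hand above the device"


if __name__ == "__main__":
    print(feedback_for_hand(Vec3(0.02, 0.21, 0.0)))   # good position
    print(feedback_for_hand(Vec3(0.15, 0.30, 0.05)))  # sensed, move closer
```

In a real system the feedback levels would be conveyed through whatever output is available (for example, light, sound or mid-air haptics) rather than printed text.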

Above- and Around-Device Interaction

My research often looks at gestures performed close to small devices, either above them (for example, gesturing over a phone lying on a table) or around them (for example, gesturing behind a device held in your other hand). I give an introduction to around-device interaction here, present some research and guidelines for above-device interaction with phones here, and discuss our work on above-device tactile feedback here. I also explain why we would want to use these types of gestures here.

Gestures Are Not “Natural”

In this post I outline three gesture interaction problems (the Midas Touch problem, the address problem and the sensing problem) and what implications these have for gesture interaction. In short, we should not think of gestures as being “natural” because there are many practical issues we must overcome to make them usable.
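As a concrete example of why this matters, a sensor on its own cannot tell casual hand movement from intended input (the Midas Touch problem). The sketch below shows one common way of working around it: a “clutch”, where a deliberate pose must be held briefly before movement counts as input. It is a hypothetical illustration of that general idea, not a technique taken from the post itself.

```python
# Illustrative sketch of one way to mitigate the Midas Touch problem:
# require a deliberate activation pose (an open palm held still for a
# moment) before hand movement is treated as input. Hypothetical example.

import time

DWELL_SECONDS = 0.5  # how long the activation pose must be held


class ClutchedGestureInput:
    """Ignores hand movement until the user has 'clutched in'."""

    def __init__(self):
        self.pose_started_at = None
        self.active = False

    def update(self, open_palm: bool, moving: bool) -> bool:
        """Return True if this movement should be treated as intentional input."""
        now = time.monotonic()
        if not self.active:
            if open_palm and not moving:
                # Start or continue timing the activation pose.
                if self.pose_started_at is None:
                    self.pose_started_at = now
                elif now - self.pose_started_at >= DWELL_SECONDS:
                    self.active = True
            else:
                self.pose_started_at = None
            return False
        # Once active, movement counts as input; dropping the pose ends it.
        if not open_palm:
            self.active = False
            self.pose_started_at = None
            return False
        return moving
```

Even this simple workaround shows the trade-off: the clutch prevents accidental input, but it also adds an extra step that makes the interaction feel less “natural”.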

Novel Gesture Feedback

My PhD research also looked at how we can move feedback about gestures off the screen and into the space around devices instead. I’ve written about tactile feedback for gestures here. I’ve also written about interactive light feedback, a novel type of display, for gestures here.

Gestures In Multimodal Interaction

Here I talk about two papers from 2014 in which gestures are considered as part of multimodal interaction. While this idea was notably demonstrated as early as the 1980s (for example, in Bolt’s “Put-That-There”), it still hasn’t reached mainstream computing. Perhaps this is about to change with new technologies.