Gestures

Hand gestures. Photo by Charles Haynes: CC BY-SA.

I want to make gesture interaction – interacting with computers through hand movements in mid-air – easier and more enjoyable to use. My PhD research focuses on improving gesture interaction with small devices like phones and wearable computers, although the problems I deal with are not unique to these types of device. I’ve written a lot about gestures and problems with gesture interaction, so this page attempts to bring that information together to give an overview of why gestures are difficult and how we might make them better.

Addressing Gesture Systems

When users want to interact with an in-air gesture system, they must first address it. This involves finding where to perform gestures, so that they can be sensed, and finding out how to direct input towards only the system they want to interact with, so that other systems do not act on their movements as well. During my PhD, I developed and evaluated interaction techniques for addressing in-air gesture systems. You can read more about this here.
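As a rough illustration (my own sketch, not a technique from the thesis), addressing can be thought of as gating input on two conditions: the hand must be within the system's sensing region, and the user must have recently directed an attention gesture at that particular system. The sensing range, timeout and class names below are assumptions made for the example.

```python
# Hypothetical sketch: gate gesture input on proximity plus a recent
# "attention" gesture, so movements only affect the addressed system.

import time

SENSING_RANGE_M = 0.35        # assumed sensing range of the device, in metres
ADDRESS_TIMEOUT_S = 5.0       # assumed time an address gesture stays valid

class GestureSystem:
    def __init__(self, name):
        self.name = name
        self.addressed_at = None  # time of the last attention gesture

    def on_attention_gesture(self):
        """Called when the user performs this system's address gesture."""
        self.addressed_at = time.monotonic()

    def is_addressed(self):
        return (self.addressed_at is not None and
                time.monotonic() - self.addressed_at < ADDRESS_TIMEOUT_S)

    def handle_hand_update(self, distance_m, movement):
        # Ignore movement that is out of range or not directed at this system,
        # so that other nearby gesture systems are free to act on it instead.
        if distance_m > SENSING_RANGE_M or not self.is_addressed():
            return
        print(f"{self.name}: acting on gesture {movement!r}")

phone = GestureSystem("phone")
phone.handle_hand_update(0.2, "swipe left")   # ignored: phone not yet addressed
phone.on_attention_gesture()
phone.handle_hand_update(0.2, "swipe left")   # now handled by the phone
```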

Above- and Around-Device Interaction

My research often looks at gestures performed in close proximity to devices, either above them (for example, gesturing over a phone lying on a table) or around them (for example, gesturing behind a device held in your other hand). I give an introduction to around-device interaction here, present research and guidelines for above-device interaction with phones here, and discuss our work on above-device tactile feedback here. I also explain why we would want to use these types of gestures here.

Gestures Are Not “Natural”

In this post I outline three problems with gesture interaction (the Midas Touch problem, the address problem and the sensing problem) and what implications these have for gesture interface design. In short, we should not think of gestures as being “natural”, because there are many practical issues we must overcome to make them usable.
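As a simple illustration of the Midas Touch problem (my own sketch, not a technique from the post), a recogniser can ignore hand movement unless an explicit “clutch” gesture is engaged, so that incidental movement in front of the sensor is not misinterpreted as a command. The clutch mechanism and class names below are assumptions made for the example.

```python
# Hypothetical sketch: only treat hand movement as input while an explicit
# clutch (e.g. a pinch or a spoken keyword) is engaged.

class ClutchedGestureRecogniser:
    def __init__(self):
        self.engaged = False

    def on_clutch(self, engaged):
        """Engage or release the clutch gesture."""
        self.engaged = engaged

    def on_movement(self, movement):
        if not self.engaged:
            return None        # Midas Touch avoided: incidental movement ignored
        return f"command: {movement}"

recogniser = ClutchedGestureRecogniser()
print(recogniser.on_movement("wave"))    # None -- incidental movement
recogniser.on_clutch(True)
print(recogniser.on_movement("wave"))    # command: wave
```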

Novel Gesture Feedback

My PhD research looks at how we can move feedback about gestures off the screen and into the space around devices instead. I've written about tactile feedback for gestures here. I've also written here about interactive light feedback, a novel type of display for gestures.
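To give a flavour of what off-screen feedback might look like (an assumed mapping for illustration, not the design from my thesis), the sketch below maps the height of a hand above a device to the brightness of a light display around it, so users can tell they are being sensed without looking at the screen.

```python
# Hypothetical sketch: hand height above the device drives the brightness of
# an LED ring around it -- brighter as the hand gets closer, off when unsensed.

def led_brightness(hand_height_m, max_height_m=0.35):
    """Return LED brightness in [0, 1] for a hand at the given height."""
    if hand_height_m is None or hand_height_m > max_height_m:
        return 0.0                      # hand not sensed: lights off
    return 1.0 - (hand_height_m / max_height_m)

for height in (None, 0.40, 0.30, 0.15, 0.05):
    print(height, "->", round(led_brightness(height), 2))
```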

Gestures In Multimodal Interaction

Here I talk about two papers from 2014 which consider gestures as part of multimodal interactions. Although this idea was demonstrated as early as the 1980s (most notably by Bolt's “Put-That-There”), it still hasn't reached mainstream computing. Perhaps this is about to change with new technologies.