Rhythmic Gestures

Introduction

A key usability challenge for touchless gesture systems is inferring the user's intent to interact. Mid-air gesture sensing is ‘always on’, so it often detects movement from people who have no intention of interacting with the touchless user interface. Incorrectly treating this movement as input causes false-positive gesture recognition, which can trigger disruptive, unintended actions.

Rhythmic gestures are touchless gestures that users repeat in a rhythmic manner, as a means of showing their intent to interact. This helps reduce false-positive gesture recognition, because people are unlikely to repeat the exact gesture, in time with the input rhythm, by accident. Rhythmic gestures are a novel form of touchless input based on spatial and temporal coincidence, rather than spatial information alone.

As an example, a rhythmic gesture could require users to wave their hand from side to side in time with a repeating stimulus.
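To make this concrete, here is a minimal sketch (in Python, using NumPy) of how such a detector might work. The function name, thresholds, and sampling rate are illustrative assumptions, not the recognisers from the papers below: the idea is to accept input only when the hand's direction reversals stay in time with the stimulus.

    import numpy as np

    def detect_rhythmic_wave(x_positions, timestamps, period,
                             n_required=4, tol=0.15):
        # Hypothetical detector: accept intent only if the turning points
        # of a side-to-side wave stay in time with a repeating stimulus
        # of the given period (in seconds).
        x = np.asarray(x_positions, dtype=float)
        t = np.asarray(timestamps, dtype=float)

        # Direction reversals are where the sign of the velocity flips.
        sign = np.sign(np.diff(x))
        flips = np.where(np.diff(sign) != 0)[0] + 1
        reversal_times = t[flips]

        # One full left-right cycle per stimulus period, so reversals
        # should arrive roughly every half period.
        expected = period / 2.0
        intervals = np.diff(reversal_times)
        in_time = np.abs(intervals - expected) <= tol * expected

        # Require several consecutive in-time reversals before accepting,
        # so that incidental movement does not trigger input.
        run = 0
        for ok in in_time:
            run = run + 1 if ok else 0
            if run >= n_required:
                return True
        return False

    # Simulated check: a hand oscillating at the stimulus rate is accepted.
    t = np.arange(0, 5, 1 / 60)                    # 60 Hz tracker samples
    x = np.sin(2 * np.pi * t)                      # 1 Hz side-to-side wave
    print(detect_rhythmic_wave(x, t, period=1.0))  # -> True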

I first investigated rhythmic gestures during my PhD, as a way of letting users direct their input towards a particular device. The user studies in our CHI 2016 paper [1] evaluated different spatial patterns (e.g., linear vs. circular trajectories) and rhythm tempos, with promising results. Around the same time, work on motion correlation (inspired by gaze-based orbit interactions) began to appear, showing the potential of temporal coincidence for touchless gesture input.
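Motion correlation techniques select a target by correlating the user's movement with each moving target's on-screen trajectory over a sliding window. As a rough illustration of that idea (again a sketch; the per-axis Pearson correlation and the 0.8 threshold are assumptions for the example, not parameters from the cited work):

    import numpy as np

    def correlate_with_targets(hand_xy, target_trajs, threshold=0.8):
        # Hypothetical motion-correlation matcher: return the id of the
        # target whose trajectory best matches the hand's movement over
        # the same window, or None if nothing correlates strongly enough.
        # Assumes targets move on both axes (e.g., circular orbits), so
        # the per-axis correlations are well defined.
        hand = np.asarray(hand_xy, dtype=float)
        best_id, best_score = None, threshold
        for target_id, traj in target_trajs.items():
            traj = np.asarray(traj, dtype=float)
            score = np.mean([np.corrcoef(hand[:, a], traj[:, a])[0, 1]
                             for a in (0, 1)])
            if score > best_score:
                best_id, best_score = target_id, score
        return best_id

    # A hand following one of two counter-rotating orbits matches it.
    t = np.linspace(0, 2 * np.pi, 120)
    orbit = np.stack([np.cos(t), np.sin(t)], axis=1)
    hand = orbit + np.random.normal(scale=0.05, size=orbit.shape)
    print(correlate_with_targets(hand, {"cw": orbit, "ccw": orbit[::-1]}))
    # -> cw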

In later work at ICMI 2017 [2], we introduced rhythmic micro-gestures, applying the same interaction principles to micro-movements of the hand and wrist rather than the larger hand and arm movements used in the CHI 2016 paper. The goal was to pair spatio-temporal coincidence with more subtle hand movements, enabling more discreet and socially acceptable input.


References

[1] Do That, There: An Interaction Technique for Addressing In-Air Gesture Systems
E. Freeman, S. Brewster, and V. Lantz.
In Proceedings of the 34th Annual ACM Conference on Human Factors in Computing Systems – CHI ’16, 2319-2331. 2016.

[2] Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
E. Freeman, G. Griffiths, and S. Brewster.
In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17, 115-119. 2017.

ICMI ’17 Paper & ISS ’17 Demo

I’ve had a paper accepted at ACM ICMI 2017, titled “Rhythmic Micro-Gestures: Discreet Interaction On-the-Go” [1]. The paper presents rhythmic micro-gestures, a new technique for interacting with mobile devices. It combines rhythmic gestures, the input technique from my CHI 2016 paper, with micro-gestures: small hand movements that can be performed discreetly. I’ll be giving a talk about the paper at the conference in November, in Glasgow.

We’ve also had a demo from the Levitate project accepted by ACM ISS 2017 [2]. The demo gives attendees the chance to interact with mid-air objects suspended by acoustic levitation.

[1] Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
E. Freeman, G. Griffiths, and S. Brewster.
In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17, 115-119. 2017.

[2] Floating Widgets: Interaction with Acoustically-Levitated Widgets
E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, 417-420. 2017.