Mobile HCI ’14: Why would I use around-device gestures?

Toronto is a fantastic city, which has made this conference so enjoyable.

At the Mobile HCI poster session I had some great discussions with some great people. There’s been a lot of around-device interaction research presented at the conference this week, and many of the people I spoke to while presenting my poster asked the same question: why would I want to do this?

That’s a very important question, and the reasons it gets asked can give some insight into when around-device gestures may and may not be useful. A lot of people said that if they were already holding their phone, they would just use the touchscreen to provide input. Others said they would raise the device to their mouth for speech input, or would even gesture with the device itself (e.g. by shaking it).

In our poster and its accompanying paper, we focused on above-device gestures – a particular area of the around-device space, directly over the device – as we think this is where users are most likely to benefit from gesturing. People typically keep their phones on flat surfaces: Pohl et al. found this in their around-device device paper [link], Wiese et al. [link] found it in their CHI ’13 study, and Dey et al. [link] reported the same three years ago. As such, when people do gesture at their phones, those gestures are most likely to happen over a device resting on a surface.

Enjoying some local pilsner to wrap up the conference!

So, why would we want to gesture over our phones? My favourite example, and one which really seems to resonate with people, is using gestures to read recipes while cooking in the kitchen. Wet and messy hands, the risks of food contamination, the need for multitasking – these are all inherent parts of preparing food which can motivate using gestures to interact with mobile devices. Gestures would let me move through recipes on my phone while cooking, without having to first wash my hands. Gestures would let me answer calls while I multitask in the kitchen, without having to stop what I’m doing. Gestures would let me dismiss interruptions while I wash the dishes afterwards, without having to dry my hands.

This is just one scenario where we envisage above-device gestures being useful. Gestures are attractive for a variety of reasons in this context: touch input is inconvenient (I need to wash my hands first); touch input requires more engagement (I need to stop what I’m doing to focus); and touch input is unavailable (I need to dry my hands). I think the answer to why we would want to use these gestures is that they let us interact when other input is inconvenient. Our phones are nearby on surfaces so let’s interact with them while they’re there.

In summary, our work focuses on gestures above the device as this is where we see them being most commonly used. There are many reasons people would want to use around-device gestures but we think the most compelling ones motivate using above-device gestures.

Mobile HCI ’14: “Are you comfortable doing that?”

OCAD University, one of the Mobile HCI ’14 hosts, has some fantastic architecture on campus.

One of my favourite talks from the third day of Mobile HCI ’14 was Ahlstrom et al.’s paper on the social acceptability of around-device gestures [link]. In short: they asked users if they were comfortable doing around-device gestures. I think this is a timely topic because we’re now seeing around-device interfaces added to commercial smartphones: Samsung’s Galaxy S4 had hover gestures over the display and Google’s Project Tango added depth sensors to the smartphone form factor. Now that we’ve established ways of detecting around-device gestures, it’s time to look at what those gestures should be and whether users are willing to use them.

In Ahlstrom et al.’s paper, which was presented excellently by Pourang Irani, three studies looked at different aspects of the social acceptability of around-device gestures. These focused mainly on gesture mechanics: gesture size, gesture duration, position relative to the device, and distance from it. When asked if they were comfortable doing gestures, users were happiest gesturing near the device (biased towards the side of their dominant hand) and found shorter interactions more acceptable.

They also looked at how spectators perceived these gestures, by opportunistically asking onlookers what they thought of someone using gestures nearby. What surprised me was that spectators found around-device gestures acceptable in a wider variety of social situations than the users from the first studies did. Does seeing other people perform gestures make that type of input seem more acceptable?

Tonight I presented my poster [paper link] on our design studies for above-device gesture design. There were some similarities between our work and Ahlstrom’s; purely by coincidence, we both asked users if they were comfortable and willing to use certain gestures. However, we focused on what the gestures were, whereas they focused on other aspects of gesturing (e.g. gesture duration).

In our poster and paper we present design recommendations for creating around-device interactions which users find more usable and more acceptable. I think the next big step for around-device research is looking at how to map potential gestures to actions and identifying ways of making around-device input better. My PhD research focuses on the output side of things, looking at how we can design feedback to help users as they gesture in the space near devices. If you saw my poster tonight or had a chat with me, thanks for stopping by – there’s more about the research here.

Mobile HCI ’14: Using Ordinary Surfaces for Interaction

Mobile HCI ’14 day one: a wee bit of Toronto and Henning Pohl’s idea of around-device devices.

Today was the first day of the papers program at Mobile HCI ’14 and amongst the great talks was one I particularly liked on the idea of “around-device devices” by Pohl et al. [link]. I’ve written before about around-device interaction, above-device interaction, and how the space around mobile devices can be used for gesturing. What’s novel about around-device devices, however, is that input in the around-device space is no longer limited to free-hand gestures relative to the device: instead, nearby objects can become potential inputs in the user interface. One of the motivations for using nearby objects is that mobile devices are very commonly kept on surfaces – tables, desks, kitchen worktops – which are also used for storing objects. In the title of this post I call these ordinary surfaces, to distinguish the idea from interactive surfaces.

The example Henning Pohl gives in the paper title is “my coffee mug is a volume dial”. I think this example captures the idea of around-device devices well: mugs, being cylindrical objects, afford certain interactions – in this case, being turned. There’s implicit physical feedback from interacting with a tangible object, which could make interaction easier. Also, using nearby objects provides many of the benefits of around-device gestures: a larger interaction space, unoccluded content on the device screen, potential for more expressive input, and so on.
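As a rough illustration of how such a mapping might work – my own sketch, not anything from Pohl et al.’s paper, with an invented sensitivity value – a tracked mug orientation could drive volume through a simple relative mapping:

```python
def mug_volume_step(prev_angle_deg, new_angle_deg, step_per_degree=0.3):
    """Turn a change in tracked mug orientation into a relative volume
    change (in percent). The wrap-around handling stops a 350 -> 10 degree
    reading from being treated as a huge turn the other way.
    step_per_degree is a guessed sensitivity, not a value from the paper.
    """
    delta = (new_angle_deg - prev_angle_deg + 180) % 360 - 180
    return delta * step_per_degree

print(mug_volume_step(350, 10))  # a 20 degree turn -> +6.0
```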

Exploring Toronto during today’s lunch break. Looking out across downtown from the Blue Jays’ stadium.

Another interesting paper from today was about Toffee, by Xiao et al. [link]. Sticking with the around-device interaction theme, they looked at whether piezo actuators could be used to localise taps and knocks on surrounding table surfaces. Like around-device devices, this is another way of making use of nearby ordinary surfaces for input. They found that taps could be localised most reliably when given with harder points, like touch styluses or knuckles; softer points, like fingertips, were more difficult to localise. Due to the characteristics of the tap localisation approach, Toffee would be best suited to radial input around devices.
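To make the localisation idea concrete, here’s a toy sketch of estimating a tap’s bearing from arrival-time differences at several sensors. To be clear, this is not Xiao et al.’s implementation – their sensing approach is described in the paper – just a minimal far-field time-difference-of-arrival example, with an assumed sensor layout and an assumed surface-wave speed:

```python
import numpy as np

# Hypothetical layout: four piezo pickups at a phone's corners (metres).
SENSORS = np.array([[-0.03, -0.06], [0.03, -0.06], [-0.03, 0.06], [0.03, 0.06]])
SPEED = 1000.0  # assumed wave speed in the table; depends on the material


def bearing_from_delays(delays):
    """Least-squares bearing from arrival delays relative to sensor 0.

    Far from the device, delay_i ~= -((p_i - p_0) . u) / SPEED, where u is
    the unit direction towards the tap; solve for u and take its angle.
    """
    baselines = SENSORS[1:] - SENSORS[0]
    u, *_ = np.linalg.lstsq(baselines, -SPEED * np.asarray(delays), rcond=None)
    return np.degrees(np.arctan2(u[1], u[0]))


# Simulate a tap 40 cm away at 25 degrees and recover its bearing.
tap = 0.4 * np.array([np.cos(np.radians(25)), np.sin(np.radians(25))])
dists = np.linalg.norm(SENSORS - tap, axis=1)
print(f"{bearing_from_delays((dists[1:] - dists[0]) / SPEED):.1f} degrees")
```

This recovers direction rather than distance, which is exactly why a radial interaction style suits this kind of sensing.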

I like both of these papers because they push the around-device interaction space a little beyond mid-air free-hand gestures, in both cases using ordinary surfaces as part of the interaction. I know this has been done before with interfaces like SideSight and Qian Qin’s Dynamic Ambient Lighting for Mobile Devices, but I think it’s important that others are exploring this space further.

Mobile HCI ’14 Poster

This time next week I’ll be boarding a plane to fly to Toronto for Mobile HCI! I’ll be presenting a poster there on above-device gesture design and I’m also participating in the doctoral consortium. I’ve set up a page to accompany my poster and demonstrate our above-device gestures: see here. My poster is also finished, printed and ready to go!

Mobile HCI ’14 Poster

Above-Device Gestures

Contents

What is Above-Device Interaction?
Our User-Designed Gestures
Design Recommendations
Mobile HCI ’14 Poster

What is Above-Device Interaction?

Gesture interfaces let users interact with technology using hand movements and poses. Unlike touch input, gestures can be performed away from devices, in the larger space around them. This allows users to provide input without reaching out to touch a device or picking it up. We call this type of input above-device interaction, as users gesture over devices which are placed on a flat surface, like a desk or table. Above-device gestures may be useful when users are unable to touch a device (when their hands are messy, for example) or when touching a device would be less convenient (when wanting to interact quickly from a distance, for example).

Our research focuses on above-device interaction with mobile devices, such as phones. Most research in this area has focused on sensing gesture interactions. Little is known about how to design above-device gestures which are usable and acceptable to users, which is where our research comes in. We ran two studies to look at above-device gesture design further: we gathered gesture ideas from users in a guessability study and then ran an online survey to evaluate some of these gestures further. You can view this survey here.

The outcomes of these studies are a set of evaluated above-device gestures and design recommendations for designing good above-device interactions. This work was presented at Mobile HCI ’14 as a poster [1].


Our User-Designed Gestures

We selected two gestures for each mobile phone task from our first study. Gestures were selected based on popularity (referred to as agreement in the gesture elicitation literature) and consistency. Rather than select based on agreement alone, we wanted gestures which could be combined with other gestures in a coherent way. Agreement alone is not a good way of selecting gestures: our online evaluation found that some of the most popular gestures were not as socially acceptable as their alternatives.
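For readers unfamiliar with the term, agreement is usually computed as in Wobbrock et al.’s guessability methodology: for each task, group identical gesture proposals and sum the squared proportions. A minimal sketch (the example proposals are invented):

```python
from collections import Counter

def agreement(proposals):
    """Agreement score for one task: sum over groups of identical
    proposals of (group size / total proposals) squared."""
    total = len(proposals)
    return sum((n / total) ** 2 for n in Counter(proposals).values())

# e.g. ten participants proposing gestures for answering a call:
print(agreement(["swipe"] * 6 + ["pick up"] * 3 + ["wave"]))  # 0.46
```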

We now describe our gestures; see our paper [1] for evaluation results. Click on a gesture name to see a video demonstration.

Check Messages

Swipe: User swipes quickly over the device. Can be from left-to-right or from right-to-left.
Draw Rectangle: User extends their finger and traces a rectangle over the device. Imitates the envelope icon used for messages.

Select Item

Finger Count: User selects from numbered targets by extending their fingers.
Point and Tap: User points over the item to be selected then makes a selection by “tapping” with their finger.

Note: We also used these gestures in [2] (see here for more information).

Move Left and Right

Swipe: User swipes over the device to the left or right.
Flick: User holds their hand over the device and flicks their whole hand to the left or right.

Note: We did not look at any specific mapping of gesture direction to navigation behaviour. This seems to be a controversial subject. If a user flicks their hand to the left, should the content move left (i.e. navigate right) or should the viewport move left (i.e. navigate left)?
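A small sketch of the two candidate mappings – nothing here comes from our study, it just makes the ambiguity explicit:

```python
def pages_to_move(flick_direction, content_follows_hand=True):
    """Map a left/right hand flick to list navigation.

    content_follows_hand=True: flicking left drags the content left,
    revealing the next page (touchscreen-style). False: the flick moves
    the viewport instead, so flicking left goes to the previous page.
    """
    step = 1 if flick_direction == "left" else -1
    return step if content_follows_hand else -step
```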

Delete Item

Scrunch: User holds their hand over the device then makes a fist, as though scrunching up a piece of paper.
Draw X: User extends their finger and draws a cross symbol, as though scoring something out.

Place Phone Call

Phone Symbol: User makes a telephone symbol with their hand (like the “hang loose” gesture).
Dial: User extends their finger and draws a circle, as though dialling an old rotary telephone.

Dismiss / Close Item

Brush Away: User gestures over the device as though they were brushing something away.
Wave Hand: User waves back and forth over their device, as though waving goodbye.

Answer Incoming Call

Swipe: As above.
Pick Up: User holds their hand over the device then raises it, as though picking up a telephone.

Ignore Incoming Call

Brush Away: As above.
Wave Hand: As above.

Place Call on Hold

One Moment: User extends their index finger and holds that pose, as though signalling “one moment” to someone.
Lower Hand: User lowers their hand with their fingers fully extended, as though holding something down.

End Current Call

Wave Hand: As above.
Place Down: Opposite of “Pick Up”, described above.

Check Calendar / Query

Thumb Out: User extends their thumb and alternates between thumbs up and thumbs down.
Draw ? Symbol: User extends their finger and traces a question mark symbol over the device.

Accept and Reject

Thumb Up and Down: User makes the “thumb up” or “thumb down” gesture.
Draw Tick and Cross: User extends their finger and draws a tick or a cross symbol over the device.


Design Recommendations

Give non-visual feedback during interaction

Feedback during gestures is important because it shows users that the interface is responding and helps them gesture effectively. However, above-device gestures take place over a phone, where the gesturing hand may occlude the screen, so visual feedback will not always be visible. Instead, other modalities (like audio or tactile feedback [2]) should be used.

Make non-visual feedback distinct from notifications

Some participants suggested that they may be confused if feedback given during gesture interaction resembled the feedback used for other mobile phone notifications. Gesture feedback should therefore be distinct from other notification types. Feedback which responds continuously to movement would also make it clear that the device is responding to the user’s gesture, not to something else.
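One way to make gesture feedback both continuous and distinct from notification sounds is to tie a feedback parameter directly to the sensed hand. A hypothetical sketch – the ranges and mapping below are mine, not from our studies – mapping hand height to the pitch of a feedback tone:

```python
def feedback_pitch(hand_height_mm, min_h=50.0, max_h=300.0,
                   low_hz=220.0, high_hz=880.0):
    """Map sensed hand height to a tone pitch so the feedback tracks the
    user's movement. The exponential mapping gives perceptually even
    steps; all ranges here are illustrative guesses."""
    clamped = min(max(hand_height_mm, min_h), max_h)
    t = (clamped - min_h) / (max_h - min_h)
    return low_hz * (high_hz / low_hz) ** t

for h in (50, 175, 300):
    print(f"{h} mm -> {feedback_pitch(h):.0f} Hz")  # 220, 440, 880 Hz
```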

Emphasise that gestures are directed towards a device

Some participants in our studies were concerned about people thinking they were gesturing at them, rather than at a device. Above-device interactions should emphasise the gesture’s target by using the device as a referent for gestures and by letting users gesture in close proximity to it.

Support flexible gesture mechanics

During our guessability study, some participants gestured with whole hand movements whereas others performed the same gestures with one or two fingers. Gestures also varied in size; for example, some participants swiped over a large area and others swiped with subtle movements over the display only. Above-device interfaces should be flexible, letting users gesture in their preferred way using either hand. Social situation may influence gesture mechanics. For example, users in public places may use more subtle versions of gestures than they would at home.

Enable complex gestures with a simple gating gesture

Our participants proposed a variety of gestures, from basic movements with simple sensing requirements to complex hand poses requiring more sophisticated sensors. Always-on sensing with complex sensors would drain the battery. Instead, sensors with low power consumption (like the proximity sensor) could detect a simple gating gesture which then enables the more sophisticated sensors. Holding a hand over the phone or clicking fingers, for example, could wake a depth camera to track the hand in greater detail.
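A minimal sketch of this gating idea, with placeholder classes standing in for real proximity and depth-sensing APIs (none of these names are real APIs):

```python
import time

class ProximitySensor:
    """Placeholder low-power sensor: is something over the phone?"""
    def covered(self) -> bool: ...

class DepthCamera:
    """Placeholder power-hungry sensor that tracks hand poses."""
    def start(self): ...
    def stop(self): ...
    def read_gesture(self): ...

def gated_gestures(proximity, depth, hold_time=0.5, poll=0.05):
    """Yield gestures, powering the depth camera up only after the gating
    gesture: a hand held over the phone for hold_time seconds."""
    covered_since = None
    while True:
        if proximity.covered():
            covered_since = covered_since or time.time()
            if time.time() - covered_since >= hold_time:
                depth.start()        # expensive sensing, now justified
                try:
                    yield depth.read_gesture()
                finally:
                    depth.stop()     # back to low-power polling
                covered_since = None
        else:
            covered_since = None
        time.sleep(poll)
```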

Use simple gestures for casual interactions

Casual interactions (such as checking for notifications) are low-effort and imprecise so should be easy to perform and sense. Easily sensed gestures lower power requirements for input sensing and allow for variance in performance when gesturing imprecisely. Users may also use these gestures more often when around others so allowing variance lets users gesture discreetly, in an acceptable way.


Mobile HCI ’14 Poster

Mobile HCI ’14 Poster

References

[1] Towards Usable and Acceptable Above-Device Interactions
E. Freeman, S. Brewster, and V. Lantz.
In Mobile HCI ’14 Posters, 459-464. 2014.

[2] Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions
E. Freeman, S. Brewster, and V. Lantz.
In Proceedings of the International Conference on Multimodal Interaction – ICMI ’14, 419-426. 2014.

Acknowledgements

This research was part funded by Nokia Research Centre, Finland. We would also like to thank everyone who participated in our studies.

Posters, SICSA HCI and Second Year Viva

SICSA HCI Lanyard

June has gotten off to an exciting start! I won a poster presentation competition, got a poster paper into Mobile HCI ’14 and arranged my second-year viva – the annual progress review for my PhD research. Finishing and submitting my progress report felt a little anticlimactic compared to last year; I suppose the first-year review is much more important and by this stage it’s more of a checkpoint. Lately I’ve been writing and research planning a lot, so I’m looking forward to getting back to actually doing research. Designing, making, all those fun things that make HCI awesome!

Yesterday was the SICSA HCI yearly meetup, which was good fun. This year it was hosted by the University of Dundee. I spent a lot of time in Dundee as a kid, especially around the university campus, so it was cool to go back and see how everything has changed. Highlights of the day included keynotes from Miguel Nacenta and David Flatla: Miguel presented some really cool research and David was so damn entertaining! We also had a few posters and a demo from our group, and I won the poster presentation competition, which was a nice surprise. My poster (below) gave a general overview of my PhD research and showed off a couple of projects.

SICSA HCI ’14 Poster

Winning the poster competition must have been a good omen because I returned home to a notification from Mobile HCI saying my poster was accepted. That paper and poster are about two gesture design studies from the start of my PhD. I’ll post more about them another time.

Mobile HCI Doctoral Consortium

I’ve been accepted for the doctoral consortium at Mobile HCI ’14! I’m looking forward to it – it’ll be great to get a grilling from others, and the feedback will help with my thesis, something which becomes increasingly daunting now that I’m halfway through my three years of PhD research.