ICMI ’17 Paper & ISS ’17 Demo

I’ve had a paper accepted by ACM ICMI 2017, titled “Rhythmic Micro-Gestures: Discreet Interaction On-the-Go” [1]. The paper introduces rhythmic micro-gestures, a new technique for interacting with mobile devices. It combines rhythmic gestures, an input technique from my CHI 2016 paper, with micro-gestures: small hand movements that can be performed discreetly. I’ll be giving a talk about the paper at the conference in Glasgow in November.

We’ve also had a demo from the Levitate project accepted by ACM ISS 2017 [2]. The demo gives attendees the chance to interact with objects suspended in mid-air by acoustic levitation.

[1] Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
E. Freeman, G. Griffiths, and S. Brewster.
In Proceedings of the ACM International Conference on Multimodal Interaction – ICMI ’17, to appear. 2017.


@inproceedings{ICMI2017,
    author = {Freeman, Euan and Griffiths, Gareth and Brewster, Stephen},
    booktitle = {{Proceedings of ACM International Conference on Multimodal Interaction - ICMI '17}},
    title = {{Rhythmic Micro-Gestures: Discreet Interaction On-the-Go}},
    year = {2017},
    publisher = {ACM Press},
    pages = {to appear},
    doi = {},
  url = {},
}

[2] Floating Widgets: Interaction with Acoustically-Levitated Widgets
E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, to appear. 2017.


@inproceedings{ISS2017Demo,
    author = {Freeman, Euan and Anderson, Ross and Andersson, Carl and Williamson, Julie and Brewster, Stephen},
    booktitle = {{Proceedings of ACM International Conference on Interactive Surfaces and Spaces - ISS '17 Demos}},
    title = {{Floating Widgets: Interaction with Acoustically-Levitated Widgets}},
    year = {2017},
    publisher = {ACM Press},
    pages = {to appear},
    doi = {10.1145/3132272.3132294},
  url = {},
}

ABBI Demo at ICMI ’16

Earlier this month I was in Tokyo for the International Conference on Multimodal Interaction (ICMI). I was there to demo research from the ABBI project. The Multimodal Interaction Group had two ABBI demos at the conference: mine demonstrated how ABBI could be used to adapt home lighting for visually impaired children, and Graham’s used non-visual stimuli (e.g., thermal and vibration cues) to present affective cues more accessibly to visually impaired smartphone users.

The conference was good, and Tokyo was an amazing host city. I spent a lot of downtime playing with my camera; you can see some photos by clicking on the image below.

Tokyo 2016

Next year ICMI visits another amazing city – Glasgow! Julie and Alessandro from the Glasgow Interactive Systems Group will be hosting the conference here at Glasgow Uni.

ICMI ’14 Highlights

Last week I was in Istanbul for ICMI ’14, the International Conference on Multimodal Interaction. ICMI is where signal processing and machine learning meet human-computer interaction, with the aim of finding ways to use and improve multimodal interaction.

Ask two people and you’ll get two different definitions of “multimodal interaction”. From my (HCI) perspective, it is interaction with technology using a variety of human capabilities, such as our perceptual abilities (seeing, hearing, feeling) and motor control abilities (speaking, gesturing, touching). In one of this year’s keynotes, Yvonne Rogers said we should design multimodal interfaces because we experience the world through many modalities too.

In this post I’m going to recap what I thought were the most interesting papers at the conference this year. There are also some photos of the sights, because why not?

Topkapi Palace

Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations

by Radu-Daniel Vatavu, Lisa Anthony and Jacob O. Wobbrock

Vatavu et al. presented a poster on Gesture Heatmaps, visualisations of how users perform touch-stroke gestures. The visualisations encode characteristics of a gesture performance, such as stroke speed and distance error from a gesture template, and can be used to summarise many performances at once. They could help identify problematic gestures or show which parts of a gesture users find difficult, for example. Something I liked about this paper was the way they used these visualisations to create confusion matrices, showing where and why gestures were misclassified.
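
As a rough illustration of the general idea (my own sketch, not the authors’ implementation), here’s a small Python snippet that assumes a gesture is recorded as (x, y) samples with timestamps and colours each point of the stroke by its local speed, heatmap-style:

import numpy as np
import matplotlib.pyplot as plt

def plot_stroke_speed(points, times):
    # points: (n, 2) array of x, y touch samples; times: (n,) timestamps in seconds.
    points = np.asarray(points, dtype=float)
    times = np.asarray(times, dtype=float)
    # Per-segment speed: distance between consecutive samples over elapsed time.
    distances = np.linalg.norm(np.diff(points, axis=0), axis=1)
    speeds = distances / np.maximum(np.diff(times), 1e-6)
    # Colour each segment's end point by its speed.
    sc = plt.scatter(points[1:, 0], points[1:, 1], c=speeds, cmap="hot")
    plt.colorbar(sc, label="stroke speed (px/s)")
    plt.gca().invert_yaxis()  # touch coordinates usually have y growing downwards
    plt.show()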

Blue Mosque / Sultanahmet

CrossMotion: Fusing Device and Image Motion for User Identification, Tracking and Device Association

by Andrew D. Wilson and Hrvoje Benko

Wilson and Benko found that device acceleration (from accelerometers) was highly correlated with image acceleration (from a Kinect, in this case). This means that fusing acceleration data from these two sources can be used to identify a particular person in an image, even if their mobile device isn’t visible (for example, phone in pocket). Some advantages of using this approach are that users can be found in an image from their device movement alone (simplifying identification) and devices can be identified and tracked, even without direct line of sight.
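
As a toy sketch of how that fusion might work (my simplification, not Wilson and Benko’s implementation), and assuming the device and image acceleration traces have already been resampled to a common rate and time-aligned, you could match the device to whichever tracked person correlates with it best:

import numpy as np

def match_device_to_person(device_accel, person_accels):
    # device_accel: (n, 3) accelerometer samples from the phone.
    # person_accels: dict mapping tracked person id -> (n, 3) image-space acceleration.
    def standardised_magnitude(accel):
        # Use magnitudes so the device and camera coordinate frames don't need aligning.
        m = np.linalg.norm(np.asarray(accel, dtype=float), axis=1)
        return (m - m.mean()) / (m.std() + 1e-9)

    d = standardised_magnitude(device_accel)
    best_id, best_corr = None, -np.inf
    for person_id, accel in person_accels.items():
        corr = float(np.mean(d * standardised_magnitude(accel)))  # Pearson correlation
        if corr > best_corr:
            best_id, best_corr = person_id, corr
    return best_id, best_corr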

Galata Tower

SoundFLEX: Designing Audio to Guide Interactions with Shape-Retaining Deformable Interfaces

by Koray Tahiroğlu, Thomas Svedström, Valtteri Wikström, Simon Overstall, Johan Kildal and Teemu Ahmaniemi

Tahiroğlu et al. looked at how audio cues could be used to guide interactions with a deformable interface. They found that sound was an effective way of encouraging users to deform devices, and some of their designs were particularly effective at guiding users to specific deformations. Based on these findings, they recommend using sound to help users discover deformations. Koray had a cool demo at the conference; it was the first time I’d tried a deformable device prototype. Pretty neat idea.

ICMI ’14 Paper Accepted

My full paper, “Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions”, was accepted to ICMI 2014. It was also accepted for oral presentation rather than poster presentation, so I’m looking forward to that!

Tactile Feedback for Above-Device Interaction.

In this paper we looked at tactile feedback for above-device interaction with a mobile phone. We compared direct tactile feedback on the phone to distal tactile feedback from wearables (rings and smartwatches) and to ultrasound haptic feedback. We also looked at different feedback designs and investigated the impact of tactile feedback on performance, workload and preference.

Array of Ultrasound Transducers for Ultrasound Haptic Feedback.

We found that tactile feedback had no impact on input performance but significantly reduced workload, making interaction feel easier. Users also significantly preferred tactile feedback over no tactile feedback. More details are in the paper [1], along with design recommendations for above- and around-device interface designers. I’ve written a bit more about this project here.

Video

The following video (including an awful typo in the last scene!) shows the two gestures we used in these studies.

References

[1] Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions
E. Freeman, S. Brewster, and V. Lantz.
In Proceedings of the International Conference on Multimodal Interaction – ICMI ’14, pp. 419-426. 2014.


@inproceedings{ICMI2014,
    author = {Freeman, Euan and Brewster, Stephen and Lantz, Vuokko},
    booktitle = {Proceedings of the International Conference on Multimodal Interaction - ICMI '14},
    pages = {419--426},
    publisher = {ACM Press},
    title = {{Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions}},
    pdf = {http://research.euanfreeman.co.uk/papers/ICMI_2014.pdf},
    doi = {10.1145/2663204.2663280},
    year = {2014},
    url = {http://euanfreeman.co.uk/projects/above-device-tactile-feedback/},
    video = {{https://www.youtube.com/watch?v=K1TdnNBUFoc}},
}