ACM ICMI 2021

I am very happy to share that I’ve had five papers accepted at the ACM International Conference on Multimodal Interaction. These papers cover a variety of recent research projects, from ultrasound haptic perception to new haptic authentication techniques.

HapticLock

This paper describes HapticLock, a novel eyes-free authentication method for smartphones. It uses touchscreen swipe gestures to select PIN digits, with Morse Code vibrations given as feedback. This allows eyes-free PIN entry and mitigates observational and thermal attacks. This paper was the outcome of an excellent undergraduate student research project.

    HapticLock: Eyes-Free Authentication for Mobile Devices
    G. Dhandapani, J. Ferguson, and E. Freeman.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 195-202. 2021.

Ultrasound Haptic Perception

Two of my papers describe experiments about ultrasound haptic perception. One is about the perception of motion arising from spatially modulated circular patterns. The other is about ultrasound haptics with parametric audio effects. The latter paper shows that white noise sound effects from an ultrasound haptics device can increase the perceived roughness of an otherwise unchanged circular haptic pattern. It also shows that lower rendering frequencies may be perceived as rougher than higher frequencies.

    Enhancing Ultrasound Haptics with Parametric Audio Effects
    E. Freeman.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 692-696. 2021.

    Perception of Ultrasound Haptic Focal Point Motion
    E. Freeman and G. Wilson.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 697-701. 2021.

Calming Haptics for Social Situations

This paper, led by a PhD student at the University of Glasgow, describes a qualitative investigation into user preferences for calming haptic stimuli. It is part of a broader project looking at how haptics could be used to present calming and reassuring stimuli in anxiety-inducing social situations.

    User Preferences for Calming Affective Haptic Stimuli in Social Settings
    S. A. Macdonald, E. Freeman, S. Brewster, and F. Pollick.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 387-396. 2021.

Polarity of Audio/Vibrotactile Encodings

This paper describes a study investigating responses to audio and vibrotactile patterns under varying levels of cognitive load. We compared opposing encoding polarities to see whether polarity affected reaction time and interpretation accuracy, challenging the idea that the most ‘intuitive’ polarity will yield the fastest and most accurate responses.

    Investigating the Effect of Polarity in Auditory and Vibrotactile Displays Under Cognitive Load
    J. Ferguson, E. Freeman, and S. Brewster.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 379-386. 2021.

Enhancing Ultrasound Haptics with Parametric Audio Effects

Overview

Ultrasound haptic devices can create parametric audio as well as contactless haptic feedback. In a paper at the 2021 ACM International Conference on Multimodal Interaction, I investigated whether multimodal output from these devices can influence the perception of haptic feedback. I used a magnitude estimation experiment to evaluate the perceived roughness of an ultrasound haptic pattern.

We rendered a circular haptic pattern using a focal point moving at one of three frequencies: 50, 70, or 90 Hz. Each pattern was accompanied by one of three parametric audio effects: white noise, a pure tone, or no audio. Participants moved their hand back and forth across the haptic pattern, then rated how rough it felt.
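
For context, the rendering frequency is simply how many times per second the focal point completes the circular path. The short Python sketch below illustrates this; the function name, the 5 cm diameter, and the sample rate are my own illustrative choices rather than parameters from the paper.

    import numpy as np

    def focal_point_path(diameter_m, render_freq_hz, duration_s, sample_rate_hz=20000):
        """Positions of a focal point tracing a circle at a given rendering frequency.

        diameter_m: circle diameter in metres (e.g. 0.05 for a 5 cm circle).
        render_freq_hz: revolutions per second (e.g. 50, 70 or 90 Hz).
        sample_rate_hz: how often positions are computed; illustrative only.
        """
        t = np.arange(0, duration_s, 1.0 / sample_rate_hz)
        r = diameter_m / 2.0
        phase = 2.0 * np.pi * render_freq_hz * t
        x = r * np.cos(phase)
        y = r * np.sin(phase)
        return np.column_stack((x, y))

    # Example: one second of a 5 cm circle rendered at 70 Hz.
    path = focal_point_path(0.05, 70, 1.0)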

Results suggest that white noise audio from the haptics device increased perceived roughness and pure tones did not, and that lower rendering frequencies may increase perceived roughness.

Scatterplot showing the mean roughness estimates with 95% confidence intervals. X-axis shows rendering frequency with three levels: 50 Hz, 70 Hz, 90 Hz. Y-axis shows normalised roughness estimates on a scale from 0 to 1. The two key trends in the figure are that roughness estimates are higher for the white noise audio condition, and higher for the 50 Hz rendering frequency.
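
The 0 to 1 scale in that figure comes from normalising the raw magnitude estimates. The exact procedure isn't described in this post, so the snippet below uses simple per-participant min-max scaling purely as an illustration of the idea; the normalise_estimates helper is hypothetical.

    import numpy as np

    def normalise_estimates(estimates):
        """Min-max scale one participant's roughness estimates to [0, 1].

        Illustrative only; the paper's exact normalisation may differ.
        """
        estimates = np.asarray(estimates, dtype=float)
        lo, hi = estimates.min(), estimates.max()
        return (estimates - lo) / (hi - lo) if hi > lo else np.zeros_like(estimates)

    # Example: raw magnitude estimates from one participant.
    print(normalise_estimates([10, 25, 40, 55]))  # -> approximately [0, 0.33, 0.67, 1]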

Our results show that multimodal output has the potential to expand the range of sensations that can be presented by an ultrasound haptic device, paving the way to richer mid-air haptic interfaces.

    Enhancing Ultrasound Haptics with Parametric Audio Effects
    E. Freeman.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 692-696. 2021.

Acknowledgements

This research has received funding from the 🇪🇺 European Union’s Horizon 2020 research and innovation programme under grant agreement #737087. This work was completed as part of the Levitate project.

Perception of Ultrasound Haptic Focal Point Motion

Overview

Ultrasound haptic patterns can be rendered by continuously moving an ultrasonic focal point. It is not known how this focal point motion affects haptic perception. In a paper at the 2021 ACM International Conference on Multimodal Interaction, we describe two psychophysical experiments investigating the perception of an ultrasound haptic focal point moving along a circular path.

Mid-air haptic patterns can be created by rapidly moving ultrasonic focal points, e.g., along a circular path. In this work, we investigated how such motion is perceived.

Our first experiment found that a sensation of motion is perceived at speeds up to 17 revolutions per second (a 17 Hz rendering frequency), similar to the so-called ‘flutter’ sensation associated with low-frequency vibrations and movements.

Plot showing the mean threshold render frequencies with 95% confidence intervals. The x-axis shows circle diameter, from 4 to 7 centimetres. The y-axis shows focal point render frequency, from 0 to 18 revolutions per second. The plot shows a mean of approximately 17 revolutions per second for all circle sizes.
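
To put that threshold in physical terms, it can be converted into the focal point's linear speed along the circle. The calculation below is my own back-of-the-envelope arithmetic, using a 5 cm diameter from within the 4 to 7 cm range tested.

    import math

    # Linear speed of the focal point at the motion threshold (illustrative).
    diameter_m = 0.05          # 5 cm circle, within the 4-7 cm range tested
    threshold_rev_per_s = 17   # motion threshold from the first experiment
    speed = math.pi * diameter_m * threshold_rev_per_s
    print(f"Focal point speed: {speed:.2f} m/s")  # roughly 2.7 m/s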

Our second experiment found a mostly linear relationship between movement speed and perceived intensity up to this speed.

Plot showing mean intensity estimates for both sized circles with 95% confidence intervals. The x-axis shows rendering frequency, from 5 to 19 revolutions per second. The y-axis shows normalised intensity estimates, from 0 to 1. Plot shows that magnitude estimates increase with frequency, and there are higher magnitude estimates for the larger circle.

Haptic circles are widely used in ultrasound haptic interfaces: e.g., for spherical virtual objects or to give feedback about mid-air gestures. Our results can inform the design of ultrasound haptic interfaces, so that designers can create or avoid the sensation of tactile motion. Motion may be desirable for dynamic feedback: e.g., rendering below the 17 revolutions per second threshold to create moving patterns that indicate changing values or accompany animated visual icons. Conversely, designers may wish to emphasise the contiguous outline of a virtual shape by rendering significantly above 17 revolutions per second. Since perceived intensity scales with circle size and rendering frequency, our results can also be used to create perceptually similar haptic objects: i.e., balancing size and frequency to yield similar intensity.

    Perception of Ultrasound Haptic Focal Point Motion
    E. Freeman and G. Wilson.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 697-701. 2021.

Acknowledgements

This research has received funding from the 🇪🇺 European Union’s Horizon 2020 research and innovation programme under grant agreement #737087. This work was completed as part of the Levitate project.

University of Glasgow logo.
University of Strathclyde logo.

HapticLock: Eyes-Free Authentication for Mobile Devices

HapticLock uses non-visual interaction modalities for discreet, eyes-free PIN entry. Users select PIN digits by swiping up or down (a), with Morse Code vibration patterns (b) giving feedback about the currently selected digit. Users confirm a selection with a double tap (c) to move to the next digit, continuing until the PIN is complete.

Overview

Smartphones provide access to increasing amounts of personal and sensitive information, yet are often only secured using methods that are prone to observational attacks. In a paper at the 2021 ACM International Conference on Multimodal Interaction, we present HapticLock, a novel haptic-only authentication method for mobile devices. It uses non-visual interaction modalities for discreet PIN entry that is difficult to attack by shoulder surfing.

HapticLock touchscreen gestures: (a) swipe up or down to increase or decrease digit, respectively; (b) double tap to confirm digit; (c) two-finger tap to remove most recent digit; (d) long-press to check how many digits are entered.
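
To make the gesture set and entry flow concrete, here is a minimal Python sketch of a HapticLock-style entry loop. It is my own illustration rather than code from the paper: the HapticPinEntry class, the pulse durations, the 9-to-0 wrap-around, and what happens to the selection after a digit is confirmed are all assumptions.

    import random

    # International Morse code patterns for the digits 0-9.
    MORSE_DIGITS = {
        0: "-----", 1: ".----", 2: "..---", 3: "...--", 4: "....-",
        5: ".....", 6: "-....", 7: "--...", 8: "---..", 9: "----.",
    }

    # Illustrative pulse lengths in milliseconds; the paper's timings may differ.
    DOT_MS, DASH_MS, GAP_MS = 100, 300, 100

    def vibration_pattern(digit):
        """Return (on_ms, off_ms) vibration pulses encoding a digit in Morse code."""
        return [(DOT_MS if symbol == "." else DASH_MS, GAP_MS)
                for symbol in MORSE_DIGITS[digit]]

    class HapticPinEntry:
        """Sketch of a HapticLock-style eyes-free PIN entry loop."""

        def __init__(self):
            self.current = random.randint(0, 9)  # PIN entry begins from a random digit
            self.entered = []

        def swipe_up(self):                          # (a) increase the selected digit
            self.current = (self.current + 1) % 10   # wrap-around is an assumption
            return vibration_pattern(self.current)   # Morse feedback for the new digit

        def swipe_down(self):                        # (a) decrease the selected digit
            self.current = (self.current - 1) % 10
            return vibration_pattern(self.current)

        def double_tap(self):                        # (b) confirm the current digit
            self.entered.append(self.current)
            # Whether the selection then resets or stays put isn't specified here.

        def two_finger_tap(self):                    # (c) remove the most recent digit
            if self.entered:
                self.entered.pop()

        def long_press(self):                        # (d) how many digits entered so far
            return len(self.entered)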

We evaluated HapticLock in two studies. First, a usability experiment (N=20) found that HapticLock enables effective PIN entry in secure conditions: e.g., a four-digit PIN entered from a random start digit took 23.5 s with a 98.3% success rate. Second, a shoulder surfing experiment (N=15) found that HapticLock is highly resistant to observational attacks. Even when interaction is highly visible, attackers need to guess the first digit when PIN entry begins with a random number, yielding a very low success rate for shoulder surfing. Furthermore, the device can be hidden from view during authentication.

Our use of haptic interaction modalities gives privacy-conscious mobile device users a usable and secure authentication alternative for sensitive situations. HapticLock is slower than normal PIN entry via a touchscreen keyboard, which makes it unsuitable for high-frequency use (e.g., each time a smartphone needs to be unlocked). Our intention was to explore a secure alternative for privacy-conscious users who are accessing sensitive information, making infrequent but high-risk transactions, or authenticating in the presence of others. The benefits of eyes-free PIN entry are a worthy trade-off in such scenarios.

This work is described in a full paper at the 2021 ACM International Conference on Multimodal Interaction. This project was carried out by Gloria, one of my undergraduate students in the 2020-2021 academic year.

    HapticLock: Eyes-Free Authentication for Mobile Devices
    G. Dhandapani, J. Ferguson, and E. Freeman.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 195-202. 2021.


ICMI ’17 Paper & ISS ’17 Demo

I’ve had a paper accepted by ACM ICMI 2017, titled “Rhythmic Micro-Gestures: Discreet Interaction On-the-Go” [1]. The paper presents rhythmic micro-gestures, a new technique for interacting with mobile devices. It combines rhythmic gestures, an input technique from my CHI 2016 paper, with the concept of micro-gestures: small hand movements that can be performed discreetly. I’ll be giving a talk about this paper at the conference in Glasgow in November.

We’ve also had a demo accepted by ACM ISS 2017 from the Levitate project [2]. That demo gives attendees the chance to try interacting with mid-air objects, suspended in air by acoustic levitation.

[1] Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
E. Freeman, G. Griffiths, and S. Brewster.
In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17, 115-119. 2017.

[2] Floating Widgets: Interaction with Acoustically-Levitated Widgets
E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, 417-420. 2017.

ABBI Demo at ICMI ’16

Earlier this month I was in Tokyo for the International Conference on Multimodal Interaction (ICMI). I was there to demo research from the ABBI project. We had two ABBI demos from the Multimodal Interaction Group at the conference: mine demonstrated how ABBI could be used to adapt the lighting at home for visually impaired children, and Graham’s was about using non-visual stimuli (e.g., thermal, vibration) to present affective cues in a more accessible way for visually impaired smartphone users.

The conference was good and it was held in an amazing city – Tokyo. Next year, ICMI visits another amazing city – Glasgow! Julie and Alessandro from the Glasgow Interactive Systems Group will be hosting the conference here at Glasgow Uni.

ICMI ’14 Highlights

Last week I was in Istanbul for ICMI ’14, the International Conference on Multimodal Interaction. ICMI is where signal processing and machine learning meet human-computer interaction, with the aim of finding ways to use and improve multimodal interaction.

Ask two people and you’ll get two different definitions of “multimodal interaction”. From my (HCI) perspective, it is interaction with technology using a variety of human capabilities, such as our perceptual abilities (like seeing, hearing, feeling) and motor control abilities (like speaking, gesturing, touching). In one of this year’s keynotes, Yvonne Rogers said we should design multimodal interfaces because we also experience the world using many modalities.

In this post I’m going to recap what I thought were the most interesting papers at the conference this year. There are also some photos of the sights, because why not?

Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations

by Radu-Daniel Vatavu, Lisa Anthony and Jacob O. Wobbrock

Vatavu et al. presented a poster on Gesture Heatmaps, which are visualisations of how users perform touch-stroke gestures. Their visualisations represent characteristics of how users perform gestures, such as stroke speed and distance error from a gesture template. These visualisations can be used to summarise gesture performances, giving insight into how users perform touch gestures. These could be used to identify problematic gestures or understand which parts of gestures users find difficult, for example. Something which I liked about this paper was the way they used these visualisations to create confusion matrices, showing where and why gestures were misclassified.
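
As a rough sketch of the underlying computation (my own illustration, not the authors’ code): once a gesture and its template are resampled to the same number of points, the per-point distance between them gives the kind of error value a heatmap could colour along the stroke. The resample and pointwise_error helpers below are hypothetical, and this ignores the scale and translation normalisation a real recogniser would apply.

    import numpy as np

    def resample(points, n=64):
        """Resample a stroke (list of (x, y) points) to n evenly spaced points."""
        points = np.asarray(points, dtype=float)
        seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
        dist = np.concatenate(([0.0], np.cumsum(seg)))
        targets = np.linspace(0.0, dist[-1], n)
        x = np.interp(targets, dist, points[:, 0])
        y = np.interp(targets, dist, points[:, 1])
        return np.column_stack((x, y))

    def pointwise_error(gesture, template, n=64):
        """Per-point distance between a gesture and its template -- the kind of
        value a heatmap could colour-code along the stroke."""
        g, t = resample(gesture, n), resample(template, n)
        return np.linalg.norm(g - t, axis=1)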

CrossMotion: Fusing Device and Image Motion for User Identification, Tracking and Device Association

by Andrew D. Wilson and Hrvoje Benko

Wilson and Benko found that device acceleration (from accelerometers) was highly correlated with image acceleration (from a Kinect, in this case). This means that fusing acceleration data from these two sources can be used to identify a particular person in an image, even if their mobile device isn’t visible (for example, phone in pocket). Some advantages of using this approach are that users can be found in an image from their device movement alone (simplifying identification) and devices can be identified and tracked, even without direct line of sight.
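
A minimal sketch of the matching idea as I understand it from the paper: correlate the device’s acceleration magnitudes with the image-derived acceleration of each tracked person, then pick the strongest correlation. The best_match helper below is hypothetical and assumes the two signals have already been time-aligned and resampled to the same length.

    import numpy as np

    def best_match(device_accel, person_accels):
        """Match a device to the person whose image-space acceleration correlates
        most strongly with the device's accelerometer signal.

        device_accel: 1-D array of acceleration magnitudes from the device.
        person_accels: dict mapping person id -> 1-D array of the same length,
                       derived from tracking that person in the camera image.
        """
        scores = {pid: np.corrcoef(device_accel, accel)[0, 1]
                  for pid, accel in person_accels.items()}
        return max(scores, key=scores.get), scores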

SoundFLEX: Designing Audio to Guide Interactions with Shape-Retaining Deformable Interfaces

by Koray Tahiroğlu, Thomas Svedström, Valtteri Wikström, Simon Overstall, Johan Kildal and Teemu Ahmaniemi

Tahiroğlu et al. looked at how audio cues could be used to guide interactions with a deformable interface. They found that sound was an effective way of encouraging users to deform devices, and some of their designs were particularly effective for guiding users to specific deformations. Based on these findings, they recommend using sound to help users discover deformations. Koray had a cool demo at the conference – it was the first time I’d tried a deformable device prototype. Pretty neat idea.

ICMI ’14 Paper Accepted

My full paper, “Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions”, was accepted to ICMI 2014. It was also accepted for oral presentation rather than poster presentation, so I’m looking forward to that!

Tactile Feedback for Above-Device Interaction.

In this paper we looked at tactile feedback for above-device interaction with a mobile phone. We compared direct tactile feedback to distal tactile feedback from wearables (rings, smart-watches) and ultrasound haptic feedback. We also looked at different feedback designs and investigated the impact of tactile feedback on performance, workload and preference.

Array of Ultrasound Transducers for Ultrasound Haptic Feedback.

We found that tactile feedback had no impact on input performance but significantly reduced workload, making interaction feel easier. Users also significantly preferred tactile feedback to no tactile feedback. More details are in the paper [1], along with design recommendations for above- and around-device interface designers. I’ve written a bit more about this project here.

Video

The following video (including an awful typo in the last scene!) shows the two gestures we used in these studies.

References

[1] Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions
E. Freeman, S. Brewster, and V. Lantz.
In Proceedings of the International Conference on Multimodal Interaction – ICMI ’14, 419-426. 2014.