I am very happy to share that I’ve had five papers accepted at the ACM International Conference on Multimodal Interaction. These papers cover a variety of recent research projects, from ultrasound haptic perception to new haptic authentication techniques.

HapticLock: Eyes-Free Authentication
This paper describes HapticLock, a novel eyes-free authentication method for smartphones. It uses touchscreen swipe gestures to select PIN digits, with Morse code vibrations given as feedback. This allows eyes-free PIN entry and mitigates observational and thermal attacks. The paper was the outcome of an excellent undergraduate student research project.

    HapticLock: Eyes-Free Authentication for Mobile Devices
    G. Dhandapani, J. Ferguson, and E. Freeman.
    In Proceedings of 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 195-202. 2021.
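
The paper's exact timing parameters aren't reproduced here, but the feedback idea can be sketched in a few lines. This is a hypothetical illustration of how a PIN digit might be encoded as Morse-coded vibration pulses; the durations and the `digit_to_pulses` helper are assumptions for illustration, not HapticLock's actual implementation.

```python
# Standard international Morse code for the ten digits.
MORSE_DIGITS = {
    "0": "-----", "1": ".----", "2": "..---", "3": "...--", "4": "....-",
    "5": ".....", "6": "-....", "7": "--...", "8": "---..", "9": "----.",
}

DOT_MS = 100  # assumed dot duration; a dash is conventionally three dots


def digit_to_pulses(digit: str) -> list[tuple[int, int]]:
    """Return (vibrate_ms, pause_ms) pairs encoding one PIN digit."""
    pulses = []
    for symbol in MORSE_DIGITS[digit]:
        on = DOT_MS if symbol == "." else 3 * DOT_MS
        pulses.append((on, DOT_MS))  # inter-symbol gap of one dot
    return pulses


print(digit_to_pulses("3"))  # three dots then two dashes
```

A platform vibration API (e.g. a waveform of on/off durations) could then play these pairs back as the user swipes between digits.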

Ultrasound Haptic Perception

Two of my papers describe experiments about ultrasound haptic perception. One is about the perception of motion arising from spatially modulated circular patterns. The other is about ultrasound haptics with parametric audio effects. The latter paper shows that white noise sound effects from an ultrasound haptics device can increase the perceived roughness of a normal circular haptic pattern. It also shows that lower rendering frequencies may be perceived as rougher than higher frequencies.

    Enhancing Ultrasound Haptics with Parametric Audio Effects
    E. Freeman.
    In Proceedings of 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 692-696. 2021.

    Perception of Ultrasound Haptic Focal Point Motion
    E. Freeman and G. Wilson.
    In Proceedings of 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 697-701. 2021.

Calming Haptics for Social Situations

This paper, led by a PhD student at the University of Glasgow, describes a qualitative investigation into user preferences for calming haptic stimuli. It is part of a broader project looking at how haptics could present calming and reassuring stimuli in anxiety-inducing social situations.

    User Preferences for Calming Affective Haptic Stimuli in Social Settings
    S. A. Macdonald, E. Freeman, S. Brewster, and F. Pollick.
    In Proceedings of 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 387-396. 2021.

Polarity of Audio/Vibrotactile Encodings

This paper describes a study investigating responses to audio and vibrotactile patterns under varying levels of cognitive load. We compared opposing encoding polarities to see whether polarity affected reaction time and interpretation accuracy, challenging the idea that the most ‘intuitive’ polarity will yield the fastest and most accurate responses.

    Investigating the Effect of Polarity in Auditory and Vibrotactile Displays Under Cognitive Load
    J. Ferguson, E. Freeman, and S. Brewster.
    In Proceedings of 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 379-386. 2021.

IEEE World Haptics 2019

Last week I was at the IEEE World Haptics Conference in Tokyo, Japan. I was presenting my full paper on HaptiGlow, a new system and feedback technique for helping users find a good hand position for mid-air interaction.

Photo of the HaptiGlow system. An Ultrahaptics UHEV1 device with a strip of LEDs around the front edge and left and right sides. The LEDs are green, indicating that the user has their hand in a good position.

I brought a demo to give during the interactive presentation session, which seemed to go well. The demo was self-motivating: most people instinctively approach mid-air haptic devices in a poor way, so the demo session immediately highlighted the need for feedback that helps users improve their hand position.

Unsurprisingly, the person who needed the feedback the least was Hiroyuki Shinoda, whose lab has done some of the most important work on ultrasound haptic feedback. For most other attendees, however, I think this demo was a compelling way of showing the need for more work that helps users understand how to get the most out of these devices.

Some thoughts about the rest of the conference. There was a huge presence from Facebook Reality Labs, so it’ll be interesting to see how large-scale industry involvement shapes the next couple of years of haptics research. Wrist-based haptics seemed a popular topic, especially squeezing the wrist. The variety of haptic devices for VR continues to grow, including haptic shoes. Rich passive haptics and material properties are clearly important to industry, a complement to the dynamic digital haptics that tend to dominate the conference proceedings. Finally, there are lots of technology-focused contributions and lots of perception-focused contributions; why are these sub-communities not working together as much as they could be?

Pervasive Displays 2019

I’m at ACM Pervasive Displays this week to present a full paper from the Levitate project. My paper is about using levitating particles as actuated display components for static physical objects. You can read more about this here.

In the paper, I look at how levitating particles can be used as secondary display elements to augment physical objects. Levitating particles can act as dynamic cursors that annotate physical objects (for example, accompanying an audio narration to indicate features of a museum exhibit). They can serve as user representations in an interactive system, adding interactivity to static objects. Actuated particles can also be animated, bringing otherwise lifeless objects to life.

CHI 2019

I was at CHI 2019 earlier this week. It was the biggest CHI so far (almost 3,900 attendees), so I’m extra proud to have been part of the organising committee – especially since it was in Glasgow! Aside from organisation, I helped with the University of Glasgow’s exhibitor booth, had two Interactivity exhibits about acoustic levitation, and chaired a great session on Touch and Haptics. I didn’t get to see many of the technical sessions, but a few stuck in my mind.

There were a couple of really good papers in the first alt.chi session: first, an analysis of dichotomous inference in CHI papers, followed by a first look at trends and clichés in CHI paper writing. Both papers were well presented and were a chance to reflect on how we present our science as a community. I’m moving away from dichotomous statistics but am a bit apprehensive about how reviewers will respond to that style. Papers like this provide a bit more momentum for change, which we’ll all benefit from.

I liked Aakar Gupta’s talk on RotoSwype, which used an IMU embedded in a ring for swipe keyboard input in XR. The neat thing about that work was the focus on subtle, low-effort interaction, with hands by the side of the body instead of raised in front. Fatigue is a big barrier for mid-air interaction, especially for prolonged interactions like text entry, so it was nice to see attention paid to that.

There were good papers in the Touch and Haptics session I chaired, but one that especially sticks in mind was Philip Quinn’s work on touchscreen input sensing using a barometric pressure sensor. The core idea was that devices are sealed to prevent water and dust ingress, and also contain barometric pressure sensors for accurate altitude measurements; when someone applies pressure to the touchscreen, the air pressure inside the almost-completely-sealed device changes briefly. This internal pressure change reliably correlates with pressure input on the touchscreen. Our group in Glasgow did a lot of foundational work on pressure input for mobile devices, so it’s cool to see steps towards facilitating this without needing dedicated sensors.
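
The sensing idea can be caricatured in a few lines. This is an illustrative sketch under assumed values, not Quinn’s actual algorithm: a press deforms the almost-sealed enclosure, briefly raising internal pressure above the ambient baseline, and that spike can be flagged with a simple threshold. The sample values and `threshold_pa` parameter are made up for the example.

```python
def detect_press(samples_pa, baseline_pa, threshold_pa=2.0):
    """Return sample indices where internal pressure briefly exceeds baseline.

    samples_pa: barometer readings in pascals; baseline_pa: ambient pressure.
    """
    return [i for i, p in enumerate(samples_pa) if p - baseline_pa > threshold_pa]


# A press squeezes the sealed enclosure, briefly raising internal pressure.
readings = [101325.0, 101325.2, 101328.5, 101327.9, 101325.1]
print(detect_press(readings, baseline_pa=101325.0))  # the two spike samples
```

A real implementation would of course need to track a drifting ambient baseline (altitude and weather changes) rather than compare against a fixed value.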

Pervasive Displays 2019 Paper

Pleased to announce I’ve had a full paper accepted by ACM Pervasive Displays 2019! The paper, titled “Enhancing Physical Objects with Actuated Levitating Particles”, is about using acoustic levitation to add actuated display elements to ordinary physical objects. This is a means of adding interactivity and dynamic output to otherwise static, non-interactive objects. See this page for more about this paper.

SICSA DemoFest 2018

Earlier this week I was at SICSA DemoFest in Edinburgh, talking about acoustic levitation and some of the work we’ve been doing on the Levitate project.

For more information about Levitate and the awesome work we’ve been doing, follow us on Twitter (@LevitateProj), check out the project website, and see more photos and videos here.

Want to make your own acoustic levitator? You can build a simpler version of our device using Asier Marzo’s Instructables instructions.

Image used in my demo presentation. The image illustrates what a standing wave looks like and has a brief explanation about how acoustic levitation works: small objects can be levitated between high-amplitude areas of the standing sound wave and objects can be moved in mid-air by moving the sound waves.
A close-up photo of our demo booth at the event. We demonstrated a levitation device with a polystyrene bead being moved in a variety of patterns in mid-air.
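
The caption above notes that beads sit between the high-amplitude regions of the standing wave. A quick back-of-the-envelope calculation shows why the levitated beads end up a few millimetres apart, assuming a typical 40 kHz ultrasonic transducer array (a common choice for these devices; the exact frequency of our hardware is an assumption here):

```python
# Node spacing in an acoustic levitation standing wave, assuming
# a 40 kHz array in room-temperature air.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C
FREQUENCY = 40_000.0    # Hz

wavelength = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm
node_spacing = wavelength / 2            # beads sit about 4.3 mm apart

print(f"{node_spacing * 1000:.2f} mm")
```

This half-wavelength spacing is also why the beads have to be small (a millimetre or two of polystyrene) to sit stably at the nodes.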

CHI 2018

I’m going to be at CHI in Montreal next week to present my full paper, titled “Point-and-Shake: Selecting from Levitating Object Displays”. I’m giving the last talk in the Input: Targets and Selection session (Thursday 26th April, 9am, Room 517C). Come along to hear about interaction with levitating objects! To find out more, read about the Levitate project and Point-and-Shake.

I’m also participating in the Mid-Air Haptics for Control Interfaces workshop, run by Ultrahaptics. In the workshop, I’m co-chairing a session with Seokhee Jeon from Kyung Hee University, focusing on the perception of mid-air haptics.

Finally, I’m also going to be chairing the Typing & Touch 2 papers session (Thursday 26th April, 2pm, Room 514B), which has four interesting papers on touchscreen interaction and haptic feedback.

CHI 2018 Paper

I’ve had a paper accepted to the CHI 2018 conference, describing recent work on the Levitate project. The paper is about Point-and-Shake, a mid-air interaction technique for selecting levitating objects. I’m looking forward to presenting this work in Montreal in April! The following 30-second preview gives a super quick demo of our interaction technique.

For more information about the Levitate project, check us out on Twitter: @LevitateProj

ICMI ’17 Paper & ISS ’17 Demo

I’ve had a paper accepted by ACM ICMI 2017 titled “Rhythmic Micro-Gestures: Discreet Interaction On-the-Go” [1]. The paper is about rhythmic micro-gestures, a new interaction technique for interacting with mobile devices. This technique combines rhythmic gestures, an input technique from my CHI 2016 paper, with the concept of micro-gestures, small hand movements that can be performed discreetly. I’ll be giving a talk about this paper at the conference in November, in Glasgow.

We’ve also had a demo accepted by ACM ISS 2017 from the Levitate project [2]. That demo gives attendees the chance to try interacting with mid-air objects, suspended in air by acoustic levitation.

[1] Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
E. Freeman, G. Griffiths, and S. Brewster.
In Proceedings of 19th ACM International Conference on Multimodal Interaction – ICMI ’17, 115-119. 2017.

[2] Floating Widgets: Interaction with Acoustically-Levitated Widgets
E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
In Proceedings of ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, 417-420. 2017.