ACM CHI 2022

The CHI conference has been and gone, and I’m pleased to share that I had one full paper at CHI this year. Investigating Clutching Interactions for Touchless Medical Imaging Systems was a collaboration with Trinity College Dublin in which we looked at clutching methods for touchless gesture systems, in the context of medical imaging. For more information about what clutching is and why it’s important, you can read this post on touchless clutching techniques.

    Investigating Clutching Interactions for Touchless Medical Imaging Systems
    S. Cronin, E. Freeman, and G. Doherty.
    In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 2022.

ACM ICMI 2021

I am very happy to share that I’ve had five papers accepted at the ACM International Conference on Multimodal Interaction. These papers cover a variety of recent research projects, from ultrasound haptic perception to new haptic authentication techniques.

HapticLock

This paper describes HapticLock, a novel eyes-free authentication method for smartphones. It uses touchscreen swipe gestures to select PIN digits, with Morse code vibrations given as feedback. This allows eyes-free PIN entry and mitigates observational and thermal attacks. This paper was the outcome of an excellent undergraduate student research project.

    HapticLock: Eyes-Free Authentication for Mobile Devices
    G. Dhandapani, J. Ferguson, and E. Freeman.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 195-202. 2021.
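The digit-encoding idea described above can be sketched as a mapping from PIN digits to sequences of vibration durations. This is a minimal illustration, not the paper's implementation: the durations and the `vibration_pattern` helper are hypothetical, though the Morse codes for the digits are standard.

```python
# Sketch of Morse-code vibration feedback for eyes-free PIN digit entry.
# DOT/DASH durations are hypothetical, not taken from the HapticLock paper.

DOT, DASH = 0.1, 0.3  # vibration durations in seconds (assumed values)

# Standard international Morse code for the digits 0-9.
MORSE_DIGITS = {
    "0": "-----", "1": ".----", "2": "..---", "3": "...--", "4": "....-",
    "5": ".....", "6": "-....", "7": "--...", "8": "---..", "9": "----.",
}

def vibration_pattern(digit: str) -> list[float]:
    """Return the sequence of vibration durations encoding one PIN digit."""
    return [DOT if symbol == "." else DASH for symbol in MORSE_DIGITS[digit]]
```

Because each digit has a distinct five-symbol code, a user can confirm their selection entirely through touch, without looking at the screen.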

Ultrasound Haptic Perception

Two of my papers describe experiments about ultrasound haptic perception. One is about the perception of motion arising from spatially modulated circular patterns. The other is about ultrasound haptics with parametric audio effects. The latter paper shows that white noise sound effects from an ultrasound haptics device can increase the perceived roughness of a normal circular haptic pattern. It also shows that lower rendering frequencies may be perceived as rougher than higher frequencies.

    Enhancing Ultrasound Haptics with Parametric Audio Effects
    E. Freeman.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 692-696. 2021.

    Perception of Ultrasound Haptic Focal Point Motion
    E. Freeman and G. Wilson.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 697-701. 2021.

Calming Haptics for Social Situations

This paper, led by a PhD student at the University of Glasgow, describes a qualitative investigation into user preferences for calming haptic stimuli. It is part of a broader project looking at how haptics could be used to present calming and reassuring stimuli in anxiety-inducing social situations.

    User Preferences for Calming Affective Haptic Stimuli in Social Settings
    S. A. Macdonald, E. Freeman, S. Brewster, and F. Pollick.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 387-396. 2021.

Polarity of Audio/Vibrotactile Encodings

This paper describes a study investigating responses to audio and vibrotactile patterns under varying levels of cognitive load. We compared opposing encoding polarities to see whether polarity affected reaction time and interpretation accuracy, challenging the idea that the most ‘intuitive’ polarity yields the fastest and most accurate responses.

    Investigating the Effect of Polarity in Auditory and Vibrotactile Displays Under Cognitive Load
    J. Ferguson, E. Freeman, and S. Brewster.
    In Proceedings of the 23rd ACM International Conference on Multimodal Interaction – ICMI ’21, 379-386. 2021.

Pervasive Displays 2019

ACM Pervasive Displays conference logo. A blue square with the text "PERDIS", with two abstract representations of displays.

I’m at ACM Pervasive Displays this week to present a full paper from the Levitate project. My paper is about using levitating particles as actuated display components for static physical objects. You can read more about this here.

In the paper, I look at how levitating particles can be used as secondary display elements to augment physical objects. Levitating particles can act as dynamic cursors that annotate physical objects (for example, accompanying an audio narration to indicate features of a museum exhibit). They can serve as user representations in an interactive system, adding interactivity to static objects. Actuated particles can also act as animated elements, bringing otherwise lifeless objects to life.

CHI 2019

CHI 2019 logo

I was at CHI 2019 earlier this week. It was the biggest CHI so far (almost 3,900 attendees), so I’m extra proud to have been part of the organising committee – especially since it was in Glasgow! Aside from organisation, I was helping with the University of Glasgow’s exhibitor booth, had two Interactivity exhibits about acoustic levitation, and chaired a great session on Touch and Haptics. I didn’t get to see many of the technical sessions, but a few stuck in my mind.

There were a couple of really good papers in the first alt.chi session. First, an analysis of dichotomous inference in CHI papers, followed by a first look at trends and clichés in CHI paper writing. Both papers were well presented and were a chance to reflect on how we present our science as a community. I’m moving away from dichotomous statistics but am a bit apprehensive about how reviewers will respond to that style. Papers like this provide a bit more momentum for a change that we’ll all benefit from.

I liked Aakar Gupta’s talk on RotoSwype, which used an IMU embedded in a ring for swipe keyboard input in XR. The neat thing about that work was the focus on subtle, low-effort interaction, with hands by the side of the body instead of raised in front. Fatigue is a big barrier for mid-air interaction, especially for prolonged interactions like text entry, so it was nice to see attention paid to that.

There were good papers in the Touch and Haptics session I chaired, but one that especially sticks in mind was Philip Quinn’s work on touchscreen input sensing using a barometric pressure sensor. The core idea was that devices are sealed to prevent water and dust ingress, and also contain barometric pressure sensors for accurate altitude measurements; when someone applies pressure to the touchscreen, the air pressure inside the almost-completely-sealed device changes briefly. This internal pressure change reliably correlates with pressure input on the touchscreen. Our group in Glasgow did a lot of foundational work on pressure input for mobile devices, so it’s cool to see steps towards facilitating this without needing dedicated sensors.
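The sensing idea above can be illustrated with a toy sketch. To be clear, this is my own illustrative simplification, not the method from Quinn’s paper: it just detects a press as a short-lived rise in the internal barometer reading relative to a slowly adapting baseline.

```python
# Illustrative sketch (not the paper's method): treat a touchscreen press
# as a brief spike in the device's internal air pressure, detected against
# a slowly adapting baseline that absorbs genuine altitude/weather drift.

def detect_press(samples, alpha=0.01, threshold=0.5):
    """Yield indices of barometer samples (Pa) that spike above baseline."""
    baseline = samples[0]
    for i, pressure in enumerate(samples):
        if pressure - baseline > threshold:
            yield i  # candidate press: sharp rise the baseline hasn't absorbed
        else:
            # No press: let the baseline drift slowly towards the reading.
            baseline = (1 - alpha) * baseline + alpha * pressure
```

A real implementation would need to handle sensor noise and correlate the spike with touch events, but the core signal is just this pressure difference.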

Pervasive Displays 2019 Paper

Pleased to announce I’ve had a full paper accepted by ACM Pervasive Displays 2019! The paper, titled “Enhancing Physical Objects with Actuated Levitating Particles”, is about using acoustic levitation to add actuated display elements to ordinary physical objects. This is a means of adding interactivity and dynamic output to otherwise static, non-interactive objects. See this page for more about this paper.

SICSA DemoFest 2018

Earlier this week I was at SICSA DemoFest in Edinburgh, talking about acoustic levitation and some of the work we’ve been doing on the Levitate project.

For more information about Levitate and the awesome work we’ve been doing, follow us on Twitter (@LevitateProj), check out the project website, and see more photos and videos here.

Want to make your own acoustic levitator? You can build a simpler version of our device by following Asier Marzo’s Instructables instructions.

Image used in my demo presentation. The image illustrates what a standing wave looks like and briefly explains how acoustic levitation works: small objects can be levitated between high-amplitude areas of the standing sound wave, and they can be moved in mid-air by moving the sound waves.
A close-up photo of our demo booth at the event. We demonstrated a levitation device with a polystyrene bead being moved in a variety of patterns in mid-air.
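For a sense of scale: trapping points in a standing wave sit half a wavelength apart. Assuming room-temperature air and a 40 kHz transducer frequency (typical for these devices, including Marzo’s Instructables design, though I’m not quoting our hardware’s exact specification here), the spacing works out to a few millimetres:

```python
# Back-of-the-envelope spacing between trapping points in a levitator.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C
FREQUENCY = 40_000.0    # Hz; 40 kHz transducers are typical (assumption)

wavelength = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm
node_spacing = wavelength / 2            # trapping points every half wavelength

print(f"node spacing: {node_spacing * 1000:.1f} mm")  # → node spacing: 4.3 mm
```

That millimetre-scale spacing is why the levitated beads in our demos are small polystyrene particles rather than larger objects.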

CHI 2018

CHI 2018 conference logo

I’m going to be at CHI in Montreal next week, to present my full paper, titled “Point-and-Shake: Selecting from Levitating Object Displays”. I’m giving the last talk in the Input: Targets and Selection session (Thursday 26th April, 9am, Room 517C). Come along to hear about interaction with levitating objects! To find out more, read more about the Levitate project and Point-and-Shake.

I’m also participating in the Mid-Air Haptics for Control Interfaces workshop, run by Ultrahaptics. In the workshop, I’m co-chairing a session with Seokhee Jeon from Kyung Hee University, focusing on the perception of mid-air haptics.

Finally, I’m also going to be chairing the Typing & Touch 2 papers session (Thursday 26th April, 2pm, Room 514B), which has four interesting papers on touchscreen interaction and haptic feedback.

CHI 2018 Paper

I’ve had a paper accepted to the CHI 2018 conference, describing recent work on the Levitate project. The paper is about Point-and-Shake, a mid-air interaction technique for selecting levitating objects. I’m looking forward to presenting this work in Montreal in April! The following 30-second preview gives a super quick demo of our interaction technique.

For more information about the Levitate project, check us out on Twitter: @LevitateProj

CHI 2017 Paper + Videos

I’m happy to note that I’ve had a full paper [1] accepted to CHI 2017. The paper describes research from the ABBI project, about how sound from wearable and fixed sources can be used to help visually impaired children at school (for more, please see here). The videos in this post include a short description of the paper as well as a longer description of the research and our findings.

[1] Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently
E. Freeman, G. Wilson, S. Brewster, G. Baud-Bovy, C. Magnusson, and H. Caltenco.
In Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems – CHI ’17, 4146-4157. 2017.